Apr 17 17:29:19 user nova-compute[71628]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 17 17:29:21 user nova-compute[71628]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71628) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 17:29:21 user nova-compute[71628]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71628) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 17:29:21 user nova-compute[71628]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71628) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 17:29:21 user nova-compute[71628]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 17 17:29:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.019s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 17 17:29:22 user nova-compute[71628]: INFO nova.virt.driver [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 17 17:29:22 user nova-compute[71628]: INFO nova.compute.provider_config [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
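The two oslo_concurrency.processutils lines above are that library's standard command logging: it records the command it is about to run, then the exit code and wall-clock duration. The grep against /sbin/iscsiadm is a capability probe for the node.session.scan option. A minimal sketch of such a probe using the public processutils API follows; the helper name iscsiadm_supports_manual_scan is illustrative, not Nova's actual code.

from oslo_concurrency import processutils

def iscsiadm_supports_manual_scan(iscsiadm_path='/sbin/iscsiadm'):
    # grep -F exits 0 when the literal string is found and 1 when it is not;
    # both are expected outcomes here, so neither should raise.
    try:
        out, _err = processutils.execute(
            'grep', '-F', 'node.session.scan', iscsiadm_path,
            check_exit_code=[0, 1])
        return bool(out)
    except processutils.ProcessExecutionError:
        # Any other failure (for example, grep itself missing) counts as "no".
        return False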
Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Acquiring lock "singleton_lock" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Acquired lock "singleton_lock" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Releasing lock "singleton_lock" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Full set of CONF: {{(pid=71628) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ******************************************************************************** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Configuration options gathered from: {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ================================================================================ {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] allow_resize_to_same_host = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] arq_binding_timeout = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] backdoor_port = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] backdoor_socket = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b 
None None] block_device_allocate_retries = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] block_device_allocate_retries_interval = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cert = self.pem {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute_driver = libvirt.LibvirtDriver {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute_monitors = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] config_dir = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] config_drive_format = iso9660 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] config_source = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] console_host = user {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] control_exchange = nova {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cpu_allocation_ratio = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] daemon = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] debug = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] default_access_ip_network_name = None {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] default_availability_zone = nova {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] default_ephemeral_format = ext4 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] default_schedule_zone = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] disk_allocation_ratio = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] enable_new_services = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] enabled_apis = ['osapi_compute'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] enabled_ssl_apis = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] flat_injected = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] force_config_drive = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] force_raw_images = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] graceful_shutdown_timeout = 5 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] heal_instance_info_cache_interval = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] host = user {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] initial_disk_allocation_ratio = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] initial_ram_allocation_ratio = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_build_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_delete_interval = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_format = [instance: %(uuid)s] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_name_template = instance-%08x {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_usage_audit = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_usage_audit_period = month {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] internal_service_availability_zone = internal {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] key = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] live_migration_retry_count = 30 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_config_append = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_dir = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_options = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_rotate_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_rotate_interval_type = days {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] log_rotation_type = none {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] long_rpc_timeout = 1800 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_concurrent_builds = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_concurrent_live_migrations = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_concurrent_snapshots = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_local_block_devices = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_logfile_count = 30 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] max_logfile_size_mb = 200 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] maximum_instance_delete_attempts = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metadata_listen = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metadata_listen_port = 8775 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metadata_workers = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] migrate_max_retries = -1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] mkisofs_cmd = genisoimage {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] my_block_storage_ip = 10.0.0.210 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] my_ip = 10.0.0.210 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] network_allocate_retries = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] osapi_compute_listen = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] osapi_compute_listen_port = 8774 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] osapi_compute_unique_server_name_scope = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] osapi_compute_workers = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] password_length = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] periodic_enable = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] periodic_fuzzy_delay = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: 
DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] pointer_model = ps2mouse {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] preallocate_images = none {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] publish_errors = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] pybasedir = /opt/stack/nova {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ram_allocation_ratio = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rate_limit_burst = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rate_limit_except_level = CRITICAL {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rate_limit_interval = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reboot_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reclaim_instance_interval = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] record = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reimage_timeout_per_gb = 20 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] report_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rescue_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reserved_host_cpus = 
0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reserved_host_disk_mb = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reserved_host_memory_mb = 512 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] reserved_huge_pages = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] resize_confirm_window = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] resize_fs_using_block_device = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] resume_guests_state_on_host_boot = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rpc_response_timeout = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] run_external_periodic_tasks = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] running_deleted_instance_action = reap {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] running_deleted_instance_poll_interval = 1800 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] running_deleted_instance_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler_instance_sync_interval = 120 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None 
None] service_down_time = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] servicegroup_driver = db {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] shelved_offload_time = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] shelved_poll_interval = 3600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] shutdown_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] source_is_ipv6 = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ssl_only = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] state_path = /opt/stack/data/nova {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] sync_power_state_interval = 600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] sync_power_state_pool_size = 1000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] syslog_log_facility = LOG_USER {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] tempdir = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] timeout_nbd = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] transport_url = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] update_resources_interval = 0 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_cow_images = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_eventlog = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_journal = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_json = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_rootwrap_daemon = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_stderr = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] use_syslog = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vcpu_pin_set = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plugging_is_fatal = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plugging_timeout = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] virt_mkfs = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] volume_usage_poll_interval = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] watch_log_file = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] web = /usr/share/spice-html5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG 
oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_concurrency.disable_process_locking = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.auth_strategy = keystone {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.compute_link_prefix = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.dhcp_domain = novalocal {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.enable_instance_password = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.glance_link_prefix = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service 
[None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.instance_list_per_project_cells = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.list_records_by_skipping_down_cells = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.local_metadata_per_cell = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.max_limit = 1000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.metadata_cache_expiration = 15 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.neutron_default_tenant_id = default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.use_forwarded_for = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.use_neutron_default_nets = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_dynamic_targets = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_jsonfile_path = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.backend = dogpile.cache.memcached {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.backend_argument = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.config_prefix = cache.oslo {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.dead_timeout = 60.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.debug_cache_backend = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.enable_retry_client = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.enable_socket_keepalive = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.enabled = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.expiration_time = 600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.hashclient_retry_attempts = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.hashclient_retry_delay = 1.0 
{{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_dead_retry = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_password = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_pool_maxsize = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_sasl_enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_socket_timeout = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.memcache_username = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.proxies = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.retry_attempts = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.retry_delay = 0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
cache.socket_keepalive_count = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.socket_keepalive_idle = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.socket_keepalive_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.tls_allowed_ciphers = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.tls_cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.tls_certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.tls_enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cache.tls_keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.auth_type = password {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.catalog_info = volumev3::publicURL {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.cross_az_attach = True 
{{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.debug = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.endpoint_template = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.http_retries = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.os_region_name = RegionOne {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cinder.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.cpu_dedicated_set = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.cpu_shared_set = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.image_type_exclude_list = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.max_disk_devices_to_attach = -1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.resource_provider_association_refresh = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.shutdown_retry_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] conductor.workers = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] console.allowed_origins = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] console.ssl_ciphers = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] console.ssl_minimum_version = default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] consoleauth.token_ttl = 600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG 
oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.service_type = accelerator {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None 
req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] cyborg.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.backend = sqlalchemy {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.connection = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.connection_debug = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.connection_parameters = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.connection_recycle_time = 3600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.connection_trace = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.db_inc_retry_interval = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.db_max_retries = 20 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.db_max_retry_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.db_retry_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.max_overflow = 50 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service 
[None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.max_pool_size = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.max_retries = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.mysql_enable_ndb = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.mysql_wsrep_sync_wait = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.pool_timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.retry_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.slave_connection = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] database.sqlite_synchronous = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.backend = sqlalchemy {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.connection = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.connection_debug = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.connection_parameters = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.connection_recycle_time = 3600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: 
DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.connection_trace = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.db_inc_retry_interval = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.db_max_retries = 20 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.db_max_retry_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.db_retry_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.max_overflow = 50 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.max_pool_size = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.max_retries = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.mysql_enable_ndb = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.pool_timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.retry_interval = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.slave_connection = **** {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] api_database.sqlite_synchronous = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] devices.enabled_mdev_types = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ephemeral_storage_encryption.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.api_servers = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.debug = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.default_trusted_certificate_ids = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
glance.enable_certificate_validation = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.enable_rbd_download = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.num_retries = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.rbd_ceph_conf = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.rbd_connect_timeout = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.rbd_pool = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.rbd_user = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.service_type = image {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.verify_glance_signatures = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] glance.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] guestfs.debug = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.config_drive_cdrom = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.config_drive_inject_password = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.enable_instance_metrics_collection = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.enable_remotefx = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
hyperv.instances_path_share = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.iscsi_initiator_list = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.limit_cpu_features = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.power_state_check_timeframe = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.use_multipath_io = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.volume_attach_retry_count = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.vswitch_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] mks.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service 
[None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.manager_interval = 2400 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.precache_concurrency = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.remove_unused_base_images = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] image_cache.subdirectory_name = _base {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.api_max_retries = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.api_retry_interval = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.auth_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.partition_key = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.peer_list = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.serial_console_state_timeout = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.service_type = baremetal {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG 
oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ironic.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] key_manager.fixed_key = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.barbican_api_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.barbican_endpoint = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.barbican_endpoint_type = public {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.barbican_region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.certfile = None {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.number_of_retries = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.retry_delay = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.send_service_user_token = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.verify_ssl = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican.verify_ssl_path = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.auth_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.certfile = None {{(pid=71628) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] barbican_service_user.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.approle_role_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.approle_secret_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.kv_mountpoint = secret {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.kv_version = 2 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.namespace = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.root_token_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.ssl_ca_crt_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.use_ssl = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.insecure = False {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.service_type = identity {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] keystone.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.connection_uri = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_mode = custom {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_model_extra_flags = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: WARNING oslo_config.cfg [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_power_governor_high = performance {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_power_governor_low = powersave {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_power_management = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.device_detach_attempts = 8 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.device_detach_timeout = 20 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.disk_cachemodes = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.disk_prefix = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.enabled_perf_events = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.file_backed_memory = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.gid_maps = [] 
{{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.hw_disk_discard = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.hw_machine_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_rbd_ceph_conf = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_rbd_glance_store_name = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_rbd_pool = rbd {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_type = default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.images_volume_group = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.inject_key = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.inject_partition = -2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.inject_password = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.iscsi_iface = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
libvirt.iser_use_multipath = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_bandwidth = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_downtime = 500 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_inbound_addr = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_permit_post_copy = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_scheme = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_timeout_action = abort {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_tunnelled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: WARNING oslo_config.cfg [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 17 17:29:22 user nova-compute[71628]: live_migration_uri is deprecated for removal in favor of two other options that Apr 17 17:29:22 user nova-compute[71628]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 17 17:29:22 user nova-compute[71628]: and 
``live_migration_inbound_addr`` respectively. Apr 17 17:29:22 user nova-compute[71628]: ). Its value may be silently ignored in the future. Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.live_migration_with_native_tls = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.max_queues = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.nfs_mount_options = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_iser_scan_tries = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_memory_encrypted_guests = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_pcie_ports = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.num_volume_scan_tries = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.pmem_namespaces = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.quobyte_client_cfg = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rbd_connect_timeout = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rbd_secret_uuid = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rbd_user = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.remote_filesystem_transport = ssh {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rescue_image_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rescue_kernel_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rescue_ramdisk_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.rx_queue_size = None {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.smbfs_mount_options = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.snapshot_compression = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.snapshot_image_format = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.sparse_logical_volumes = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.swtpm_enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.swtpm_group = tss {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.swtpm_user = tss {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.sysinfo_serial = unique {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.tx_queue_size = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.uid_maps = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.use_virtio_for_bridges = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
libvirt.virt_type = kvm {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.volume_clear = zero {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.volume_clear_size = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.volume_use_multipath = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_cache_path = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_mount_group = qemu {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_mount_opts = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.vzstorage_mount_user = stack {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.auth_type = password {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.default_floating_pool = public {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.extension_sync_interval = 600 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.http_retries = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: 
DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.ovs_bridge = br-int {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.physnets = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.region_name = RegionOne {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.service_metadata_proxy = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.service_type = network {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] neutron.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] notifications.bdms_in_notifications = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] notifications.default_level = INFO {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] notifications.notification_format = unversioned {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] notifications.notify_on_state_change = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] pci.alias = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] pci.device_spec = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] pci.report_in_placement = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.auth_type = password {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.connect_retry_delay = None {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.default_domain_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.default_domain_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.domain_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.domain_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.password = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.project_domain_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.project_domain_name = Default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.project_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.project_name = service {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.region_name = RegionOne {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.service_type = placement {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.system_scope = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.trust_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.user_domain_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.user_domain_name = Default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.user_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.username = placement {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] placement.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.cores = 20 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.count_usage_from_placement = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.injected_file_content_bytes = 10240 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.injected_file_path_length = 255 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.injected_files = 5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.instances = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.key_pairs = 100 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.metadata_items = 128 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.ram = 51200 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.recheck_quota = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.server_group_members = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] quota.server_groups = 10 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rdp.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.image_metadata_prefilter = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.max_attempts = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.max_placement_results = 1000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.query_placement_for_availability_zone = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.query_placement_for_image_type_support = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] scheduler.workers = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 
17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.host_subset_size = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.isolated_hosts = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: 
DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.isolated_images = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.pci_in_placement = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.track_instance_changes = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metrics.required = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service 
[None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metrics.weight_multiplier = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] metrics.weight_setting = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.port_range = 10000:20000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] serial_console.serialproxy_port = 6083 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.auth_type = password {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 
17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.send_service_user_token = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] service_user.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.agent_enabled = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.html5proxy_port = 6082 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.image_compression = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.jpeg_compression = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.playback_compression = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.server_listen = 127.0.0.1 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.streaming_mode = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] spice.zlib_compression = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] upgrade_levels.baseapi = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] upgrade_levels.cert = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] upgrade_levels.compute = auto {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] upgrade_levels.conductor = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] upgrade_levels.scheduler = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.auth_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vendordata_dynamic_auth.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.api_retry_count = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.ca_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.cache_prefix = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.cluster_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.connection_pool_size = 10 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.console_delay_seconds = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.datastore_regex = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.host_ip = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.host_password = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.host_port = 443 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
vmware.host_username = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.integration_bridge = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.maximum_objects = 100 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.pbm_default_policy = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.pbm_enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.pbm_wsdl_location = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.serial_port_proxy_uri = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.serial_port_service_uri = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.task_poll_interval = 0.5 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.use_linked_clone = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.vnc_keymap = en-us {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vmware.vnc_port = 5900 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
vmware.vnc_port_total = 10000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.auth_schemes = ['none'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.enabled = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.novncproxy_port = 6080 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.server_listen = 0.0.0.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.vencrypt_ca_certs = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.vencrypt_client_cert = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vnc.vencrypt_client_key = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG 
oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.disable_rootwrap = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.enable_numa_live_migration = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.libvirt_disable_apic = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None 
req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.client_socket_timeout = 900 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.default_pool_size = 1000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.keep_alive = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.max_header_line = 16384 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.secure_proxy_ssl_header = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.ssl_ca_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.ssl_cert_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.ssl_key_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.tcp_keepidle = 600 {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] zvm.ca_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] zvm.cloud_connector_url = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] zvm.reachable_timeout = 300 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.enforce_new_defaults = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.enforce_scope = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.policy_default_rule = default {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.policy_file = policy.yaml {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.connection_string = messaging:// {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.enabled = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.es_doc_type = notification {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.es_scroll_size = 10000 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.es_scroll_time = 2m {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.filter_error_trace = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.hmac_keys = SECRET_KEY {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.sentinel_service_name = mymaster {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] profiler.socket_timeout = 0.1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
profiler.trace_sqlalchemy = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] remote_debug.host = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] remote_debug.port = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71628) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_rabbit.ssl_version = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_notifications.retry = -1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_messaging_notifications.transport_url = **** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.auth_section = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.auth_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.cafile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.certfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.collect_timing = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.connect_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.connect_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.endpoint_id = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.endpoint_override = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.insecure = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.keyfile = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.max_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.min_version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.region_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.service_name = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.service_type = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user 
nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.split_loggers = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.status_code_retries = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.status_code_retry_delay = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.timeout = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.valid_interfaces = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_limit.version = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_reports.file_event_handler = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] oslo_reports.log_dir = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.group = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] vif_plug_ovs_privileged.user = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.flat_interface = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71628) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.isolate_vif = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.ovsdb_interface = native {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_vif_ovs.per_port_bridge = False {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] os_brick.lock_path = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.capabilities = [21] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.group = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.helper_command = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None 
req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] privsep_osbrick.user = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.group = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.helper_command = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] nova_sys_admin.user = None {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG oslo_service.service [None req-5e81d397-86b7-4181-b50c-4c6c82dd992b None None] ******************************************************************************** {{(pid=71628) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 17 17:29:22 user nova-compute[71628]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Starting native event thread {{(pid=71628) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Starting green dispatch thread {{(pid=71628) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Starting connection event dispatch thread {{(pid=71628) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Connecting to libvirt: qemu:///system {{(pid=71628) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Registering for lifecycle events {{(pid=71628) 
Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Registering for connection events: {{(pid=71628) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}}
Apr 17 17:29:22 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Connection event '1' reason 'None'
Apr 17 17:29:22 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found.
Apr 17 17:29:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.volume.mount [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Initialising _HostMountState generation 0 {{(pid=71628) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}}
Apr 17 17:29:29 user nova-compute[71628]: INFO nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host capabilities
[libvirt host capabilities XML elided: the XML markup was lost in this capture, leaving only element text interleaved with repeated syslog prefixes. Recoverable details: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; arch x86_64; CPU model IvyBridge-IBRS, vendor Intel; migration transports tcp and rdma; memory/pages figures 8189224 / 2047306 / 0 and 8218764 / 2054691 / 0; security models apparmor (DOI 0) and dac (DOI 0, base label +64055:+108); hvm guest entries (wordsize 32 or 64) for the emulators /usr/bin/qemu-system-alpha, -arm, -aarch64, -cris, -i386, -m68k, -microblaze, -microblazeel, -mips, -mipsel, -mips64, -mips64el, -ppc, -ppc64, -ppc64le, -riscv32, -riscv64, -s390x, -sh4, -sh4eb, -sparc, -sparc64, -x86_64, -xtensa and -xtensaeb, each followed by its list of supported machine types (virt-*, pc-i440fx-*, pc-q35-*, pseries-*, s390-ccw-virtio-* and board-specific types).]
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
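The Getting/Error pairs above are the driver probing libvirt for domain capabilities with virt_type kvm for every installed emulator; for the non-x86 emulators libvirt answers with error code 8 (VIR_ERR_INVALID_ARG), which is expected on this host and logged only at DEBUG level. A minimal sketch of one such probe, assuming libvirt-python and a qemu:///system connection (the emulator path is taken from the log; everything else is an assumption):

    import libvirt

    # Probe one arch/emulator combination the same way the records above do.
    conn = libvirt.openReadOnly('qemu:///system')  # assumed URI
    try:
        xml = conn.getDomainCapabilities(
            '/usr/bin/qemu-system-alpha',  # emulatorbin (from the log)
            'alpha',                       # arch
            None,                          # machine
            'kvm',                         # virttype
            0,                             # flags
        )
        print(xml)
    except libvirt.libvirtError as exc:
        # On this host this raises error code 8 (VIR_ERR_INVALID_ARG):
        # "invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host"
        print(exc.get_error_code(), exc.get_error_message())
    finally:
        conn.close()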
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for i686 via machine types: {'ubuntu', 'ubuntu-q35', 'pc', 'q35'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [multi-line domainCapabilities XML; element tags were lost in capture, only element text survives: emulator /usr/bin/qemu-system-i386, domain kvm, canonical machine pc-i440fx-jammy, arch i686; firmware loader values /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd, /usr/share/OVMF/OVMF_CODE.ms.fd with types rom, pflash, readonly yes/no, secure no, plus scattered on/off enum values; host CPU model IvyBridge-IBRS, vendor Intel; CPU models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Icelake-Client-noTSX, Icelake-Client, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing file, anonymous, memfd; disk devices disk, cdrom, floppy, lun; disk buses ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics sdl, vnc, spice, egl-headless; hostdev mode subsystem with startupPolicy default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with backends passthrough, emulator] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domainCapabilities XML, tags lost in capture; same element values as the machine_type=ubuntu entry above except canonical machine pc-q35-jammy and no ide entry in the disk-bus list] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML, tags lost in capture; same element values as the machine_type=ubuntu entry above except canonical machine pc-i440fx-6.2] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML, tags lost in capture; same element values as the machine_type=ubuntu entry above except canonical machine pc-q35-6.2 and no ide entry in the disk-bus list] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for ppc64le via machine types: {'powernv', 'pseries'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu', 'ubuntu-q35', 'pc', 'q35'} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [domainCapabilities XML elided (markup lost in capture); recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-i440fx-jammy, arch x86_64, firmware efi with loader /usr/share/OVMF/OVMF_CODE_4M.fd (types rom/pflash, readonly yes/no, secure no), host-model CPU IvyBridge-IBRS (vendor Intel), custom CPU models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Icelake-Client-noTSX, Icelake-Client, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses ide/fdc/scsi/virtio/usb/sata, models virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem with startupPolicy default/mandatory/requisite/optional, types usb/pci/scsi, models virtio/virtio-transitional/virtio-non-transitional; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb, backends passthrough/emulator] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [domainCapabilities XML elided (markup lost in capture); same values as for machine_type=ubuntu above except: machine pc-q35-jammy, loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd, loader readonly yes/no and secure yes/no, and no ide disk bus] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML elided (markup lost in capture); same values as for machine_type=ubuntu above except machine pc-i440fx-6.2] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML elided (markup lost in capture); same values as for machine_type=ubuntu-q35 above except machine pc-q35-6.2] {{(pid=71628) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71628) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71628) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Checking secure boot support for host arch (x86_64) {{(pid=71628) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Checking secure boot support for host arch (x86_64) {{(pid=71628) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Checking secure boot support for host arch (x86_64) {{(pid=71628) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 17:29:29 user nova-compute[71628]: INFO nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Secure Boot support detected
Apr 17 17:29:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] cpu compare xml: [CPU XML elided (markup lost in capture); recoverable value: model Nehalem] {{(pid=71628) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 17 17:29:29 user nova-compute[71628]: INFO nova.virt.node [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Generated node identity d1bd72d4-058c-4e3c-95bb-8ce522bd5058
Apr 17 17:29:29 user nova-compute[71628]: INFO nova.virt.node [None
req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Wrote node identity d1bd72d4-058c-4e3c-95bb-8ce522bd5058 to /opt/stack/data/nova/compute_id Apr 17 17:29:29 user nova-compute[71628]: WARNING nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Compute nodes ['d1bd72d4-058c-4e3c-95bb-8ce522bd5058'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. Apr 17 17:29:30 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 17 17:29:30 user nova-compute[71628]: WARNING nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 17 17:29:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:29:30 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:29:30 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Hypervisor/Node resource view: name=user free_ram=10822MB free_disk=26.978775024414062GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:29:30 user nova-compute[71628]: WARNING nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] No compute node record for user:d1bd72d4-058c-4e3c-95bb-8ce522bd5058: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host d1bd72d4-058c-4e3c-95bb-8ce522bd5058 could not be found. Apr 17 17:29:30 user nova-compute[71628]: INFO nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Compute node record created for user:user with uuid: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:29:30 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [req-9e5ee337-7e56-467d-b24b-ab0acc643fd0] Created resource provider record via placement API for resource provider with UUID d1bd72d4-058c-4e3c-95bb-8ce522bd5058 and name user. 
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71628) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}}
Apr 17 17:29:30 user nova-compute[71628]: INFO nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] kernel doesn't support AMD SEV
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Libvirt baseline CPU
Apr 17 17:29:30 user nova-compute[71628]: x86_64
Apr 17 17:29:30 user nova-compute[71628]: Nehalem
Apr 17 17:29:30 user nova-compute[71628]: Intel
Apr 17 17:29:30 user nova-compute[71628]:
Apr 17 17:29:30 user nova-compute[71628]:
Apr 17 17:29:30 user nova-compute[71628]: {{(pid=71628) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Updated inventory for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Updating resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 generation from 0 to 1 during operation: update_inventory {{(pid=71628) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
Apr 17 17:29:30 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Updating resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 generation from 1 to 2 during operation: update_traits {{(pid=71628) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
Apr 17 17:29:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
Apr 17 17:29:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:29:31 user nova-compute[71628]: DEBUG nova.service [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Creating RPC server for service compute {{(pid=71628) start /opt/stack/nova/nova/service.py:182}}
Apr 17 17:29:31 user nova-compute[71628]: DEBUG nova.service [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Join ServiceGroup membership for this service compute {{(pid=71628) start /opt/stack/nova/nova/service.py:199}}
Apr 17 17:29:31 user nova-compute[71628]: DEBUG nova.servicegroup.drivers.db [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71628) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
Apr 17 17:29:53 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:29:53 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
Apr 17 17:30:22 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:30:22 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=10218MB free_disk=26.892528533935547GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:30:22 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
Apr 17 17:31:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:31:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=10185MB free_disk=26.93852996826172GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:31:23 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:31:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:31:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}}
Apr 17 17:31:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}}
Apr 17 17:31:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}}
Apr 17 17:31:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
Apr 17 17:32:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:32:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=10149MB free_disk=26.714744567871094GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:32:23 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:32:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}}
Apr 17 17:32:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}}
Apr 17 17:32:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}}
Apr 17 17:32:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:32:26 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
Apr 17 17:33:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:33:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=9440MB free_disk=26.736934661865234GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:33:23 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:26 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:26 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:33:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:33:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:33:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:33:32 user nova-compute[71628]: INFO nova.compute.claims [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Claim successful on node user Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:33:32 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:33:32 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Creating image(s) Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "/opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "/opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "/opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:32 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.part --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG nova.policy [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '33f713b19cdf41bc9d56ee7cea3722ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5413283bcdd4120a73a64d76459853a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.part --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG nova.virt.images [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] 82e42adf-a9f9-4d9b-9bd0-106a738b1690 was qcow2, converting to raw {{(pid=71628) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:33:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.part /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.converted {{(pid=71628) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.part /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.converted" returned: 0 in 0.295s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.converted --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062.converted --force-share --output=json" returned: 0 in 0.183s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.437s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:34 user nova-compute[71628]: INFO oslo.privsep.daemon [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpu6fbn8a7/privsep.sock'] Apr 17 17:33:34 user sudo[80346]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpu6fbn8a7/privsep.sock Apr 17 17:33:34 user sudo[80346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:33:34 user nova-compute[71628]: INFO nova.compute.claims [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Claim successful on node user Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:33:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:33:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:33:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:33:35 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:33:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Creating image(s) Apr 17 17:33:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "/opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "/opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.004s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "/opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:35 user nova-compute[71628]: DEBUG nova.policy [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cbe09b575424462398089e0895c86828', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '618ff48e86e344939d81482da314300e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:33:36 user sudo[80346]: pam_unix(sudo:session): session closed for user root Apr 17 17:33:36 user nova-compute[71628]: INFO oslo.privsep.daemon [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Spawned new privsep daemon via rootwrap Apr 17 17:33:36 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 17 17:33:36 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 17 17:33:36 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 17 17:33:36 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80352 Apr 17 17:33:36 user nova-compute[71628]: WARNING oslo_privsep.priv_context [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] privsep daemon already running Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.178s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Successfully created port: dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.175s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw 
/opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk 1073741824" returned: 0 in 0.045s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.225s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.184s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock 
"f2ab32f4-ce85-49d6-bf7d-a9219789a545" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.145s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.159s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.005s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk 1073741824" returned: 0 in 0.060s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.235s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Cannot resize image /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.objects.instance [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lazy-loading 'migration_context' on Instance uuid ddeef235-f0ed-411b-8bf5-9a880394bb36 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:33:36 user nova-compute[71628]: INFO nova.compute.claims [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Claim successful on node user Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Ensure instance console log exists: /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.disk.api 
[None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.125s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Cannot resize image /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk to a smaller size. 
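The repeated "prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ... --force-share --output=json" commands in the records above are what oslo.concurrency emits when a caller passes a ProcessLimits object to processutils.execute(); the JSON output is how nova decides, as logged here, that a disk cannot be resized to a smaller size. A minimal sketch of that invocation pattern, using oslo.concurrency's public interface (the helper name and path are illustrative; the limits are copied from the log):

    from oslo_concurrency import processutils
    from oslo_utils import units

    # Cap address space at 1 GiB and CPU time at 30 s, matching the
    # "--as=1073741824 --cpu=30" arguments visible in the journal.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(cpu_time=30,
                                                 address_space=1 * units.Gi)

    def qemu_img_info(path):
        # execute() prepends "python -m oslo_concurrency.prlimit --as=... --cpu=... --"
        # to the command line when a prlimit object is supplied.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return out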
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.objects.instance [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'migration_context' on Instance uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Ensure instance console log exists: /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': 
{'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Successfully created port: 8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.477s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.406s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:33:37 user nova-compute[71628]: INFO nova.compute.claims [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Claim successful on node user Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Allocating IP information in the background. 
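The inventory payload reported for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 is what bounds further claims against this host: for each resource class, Placement treats usable capacity as (total - reserved) * allocation_ratio, with min_unit, max_unit and step_size constraining individual allocations. A quick check against the values in the log (my own arithmetic, not output from the service):

    # Inventory as reported above for provider d1bd72d4-...
    inventory = {
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # MEMORY_MB 15511.0, VCPU 48.0, DISK_GB 40.0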
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:33:37 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:33:37 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Creating image(s) Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "/opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "/opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock 
"/opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.policy [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cbcda57441d43e0bb8dfee4768df2a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70d10a379e4e420e9c66476ae0b10507', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.353s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Start building 
networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:33:37 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.171s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk 1073741824" returned: 0 in 0.051s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.225s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Start spawning the instance 
on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:33:37 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Creating image(s) Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "/opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "/opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "/opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.157s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user 
nova-compute[71628]: DEBUG nova.virt.disk.api [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.145s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Cannot resize image /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.objects.instance [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'migration_context' on Instance uuid f2ab32f4-ce85-49d6-bf7d-a9219789a545 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.policy [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a6e712adada44971a7fcac5fe1881883', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b5443ac3e3e45888d6a42642e53c687', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Ensure instance console log exists: /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] 
Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk 1073741824" returned: 0 in 0.052s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.204s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.122s {{(pid=71628) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Cannot resize image /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk to a smaller size. 
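Every "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ... held" triple in this trace comes from oslo.concurrency's lock decorator; here the lock named after the cached base image file (d707d9baa21ef9cbafe179e13cf40c2bff580062) serialises the concurrent create_qcow2_image calls so that only one overlay creation at a time works against that backing file. A minimal sketch of the pattern, assuming oslo.concurrency's public API (the function body is illustrative, not nova's exact code):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('d707d9baa21ef9cbafe179e13cf40c2bff580062')
    def create_qcow2_image():
        # Held while "qemu-img create -f qcow2 -o backing_file=..." runs;
        # the decorator logs the acquire/wait/held timings seen above.
        pass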
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.objects.instance [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lazy-loading 'migration_context' on Instance uuid e4d62df0-41e5-4351-a4de-5c0d88a9ab5f {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Ensure instance console log exists: /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Successfully updated port: dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:33:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 
tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquired lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Successfully updated port: 8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquired lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-changed-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Refreshing instance network info cache due to event network-changed-dcd09a73-2587-46b6-95cc-57f1505c9993. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] Acquiring lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Successfully created port: 725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-changed-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Refreshing instance network info cache due to event network-changed-8ab8313d-a088-414c-9d46-1d3385707c18. 
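The "Received event network-changed-..." records above arrive through Nova's os-server-external-events API, which Neutron calls whenever a port it is wiring up changes; the compute manager then refreshes the instance's network info cache, as logged. A sketch of the request body Neutron's notifier sends, with the server and port values copied from the log (field names per the compute API; this is not captured traffic):

    # POST /v2.1/os-server-external-events, sent with service credentials.
    payload = {
        "events": [{
            "name": "network-changed",
            "server_uuid": "ddeef235-f0ed-411b-8bf5-9a880394bb36",
            "tag": "8ab8313d-a088-414c-9d46-1d3385707c18",   # the port id
        }]
    }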
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:33:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] Acquiring lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Updating instance_info_cache with network_info: [{"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Releasing lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Instance network_info: |[{"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] Acquired lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Refreshing network info cache for port 8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Start _get_guest_xml network_info=[{"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:33:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
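Four tempest builds (req-3714b783..., req-abad4f1d..., req-d587e2b8... and req-f7b1becd...) are interleaved throughout this journal, so the practical way to follow a single boot from claim to spawn is to filter records by the request id carried in each log context. A small self-contained filter along those lines (the request-id format is simply what appears in these lines):

    import re
    import sys

    # Request ids look like "req-<uuid>" in the bracketed context of each record.
    REQ_ID = re.compile(r'req-[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}')

    def follow(stream, req_id):
        """Yield only the journal lines that belong to one request."""
        for line in stream:
            if req_id in REQ_ID.findall(line):
                yield line

    # Usage: python follow_req.py req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de < nova-compute.log
    if __name__ == '__main__':
        for line in follow(sys.stdin, sys.argv[1]):
            sys.stdout.write(line)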
Apr 17 17:33:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None 
req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1484446540',display_name='tempest-DeleteServersTestJSON-server-1484446540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1484446540',id=2,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='618ff48e86e344939d81482da314300e',ramdisk_id='',reservation_id='r-vh6hir9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1154435592',owner_user_name='tempest-DeleteServersTestJSON-1154435592-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:35Z,user_data=None,user_id='cbe09b575424462398089e0895c86828',uuid=ddeef235-f0ed-411b-8bf5-9a880394bb36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converting VIF {"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": 
"fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.objects.instance [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lazy-loading 'pci_devices' on Instance uuid ddeef235-f0ed-411b-8bf5-9a880394bb36 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] End _get_guest_xml xml= Apr 17 17:33:41 user nova-compute[71628]: ddeef235-f0ed-411b-8bf5-9a880394bb36 Apr 17 17:33:41 user nova-compute[71628]: instance-00000002 Apr 17 17:33:41 user nova-compute[71628]: 131072 Apr 17 17:33:41 user nova-compute[71628]: 1 Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: tempest-DeleteServersTestJSON-server-1484446540 Apr 17 17:33:41 user nova-compute[71628]: 2023-04-17 17:33:41 Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: 128 Apr 17 17:33:41 user nova-compute[71628]: 1 Apr 17 17:33:41 user nova-compute[71628]: 0 Apr 17 17:33:41 user nova-compute[71628]: 0 Apr 17 17:33:41 user nova-compute[71628]: 1 Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: tempest-DeleteServersTestJSON-1154435592-project-member Apr 17 17:33:41 user nova-compute[71628]: tempest-DeleteServersTestJSON-1154435592 Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: 
Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: OpenStack Foundation Apr 17 17:33:41 user nova-compute[71628]: OpenStack Nova Apr 17 17:33:41 user nova-compute[71628]: 0.0.0 Apr 17 17:33:41 user nova-compute[71628]: ddeef235-f0ed-411b-8bf5-9a880394bb36 Apr 17 17:33:41 user nova-compute[71628]: ddeef235-f0ed-411b-8bf5-9a880394bb36 Apr 17 17:33:41 user nova-compute[71628]: Virtual Machine Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: hvm Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Nehalem Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: /dev/urandom Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: Apr 17 17:33:41 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1484446540',display_name='tempest-DeleteServersTestJSON-server-1484446540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1484446540',id=2,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='618ff48e86e344939d81482da314300e',ramdisk_id='',reservation_id='r-vh6hir9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1154435592',owner_user_name='tempest-DeleteServersTestJSON-1154435592-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:35Z,user_data=None,user_id='cbe09b575424462398089e0895c86828',uuid=ddeef235-f0ed-411b-8bf5-9a880394bb36,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converting VIF {"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": 
"fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG os_vif [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Created schema index Interface.name {{(pid=71628) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Created schema index Port.name {{(pid=71628) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Created schema index Bridge.name {{(pid=71628) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] tcp:127.0.0.1:6640: entering 
CONNECTING {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [POLLOUT] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:33:41 user nova-compute[71628]: INFO oslo.privsep.daemon [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpiqhaies5/privsep.sock'] Apr 17 17:33:41 user sudo[80424]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpiqhaies5/privsep.sock Apr 17 17:33:41 user sudo[80424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] 
[instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updating instance_info_cache with network_info: [{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Releasing lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Instance network_info: |[{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] Acquired lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Refreshing network info cache for port 
dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Start _get_guest_xml network_info=[{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:33:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:33:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-720085354',display_name='tempest-ServersNegativeTestJSON-server-720085354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-720085354',id=1,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-3uq27e85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=TagList,t
ask_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:33Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=b5fd68bf-3827-41f7-9ffa-ce1060e95f58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.objects.instance [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'pci_devices' on Instance uuid 
b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] End _get_guest_xml xml=
Apr 17 17:33:42 user nova-compute[71628]: [guest XML elided; recoverable values: uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58, name instance-00000001, memory 131072, vcpus 1, nova name tempest-ServersNegativeTestJSON-server-720085354, creation time 2023-04-17 17:33:42, flavor values 128/1/0/0/1, owner user tempest-ServersNegativeTestJSON-1842710030-project-member, project tempest-ServersNegativeTestJSON-1842710030, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, serial b5fd68bf-3827-41f7-9ffa-ce1060e95f58, Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-720085354',display_name='tempest-ServersNegativeTestJSON-server-720085354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-720085354',id=1,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-3uq27e85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:33Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=b5fd68bf-3827-41f7-9ffa-ce1060e95f58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type":
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG os_vif [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Successfully created port: f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Updated VIF entry in instance network info cache for port 8ab8313d-a088-414c-9d46-1d3385707c18. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Updating instance_info_cache with network_info: [{"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0af1dea4-5e24-409b-9381-f1fc39bbc941 req-a7b77211-4466-4cef-b099-9e18986db68a service nova] Releasing lock "refresh_cache-ddeef235-f0ed-411b-8bf5-9a880394bb36" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Successfully updated port: 725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquired lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-changed-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Refreshing instance network info cache due to event network-changed-725be64e-c050-49d6-a87d-5cb5b04e86c0. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:33:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] Acquiring lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:33:43 user sudo[80424]: pam_unix(sudo:session): session closed for user root Apr 17 17:33:43 user nova-compute[71628]: INFO oslo.privsep.daemon [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Spawned new privsep daemon via rootwrap Apr 17 17:33:43 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 17 17:33:43 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 17 17:33:43 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 17 17:33:43 user nova-compute[71628]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80427 Apr 17 17:33:43 user nova-compute[71628]: WARNING oslo_privsep.priv_context [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] privsep daemon already running Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updating instance_info_cache with network_info: [{"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Releasing lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Instance network_info: |[{"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], 
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] Acquired lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Refreshing network info cache for port 725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Start _get_guest_xml network_info=[{"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': 
[{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:33:43 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:33:43 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:33:43 user 
nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1291197301',display_name='tempest-AttachVolumeNegativeTest-server-1291197301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1291197301',id=4,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETy3qN91fAUY59vYTnM+st5lBmsgrYGghrdiDmNZuBShFM/gMR4GlDzNicctF6dMRMbYWda4SIaaWAx7hCS/iUHMA0EEUO+HKkLWyI2QTVq0VaormimlIiLwEnxEYg/qQ==',key_name='tempest-keypair-328622868',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-uj0nocan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=f2ab32f4-ce85-49d6-bf7d-a9219789a545,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.objects.instance [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'pci_devices' on Instance uuid f2ab32f4-ce85-49d6-bf7d-a9219789a545 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] End _get_guest_xml xml= Apr 17 17:33:43 user nova-compute[71628]: f2ab32f4-ce85-49d6-bf7d-a9219789a545 Apr 17 17:33:43 user nova-compute[71628]: instance-00000004 Apr 17 17:33:43 user nova-compute[71628]: 131072 Apr 17 17:33:43 user nova-compute[71628]: 1 Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-server-1291197301 Apr 17 17:33:43 user nova-compute[71628]: 2023-04-17 17:33:43 Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: 128 Apr 17 17:33:43 user nova-compute[71628]: 1 Apr 17 17:33:43 user nova-compute[71628]: 0 Apr 17 17:33:43 user nova-compute[71628]: 0 Apr 17 17:33:43 user nova-compute[71628]: 1 Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846-project-member Apr 17 17:33:43 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846 Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: OpenStack Foundation Apr 17 17:33:43 user nova-compute[71628]: OpenStack Nova Apr 17 17:33:43 user nova-compute[71628]: 0.0.0 Apr 17 17:33:43 user 
nova-compute[71628]: f2ab32f4-ce85-49d6-bf7d-a9219789a545 Apr 17 17:33:43 user nova-compute[71628]: f2ab32f4-ce85-49d6-bf7d-a9219789a545 Apr 17 17:33:43 user nova-compute[71628]: Virtual Machine Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: hvm Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Nehalem Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: /dev/urandom Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: Apr 17 17:33:43 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1291197301',display_name='tempest-AttachVolumeNegativeTest-server-1291197301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1291197301',id=4,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETy3qN91fAUY59vYTnM+st5lBmsgrYGghrdiDmNZuBShFM/gMR4GlDzNicctF6dMRMbYWda4SIaaWAx7hCS/iUHMA0EEUO+HKkLWyI2QTVq0VaormimlIiLwEnxEYg/qQ==',key_name='tempest-keypair-328622868',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-uj0nocan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=f2ab32f4-ce85-49d6-bf7d-a9219789a545,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG os_vif [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcd09a73-25, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdcd09a73-25, col_values=(('external_ids', {'iface-id': 'dcd09a73-2587-46b6-95cc-57f1505c9993', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:8f:aa', 'vm-uuid': 'b5fd68bf-3827-41f7-9ffa-ce1060e95f58'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: INFO os_vif [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ab8313d-a0, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ab8313d-a0, col_values=(('external_ids', {'iface-id': '8ab8313d-a088-414c-9d46-1d3385707c18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:98:cb', 'vm-uuid': 'ddeef235-f0ed-411b-8bf5-9a880394bb36'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: INFO os_vif [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap725be64e-c0, may_exist=True) {{(pid=71628) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap725be64e-c0, col_values=(('external_ids', {'iface-id': '725be64e-c050-49d6-a87d-5cb5b04e86c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:2e:09', 'vm-uuid': 'f2ab32f4-ce85-49d6-bf7d-a9219789a545'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:43 user nova-compute[71628]: INFO os_vif [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:33:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] No VIF found with MAC fa:16:3e:46:8f:aa, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] No VIF found with MAC fa:16:3e:d0:98:cb, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No VIF found with MAC fa:16:3e:d7:2e:09, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.network.neutron [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updated VIF entry in instance network info cache for port dcd09a73-2587-46b6-95cc-57f1505c9993. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG nova.network.neutron [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updating instance_info_cache with network_info: [{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bf100c0-c4dd-4ef4-8f99-a3e547ef0a50 req-5bf1bda1-6d57-499b-adf0-d852661c9a09 service nova] Releasing lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:45 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updated VIF entry in instance network info cache for port 725be64e-c050-49d6-a87d-5cb5b04e86c0. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:33:45 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updating instance_info_cache with network_info: [{"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeb639ca-d8d3-4e3c-a8a4-c14203d2db25 req-79448641-9d85-4d6d-8313-05d0a6154e7b service nova] Releasing lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Successfully updated port: f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquired lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service 
nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-changed-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Refreshing instance network info cache due to event network-changed-f9b1ac87-92c9-4ca2-9721-54337c3c8811. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] Acquiring lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:33:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.neutron [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updating instance_info_cache with network_info: [{"id": 
"f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Releasing lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Instance network_info: |[{"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] Acquired lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.neutron [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Refreshing network info cache for port f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Start _get_guest_xml network_info=[{"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:33:47 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:33:47 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-208750553',display_name='tempest-ServerStableDeviceRescueTest-server-208750553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-208750553',id=3,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAq+BzuxOK8loeWCX7+gm1vtBvHjKX/RB2/Ksbqc7d8Sd4w/uR+fvKlIbAKgPdXXHXmWGfsY644kMqiq2d9A+lV14DR/jIn5cY2HIczrH02adOJG295uKxk0lnYHrPw5Hw==',key_name='tempest-keypair-1425694538',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b5443ac3e3e45888d6a42642e53c687',ramdisk_id='',reservation_id='r-zdz897w0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1249806725',owner_user_name='tempest-ServerStableDeviceRescueTest-1249806725-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a6e712adada44971a7fcac5fe1881883',uuid=e4d62df0-41e5-4351-a4de-5c0d88a9ab5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converting VIF {"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.objects.instance [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lazy-loading 'pci_devices' on Instance uuid e4d62df0-41e5-4351-a4de-5c0d88a9ab5f {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] End _get_guest_xml xml= Apr 17 17:33:47 user nova-compute[71628]: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f Apr 17 17:33:47 user nova-compute[71628]: instance-00000003 Apr 17 17:33:47 user nova-compute[71628]: 131072 Apr 17 17:33:47 user nova-compute[71628]: 1 Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: tempest-ServerStableDeviceRescueTest-server-208750553 Apr 17 17:33:47 user nova-compute[71628]: 2023-04-17 17:33:47 Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: 128 Apr 17 17:33:47 user nova-compute[71628]: 1 Apr 17 17:33:47 user nova-compute[71628]: 0 Apr 17 17:33:47 user nova-compute[71628]: 0 Apr 17 17:33:47 user nova-compute[71628]: 1 Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: tempest-ServerStableDeviceRescueTest-1249806725-project-member Apr 17 17:33:47 user nova-compute[71628]: tempest-ServerStableDeviceRescueTest-1249806725 Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: OpenStack Foundation Apr 17 17:33:47 user nova-compute[71628]: OpenStack Nova Apr 17 
17:33:47 user nova-compute[71628]: 0.0.0 Apr 17 17:33:47 user nova-compute[71628]: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f Apr 17 17:33:47 user nova-compute[71628]: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f Apr 17 17:33:47 user nova-compute[71628]: Virtual Machine Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: hvm Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Nehalem Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: /dev/urandom Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: Apr 17 17:33:47 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-208750553',display_name='tempest-ServerStableDeviceRescueTest-server-208750553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-208750553',id=3,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAq+BzuxOK8loeWCX7+gm1vtBvHjKX/RB2/Ksbqc7d8Sd4w/uR+fvKlIbAKgPdXXHXmWGfsY644kMqiq2d9A+lV14DR/jIn5cY2HIczrH02adOJG295uKxk0lnYHrPw5Hw==',key_name='tempest-keypair-1425694538',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b5443ac3e3e45888d6a42642e53c687',ramdisk_id='',reservation_id='r-zdz897w0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1249806725',owner_user_name='tempest-ServerStableDeviceRescueTest-1249806725-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:33:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a6e712adada44971a7fcac5fe1881883',uuid=e4d62df0-41e5-4351-a4de-5c0d88a9ab5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converting VIF {"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG os_vif [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9b1ac87-92, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9b1ac87-92, col_values=(('external_ids', {'iface-id': 'f9b1ac87-92c9-4ca2-9721-54337c3c8811', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:b7:3b', 'vm-uuid': 'e4d62df0-41e5-4351-a4de-5c0d88a9ab5f'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 
17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:47 user nova-compute[71628]: INFO os_vif [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] No VIF found with MAC fa:16:3e:22:b7:3b, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:33:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] No waiting events found dispatching network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:48 user nova-compute[71628]: WARNING nova.compute.manager [req-492bc3ad-0e88-410f-9a4c-94ced3ee6977 req-86bab26b-2198-479e-a96f-f7ee63fc135d service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received unexpected event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 for instance with vm_state building and task_state spawning. Apr 17 17:33:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching 
network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:48 user nova-compute[71628]: WARNING nova.compute.manager [req-2dc3de52-0aba-480b-895e-20c676c0427a req-c15de21f-3bd6-41cb-a49a-b34d72ac4e91 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state building and task_state spawning. Apr 17 17:33:49 user nova-compute[71628]: DEBUG nova.network.neutron [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updated VIF entry in instance network info cache for port f9b1ac87-92c9-4ca2-9721-54337c3c8811. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:33:49 user nova-compute[71628]: DEBUG nova.network.neutron [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updating instance_info_cache with network_info: [{"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:33:49 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7348bbec-913f-47ef-9897-a2ad26d20f19 req-b4ec98e3-0926-4861-9de0-70aae7a0b239 service nova] Releasing lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:50 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] No waiting events found dispatching network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:50 user nova-compute[71628]: WARNING nova.compute.manager [req-14a534e6-c575-4e6e-9551-1d91467479e4 req-c9756d17-7d10-42ae-99a7-02e364920a15 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received unexpected event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 for instance with vm_state building and task_state spawning. 
Apr 17 17:33:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:50 user nova-compute[71628]: WARNING nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state building and task_state spawning. 
Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] No waiting events found dispatching network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:50 user nova-compute[71628]: WARNING nova.compute.manager [req-0deca827-3f47-443f-9216-6748dda8153d req-9f534f54-3bb0-4da0-93d8-205600375dfb service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received unexpected event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 for instance with vm_state building and task_state spawning. 
Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] VM Resumed (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Instance spawned successfully. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] 
[instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Instance spawned successfully. 
Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None 
req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] VM Started (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Took 18.40 seconds to spawn the instance on the hypervisor. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] VM Resumed (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Instance spawned successfully. Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Took 19.29 seconds to build instance. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Took 13.98 seconds to spawn the instance on the hypervisor. 
Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f7b1becd-42af-4847-b584-ac0da0f94931 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.502s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: 
ddeef235-f0ed-411b-8bf5-9a880394bb36] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] VM Started (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Took 16.69 seconds to spawn the instance on the hypervisor. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] VM Resumed (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Took 14.97 seconds to build instance. 
Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3714b783-9889-4e0b-b0cb-1a84006733df tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.108s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] VM Started (Lifecycle Event) Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:51 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:51 user nova-compute[71628]: INFO nova.compute.manager [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Took 17.50 seconds to build instance. 
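The "Synchronizing instance power state after lifecycle event ..." and "During sync_power_state the instance has a pending task (spawning). Skip." pairs above are the compute manager comparing the power state recorded in the database (0, i.e. NOSTATE) with what the hypervisor reports (1, RUNNING) each time libvirt emits a lifecycle event, and declining to touch the record while another task still owns the instance. A simplified, hedged sketch of that decision follows; the function and constant names are illustrative, not Nova's actual code.

    # Hedged sketch of the power-state sync decision behind the
    # "Synchronizing instance power state ..." / "... pending task ... Skip."
    # records above. Names and the return strings are illustrative only.
    NOSTATE, RUNNING = 0, 1  # matches "DB power_state: 0, VM power_state: 1"

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Decide whether a lifecycle event should update the DB record."""
        if task_state is not None:
            # e.g. task_state == 'spawning': another operation owns the
            # instance right now, so the event is noted and skipped.
            return "skip: pending task (%s)" % task_state
        if db_power_state != vm_power_state:
            return "update DB power_state %s -> %s" % (db_power_state,
                                                       vm_power_state)
        return "in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))
    # -> skip: pending task (spawning)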
Apr 17 17:33:51 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d587e2b8-cfd7-42a1-b8c0-7811fd53e0de tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.699s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] VM Resumed (Lifecycle Event) Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Instance spawned successfully. 
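The Lock "<instance uuid>" acquired/released records above (for example "held 17.699s" for ddeef235-f0ed-411b-8bf5-9a880394bb36) come from oslo.concurrency: the build work is wrapped in a nested function synchronized on the instance UUID, which is why the holder's qualified name reads ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance, and the waited/held timings are logged by lockutils itself at DEBUG level. A minimal sketch of the same pattern, not Nova's actual code, with a placeholder body:

    # Hedged sketch of per-instance serialization with oslo.concurrency, the
    # pattern behind the 'Lock "<uuid>" acquired/released ... held N s' records.
    import time
    from oslo_concurrency import lockutils

    def build_and_run_instance(instance_uuid):
        # The lock name is the instance UUID: builds of different instances
        # run in parallel, while two operations on one instance serialize.
        @lockutils.synchronized(instance_uuid)
        def _locked_do_build_and_run_instance():
            time.sleep(0.1)  # placeholder for claim + network + spawn work

        _locked_do_build_and_run_instance()

    build_and_run_instance("ddeef235-f0ed-411b-8bf5-9a880394bb36")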
Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} 
Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] VM Started (Lifecycle Event) Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Took 14.46 seconds to spawn the instance on the hypervisor. Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:33:52 user nova-compute[71628]: INFO nova.compute.manager [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Took 15.63 seconds to build instance. 
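The "Attempting to register defaults for the following image properties" and "Found default for ..." records above show the libvirt driver noting which bus and device models the guest actually ended up with for properties the image did not set (ide CD-ROM, virtio disk/video/VIF, nothing for input bus or pointer model), so later operations on the instance keep using the same models. A hedged, simplified sketch of that bookkeeping follows, with the values taken from the records above; the storage dict is illustrative (Nova keeps these with the instance's own metadata).

    # Hedged sketch: remember the defaults the driver discovered for image
    # properties the image itself left unset, as in the "Found default for ..."
    # records above. "None" in the log means no default was applied.
    DISCOVERED_DEFAULTS = {}

    def register_default(instance_uuid, prop, value):
        if value is not None:
            DISCOVERED_DEFAULTS.setdefault(instance_uuid, {})[prop] = value

    uuid = "ddeef235-f0ed-411b-8bf5-9a880394bb36"
    for prop, value in [
        ("hw_cdrom_bus", "ide"),
        ("hw_disk_bus", "virtio"),
        ("hw_input_bus", None),
        ("hw_pointer_model", None),
        ("hw_video_model", "virtio"),
        ("hw_vif_model", "virtio"),
    ]:
        register_default(uuid, prop, value)

    print(DISCOVERED_DEFAULTS[uuid])
    # {'hw_cdrom_bus': 'ide', 'hw_disk_bus': 'virtio',
    #  'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}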
Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-abad4f1d-392e-4e0a-9ae4-21a9f23a40ae tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.821s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] No waiting events found dispatching network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:52 user nova-compute[71628]: WARNING nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received unexpected event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 for instance with vm_state active and task_state None. 
Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:52 user nova-compute[71628]: DEBUG nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] No waiting events found dispatching network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:52 user nova-compute[71628]: WARNING nova.compute.manager [req-254a10bf-94f8-4f67-b8f0-0ce317604828 req-588ddb45-741a-4865-ba3a-42881279d57d service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received unexpected event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 for instance with vm_state active and task_state None. 
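The network-vif-plugged handling in the records above follows one pattern: under a per-instance "<uuid>-events" lock the compute manager pops any waiter registered for that event; here the builds had already completed (vm_state active, task_state None) before Neutron's notification arrived, so nothing was waiting and the event is only logged as unexpected, not treated as an error. A hedged sketch of that dispatch logic, with illustrative names rather than Nova's:

    # Hedged sketch of the external-event dispatch reflected above: pop a
    # waiter for (instance, event) under a per-instance lock, warn if none.
    import threading
    from collections import defaultdict

    _waiters = defaultdict(dict)          # instance_uuid -> {event_name: callback}
    _locks = defaultdict(threading.Lock)  # mirrors the "<uuid>-events" locks

    def pop_instance_event(instance_uuid, event_name):
        with _locks[instance_uuid]:
            return _waiters[instance_uuid].pop(event_name, None)

    def external_instance_event(instance_uuid, event_name, vm_state, task_state):
        waiter = pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            print("WARNING: unexpected event %s for instance with vm_state %s "
                  "and task_state %s" % (event_name, vm_state, task_state))
        else:
            waiter()

    external_instance_event(
        "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f",
        "network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811",
        "active", None)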
Apr 17 17:33:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG nova.compute.manager [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG nova.compute.manager [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] No waiting events found dispatching network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:33:55 user nova-compute[71628]: WARNING nova.compute.manager [req-a3d7595c-a396-44fa-8241-57ea0cf6e015 req-785b1d54-cf94-4beb-9e70-59ca6a80214e service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received unexpected event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 for instance with vm_state active and task_state None. 
Apr 17 17:33:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 
tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:58 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:33:58 user nova-compute[71628]: INFO nova.compute.claims [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Claim successful on node user Apr 17 17:33:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.799s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:33:59 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:33:59 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Creating image(s) Apr 17 17:33:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "/opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "/opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "/opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG nova.policy [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb9f6038c3d94f4b8176f52308996012', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd1e8586846543c88d468bb6b705d4a6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:33:59 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.213s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.008s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.225s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 
tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk 1073741824" returned: 0 in 0.067s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.299s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.222s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Cannot resize image /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG nova.objects.instance [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'migration_context' on Instance uuid 0711a965-58ba-4238-aa35-b7f3d762c97d {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Ensure instance console log exists: /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils 
[None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:01 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Successfully created port: 358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Successfully updated port: 358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquired lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG nova.compute.manager [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] 
Received event network-changed-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG nova.compute.manager [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Refreshing instance network info cache due to event network-changed-358fa886-02f3-433a-a1af-d4d2bff8be35. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] Acquiring lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:04 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updating instance_info_cache with network_info: [{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Releasing lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] 
Instance network_info: |[{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] Acquired lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.neutron [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Refreshing network info cache for port 358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Start _get_guest_xml network_info=[{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:06 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:06 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
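The qemu-img records between 17:33:59 and 17:34:00 above show how the root disk for 0711a965-58ba-4238-aa35-b7f3d762c97d is produced: qemu-img info inspects the cached base image (the call is wrapped in oslo_concurrency.prlimit with --as=1073741824 --cpu=30, i.e. a 1 GiB address-space and 30 s CPU cap), then qemu-img create builds a qcow2 overlay backed by that base, sized to the flavor's 1 GiB root disk; the later "Cannot resize image ... to a smaller size" record just means the requested size is not larger than the overlay's current virtual size, so no resize is done. A hedged, stand-alone sketch of those two commands using only the standard library; the paths come from the log and would have to exist for this to run, and the prlimit/env wrapping is omitted.

    # Hedged sketch reproducing the qemu-img sequence in the records above:
    # inspect the cached base image, then create a qcow2 overlay backed by it.
    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062"
    DISK = "/opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk"
    SIZE = 1073741824  # 1 GiB root disk, as in the log

    # qemu-img info <base> --force-share --output=json
    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", BASE, "--force-share", "--output=json"]))
    print("base image virtual size:", info["virtual-size"])

    # qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <disk> <size>
    subprocess.check_call(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", "backing_file=%s,backing_fmt=raw" % BASE,
         DISK, str(SIZE)])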
Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-796323267',display_name='tempest-VolumesAdminNegativeTest-server-796323267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-796323267',id=5,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNA3nzT/eQwwUVq7FVq+WQky0pPjQAJUFSsfHG4AY4cBLpdgzedNQS6Cc0CHRBOmzmW0iPUkSyxk2SpIdk+jLjZmz+UsqyxxI97a2YS5M9WcvyIhUz4nuSC3800u6FkZg==',key_name='tempest-keypair-1999371266',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-rzww6pwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cb9f6038c3d94f4b8176f52308996012',uuid=0711a965-58ba-4238-aa35-b7f3d762c97d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.objects.instance [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'pci_devices' on Instance uuid 0711a965-58ba-4238-aa35-b7f3d762c97d {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] End _get_guest_xml xml= Apr 17 17:34:06 user nova-compute[71628]: 0711a965-58ba-4238-aa35-b7f3d762c97d Apr 17 17:34:06 user nova-compute[71628]: instance-00000005 Apr 17 17:34:06 user nova-compute[71628]: 131072 Apr 17 17:34:06 user nova-compute[71628]: 1 Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-server-796323267 Apr 17 17:34:06 user nova-compute[71628]: 2023-04-17 17:34:06 Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: 128 Apr 17 17:34:06 user nova-compute[71628]: 1 Apr 17 17:34:06 user nova-compute[71628]: 0 Apr 17 17:34:06 user nova-compute[71628]: 0 Apr 17 17:34:06 user nova-compute[71628]: 1 Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-1858597906-project-member Apr 17 17:34:06 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-1858597906 Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: OpenStack Foundation Apr 17 17:34:06 user nova-compute[71628]: OpenStack Nova Apr 17 17:34:06 user nova-compute[71628]: 0.0.0 Apr 17 17:34:06 
user nova-compute[71628]: 0711a965-58ba-4238-aa35-b7f3d762c97d Apr 17 17:34:06 user nova-compute[71628]: 0711a965-58ba-4238-aa35-b7f3d762c97d Apr 17 17:34:06 user nova-compute[71628]: Virtual Machine Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: hvm Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Nehalem Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: /dev/urandom Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: Apr 17 17:34:06 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-796323267',display_name='tempest-VolumesAdminNegativeTest-server-796323267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-796323267',id=5,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNA3nzT/eQwwUVq7FVq+WQky0pPjQAJUFSsfHG4AY4cBLpdgzedNQS6Cc0CHRBOmzmW0iPUkSyxk2SpIdk+jLjZmz+UsqyxxI97a2YS5M9WcvyIhUz4nuSC3800u6FkZg==',key_name='tempest-keypair-1999371266',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-rzww6pwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cb9f6038c3d94f4b8176f52308996012',uuid=0711a965-58ba-4238-aa35-b7f3d762c97d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG os_vif [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap358fa886-02, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap358fa886-02, col_values=(('external_ids', {'iface-id': '358fa886-02f3-433a-a1af-d4d2bff8be35', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:28:88', 'vm-uuid': '0711a965-58ba-4238-aa35-b7f3d762c97d'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:06 user nova-compute[71628]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:06 user nova-compute[71628]: INFO os_vif [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:06 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] No VIF found with MAC fa:16:3e:a1:28:88, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:34:08 user nova-compute[71628]: INFO nova.compute.claims [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Claim successful on node user Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [req-602335ef-27db-4a30-b203-30ac63fa2d88 req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-602335ef-27db-4a30-b203-30ac63fa2d88 
req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-602335ef-27db-4a30-b203-30ac63fa2d88 req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-602335ef-27db-4a30-b203-30ac63fa2d88 req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [req-602335ef-27db-4a30-b203-30ac63fa2d88 req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] No waiting events found dispatching network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:09 user nova-compute[71628]: WARNING nova.compute.manager [req-602335ef-27db-4a30-b203-30ac63fa2d88 req-4cc16822-60ef-4286-9fd2-2f24412032b6 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received unexpected event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 for instance with vm_state building and task_state spawning. 
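Annotation: the AddBridgeCommand / AddPortCommand / DbSetCommand transaction logged earlier in this spawn sequence is the os-vif 'ovs' plugin driving the local ovsdb-server through ovsdbapp. A rough stand-alone sketch of the same sequence using the usual ovsdbapp pattern; the socket path is an assumption (a common default), and the bridge, port name and external_ids are copied from the log:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = "unix:/var/run/openvswitch/db.sock"  # assumption: default ovsdb socket

    # Connect to the Open_vSwitch schema and build the high-level API object.
    idl = connection.OvsdbIdl.from_server(OVSDB, "Open_vSwitch")
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    external_ids = {
        "iface-id": "358fa886-02f3-433a-a1af-d4d2bff8be35",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:a1:28:88",
        "vm-uuid": "0711a965-58ba-4238-aa35-b7f3d762c97d",
    }

    # One transaction: ensure br-int exists, add the tap port, then tag the
    # Interface row so the OVN/OVS agent can bind it, mirroring the commands
    # shown in the log.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br("br-int", may_exist=True, datapath_type="system"))
        txn.add(api.add_port("br-int", "tap358fa886-02", may_exist=True))
        txn.add(api.db_set("Interface", "tap358fa886-02",
                           ("external_ids", external_ids)))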
Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.587s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:34:09 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.network.neutron [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updated VIF entry in instance network info cache for port 358fa886-02f3-433a-a1af-d4d2bff8be35. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.network.neutron [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updating instance_info_cache with network_info: [{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b0c6081a-693d-4cc5-8de0-614635473e48 req-739d2ff8-d890-4b16-ac8d-9e7703e661ee service nova] Releasing lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:34:09 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Creating image(s) Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "/opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "/opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "/opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.part --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.policy [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40fcde98cd664f57a18b27bfa71111e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a6cbba34e8f449c39da5f07463fc4696', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.part --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.virt.images [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] 3e8f092e-58b0-4283-9790-92d661c52d35 was qcow2, converting to raw {{(pid=71628) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:34:09 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.part /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.converted {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.part /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.converted" returned: 0 in 0.139s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 
tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.converted --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33.converted --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.781s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:10 user 
nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] VM Resumed (Lifecycle Event) Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Instance spawned successfully. 
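Annotation: the ed46b107... activity above is Nova's image cache at work: the Glance image is downloaded to a .part file, probed with qemu-img info (wrapped in prlimit to cap address space and CPU time), converted to raw when the probe reports qcow2 ("... was qcow2, converting to raw"), and only then placed into instances/_base. A simplified sketch of just the probe-and-convert step, reusing the same qemu-img invocations as the log; the helper names and paths are illustrative, not Nova's API:

    import json
    import subprocess

    def qemu_img_info(path: str) -> dict:
        # Same probe the log shows, minus the prlimit wrapper Nova adds.
        out = subprocess.run(
            ["qemu-img", "info", path, "--force-share", "--output=json"],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)

    def fetch_to_raw(part_path: str, converted_path: str) -> None:
        info = qemu_img_info(part_path)
        if info["format"] == "qcow2":
            # Corresponds to "<image> was qcow2, converting to raw" in the log.
            subprocess.run(
                ["qemu-img", "convert", "-t", "none", "-O", "raw", "-f", "qcow2",
                 part_path, converted_path],
                check=True,
            )

    # fetch_to_raw("base.part", "base.converted")  # then move into _base/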
Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Synchronizing instance power state after lifecycle event "Resumed"; current 
vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] VM Started (Lifecycle Event) Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json" returned: 0 in 0.142s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33,backing_fmt=raw /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Took 10.84 seconds to spawn the instance on the hypervisor. 
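Annotation: the "Synchronizing instance power state ..." / "... has a pending task (spawning). Skip." pair above shows the lifecycle-event handler comparing the DB power_state (0, NOSTATE) with what libvirt reports (1, RUNNING) and deliberately doing nothing while a task is still in flight. A hypothetical condensation of that decision, with Nova's numeric power states spelled out; this is a sketch of the behaviour visible in the log, not Nova's actual method:

    from typing import Optional

    # Power states as Nova numbers them: 0 = NOSTATE, 1 = RUNNING.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state: int, vm_power_state: int,
                         task_state: Optional[str]) -> str:
        # While a task (here 'spawning') is in flight, the lifecycle/periodic
        # sync must not touch the instance, so it just skips.
        if task_state is not None:
            return f"skip: pending task {task_state}"
        if db_power_state != vm_power_state:
            return f"update DB power_state to {vm_power_state}"
        return "in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))
    # -> skip: pending task spawning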
Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33,backing_fmt=raw /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk 1073741824" returned: 0 in 0.066s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "ed46b107528c10ba8739e8eabae3204ec12d5b33" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.211s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ed46b107528c10ba8739e8eabae3204ec12d5b33 --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:10 user nova-compute[71628]: INFO nova.compute.manager [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Took 12.21 seconds to build instance. Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5fef228b-0376-4726-83a4-5ed8685fbb05 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.400s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json" returned: 0 in 0.174s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Cannot resize image /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:10 user nova-compute[71628]: DEBUG nova.objects.instance [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lazy-loading 'migration_context' on Instance uuid f8891b6c-e3ef-450f-883b-dbfbdb74695b {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Ensure instance console log exists: /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] Lock 
"0711a965-58ba-4238-aa35-b7f3d762c97d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] No waiting events found dispatching network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:11 user nova-compute[71628]: WARNING nova.compute.manager [req-ba6c9b75-1d14-4320-adf3-8f3d6c6ad193 req-e66a4d33-d8cd-43e8-8baa-2bb72ff03bf9 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received unexpected event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 for instance with vm_state active and task_state None. Apr 17 17:34:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Successfully created port: d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Successfully updated port: d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:13 
user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquired lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-changed-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Refreshing instance network info cache due to event network-changed-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] Acquiring lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updating instance_info_cache with network_info: [{"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Releasing lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Instance network_info: |[{"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils 
[req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] Acquired lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Refreshing network info cache for port d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Start _get_guest_xml network_info=[{"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:33:55Z,direct_url=,disk_format='qcow2',id=3e8f092e-58b0-4283-9790-92d661c52d35,min_disk=0,min_ram=0,name='',owner='ddf031285d5144b18c85edde02cb062e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:33:58Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'disk_bus': 'scsi', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/sda', 'encrypted': False, 'image_id': '3e8f092e-58b0-4283-9790-92d661c52d35'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:34:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:33:55Z,direct_url=,disk_format='qcow2',id=3e8f092e-58b0-4283-9790-92d661c52d35,min_disk=0,min_ram=0,name='',owner='ddf031285d5144b18c85edde02cb062e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:33:58Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None 
req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T17:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-681317961',display_name='tempest-AttachSCSIVolumeTestJSON-server-681317961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-681317961',id=6,image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0rkyKp0V5SZOhxPHyUEtqECuAVMZHyYgaOG9wpVWBKD1Xh0wi6GZsBNwJBXSWnsHld+PZfVbQH52x7R9rVDPGGBwRCaULIlIQR6yfqca5udcRvzL8Ig5c9JL2pXdUfPw==',key_name='tempest-keypair-1523922560',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a6cbba34e8f449c39da5f07463fc4696',ramdisk_id='',reservation_id='r-34ik0y8e',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-173041572',owner_user_name='tempest-AttachSCSIVolumeTestJSON-173041572-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='40fcde98cd664f57a18b27bfa71111e6',uuid=f8891b6c-e3ef-450f-883b-dbfbdb74695b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converting VIF {"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lazy-loading 'pci_devices' on Instance uuid f8891b6c-e3ef-450f-883b-dbfbdb74695b {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] End _get_guest_xml xml=
[guest domain XML not preserved in this capture: the element markup was stripped, leaving only these fragments across otherwise empty continuation lines: f8891b6c-e3ef-450f-883b-dbfbdb74695b, instance-00000006, 131072, 1, tempest-AttachSCSIVolumeTestJSON-server-681317961, 2023-04-17 17:34:14, 128, 1, 0, 0, 1, tempest-AttachSCSIVolumeTestJSON-173041572-project-member, tempest-AttachSCSIVolumeTestJSON-173041572, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, /dev/urandom]
Apr 17 17:34:14 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T17:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-681317961',display_name='tempest-AttachSCSIVolumeTestJSON-server-681317961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-681317961',id=6,image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0rkyKp0V5SZOhxPHyUEtqECuAVMZHyYgaOG9wpVWBKD1Xh0wi6GZsBNwJBXSWnsHld+PZfVbQH52x7R9rVDPGGBwRCaULIlIQR6yfqca5udcRvzL8Ig5c9JL2pXdUfPw==',key_name='tempest-keypair-1523922560',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a6cbba34e8f449c39da5f07463fc4696',ramdisk_id='',reservation_id='r-34ik0y8e',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-173041572',owner_user_name='tempest-AttachSCSIVolumeTestJSON-173041572-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='40fcde98cd664f57a18b27bfa71111e6',uuid=f8891b6c-e3ef-450f-883b-dbfbdb74695b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converting VIF {"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG os_vif [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1ceb2db-ff, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd1ceb2db-ff, col_values=(('external_ids', {'iface-id': 'd1ceb2db-ffee-4a4f-88f7-fd36b41ace5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:ee:b2', 'vm-uuid': 'f8891b6c-e3ef-450f-883b-dbfbdb74695b'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:14 user nova-compute[71628]: INFO os_vif [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') Apr 17 17:34:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] No BDM found with device name sda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] No BDM found with device name sdb, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] No VIF found with MAC fa:16:3e:44:ee:b2, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:15 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Using config drive Apr 17 17:34:15 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Creating config drive at /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.config Apr 17 17:34:15 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp8q5vzzlc {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] CMD "genisoimage -o 
/opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp8q5vzzlc" returned: 0 in 0.054s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updated VIF entry in instance network info cache for port d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updating instance_info_cache with network_info: [{"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-c0bd4481-0fb1-4f97-bb17-0f365ae0dca2 req-95afab43-a8c1-4cb3-bba7-f0a4be9c8832 service nova] Releasing lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 
user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] No waiting events found dispatching network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:18 user nova-compute[71628]: WARNING nova.compute.manager [req-d5389641-1c29-4658-b861-3c5c9bb2d84e req-dee58ddf-37e6-4800-b877-884491d1ee75 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received unexpected event network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b for instance with vm_state building and task_state spawning. 
Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] VM Resumed (Lifecycle Event) Apr 17 17:34:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Instance spawned successfully. Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:19 user nova-compute[71628]: 
DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] VM Started (Lifecycle Event) Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Took 10.02 seconds to spawn the instance on the hypervisor. Apr 17 17:34:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:19 user nova-compute[71628]: INFO nova.compute.manager [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Took 10.96 seconds to build instance. 
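The Resumed and Started lifecycle events above each trigger a power-state check, and both are skipped because the instance still carries task_state spawning; only the DB value (0, NOSTATE) versus the hypervisor value (1, RUNNING) is logged. A condensed sketch of that decision, with simplified names and constants rather than Nova's real handle_lifecycle_event / sync logic:

    # Integer power states as they appear in the log:
    # "current DB power_state: 0, VM power_state: 1".
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(instance, vm_power_state):
        """Decide what to do with a hypervisor lifecycle event (sketch)."""
        if instance.task_state is not None:
            # "During sync_power_state the instance has a pending task
            # (spawning). Skip." -- the build path owns the instance state.
            return 'skip'
        if instance.power_state != vm_power_state:
            # Genuinely out of sync: adopt the hypervisor's view and let the
            # normal state machine react.
            instance.power_state = vm_power_state
            return 'updated'
        return 'in-sync'

    class FakeInstance:  # illustrative stand-in
        vm_state, task_state, power_state = 'building', 'spawning', NOSTATE

    assert sync_power_state(FakeInstance(), RUNNING) == 'skip'

Once the spawn finishes and task_state clears, later power-state syncs can reconcile any remaining mismatch.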
Apr 17 17:34:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-02a27a0a-0904-415f-bb31-a39a820d8d86 tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.064s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] No waiting events found dispatching network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:20 user nova-compute[71628]: WARNING nova.compute.manager [req-db9c9231-cfa8-4443-95db-e4ecf97c769d req-b3baef08-8be8-4384-8f35-6dee0f86cdcf service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received unexpected event network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b for instance with vm_state active and task_state None. 
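The two flavors of lock logging in these entries come from oslo.concurrency: the lines that name a function ("Acquiring lock ... by ...", "acquired by ... :: waited", '"released" by ... :: held') are emitted by the wrapper that lockutils.synchronized() installs (the "inner" frames in lockutils.py), while the bare "Acquiring/Acquired/Releasing lock" lines come from the lockutils.lock() context manager. Both default to in-process semaphores keyed by name ("compute_resources", "<uuid>-events", the instance UUID used by _locked_do_build_and_run_instance, and so on). A small usage sketch; the decorated functions themselves are illustrative:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_resource_tracker():
        # Runs with the named semaphore held; entry/exit produce the
        # ":: waited" and ":: held" DEBUG lines seen above.
        pass

    def refresh_instance_cache(uuid):
        # Context-manager form; produces the plain "Acquiring/Acquired/
        # Releasing lock" lines instead.
        with lockutils.lock('refresh_cache-%s' % uuid):
            pass

    update_resource_tracker()
    refresh_instance_cache('f8891b6c-e3ef-450f-883b-dbfbdb74695b')

The long hold time on the instance lock (11.064s above) is simply the full build: the lock named after the instance UUID is held across the whole _locked_do_build_and_run_instance call so that concurrent operations on the same instance serialize.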
Apr 17 17:34:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:21 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] There are 0 instances to clean {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances with incomplete migration {{(pid=71628) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:23 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:34:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json {{(pid=71628) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71628) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:27 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:27 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=7968MB free_disk=26.525283813476562GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": 
"0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance ddeef235-f0ed-411b-8bf5-9a880394bb36 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance e4d62df0-41e5-4351-a4de-5c0d88a9ab5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance f2ab32f4-ce85-49d6-bf7d-a9219789a545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 0711a965-58ba-4238-aa35-b7f3d762c97d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance f8891b6c-e3ef-450f-883b-dbfbdb74695b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:34:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:34:28 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:28 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:28 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:34:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.469s 
{{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updating instance_info_cache with network_info: [{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:34:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: 
waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:34:34 user nova-compute[71628]: INFO nova.compute.claims [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Claim successful on node user Apr 17 17:34:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Refreshing inventories for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Updating ProviderTree inventory for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 
tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Refreshing aggregate associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, aggregates: None {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Refreshing trait associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, traits: COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.540s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:34:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:34:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Creating image(s) Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "/opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "/opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "/opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG nova.policy [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '508ea3148bac4da2bb9e832a227deebe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'799065b9ead7462390f42db66b8db015', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk 1073741824" returned: 0 in 0.047s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.198s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Checking if we can resize image /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk. 
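Aside: the pair of commands captured above, a locale-pinned "qemu-img info --force-share --output=json" probe of the cached base image followed by "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw ... 1073741824" for the per-instance overlay, is easy to reproduce outside nova when debugging image-backend behaviour. A minimal sketch, not nova's own code, assuming only that qemu-img is installed; the paths in the usage comment are placeholders, not the files from this log, and the prlimit wrapper nova adds around the info probe is omitted:

```python
import json
import os
import subprocess

# Pin the C locale for stable output, mirroring the logged
# "env LC_ALL=C LANG=C qemu-img ..." invocations.
ENV = {**os.environ, "LC_ALL": "C", "LANG": "C"}

def qemu_img_info(path):
    """Probe an image the way the log does: --force-share avoids taking a
    write lock on a file another process may already have open."""
    out = subprocess.run(
        ["qemu-img", "info", path, "--force-share", "--output=json"],
        check=True, capture_output=True, text=True, env=ENV,
    ).stdout
    return json.loads(out)

def create_overlay(base, overlay, size_bytes):
    """Create a qcow2 overlay on a raw base image, like the logged
    'qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw'."""
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={base},backing_fmt=raw",
         overlay, str(size_bytes)],
        check=True, env=ENV,
    )

# Example with placeholder paths (stand-ins for _base/<sha1> and instances/<uuid>/disk):
# create_overlay("/tmp/base.img", "/tmp/disk", 1073741824)
```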
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Cannot resize image /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG nova.objects.instance [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lazy-loading 'migration_context' on Instance uuid 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Ensure instance console log exists: /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:34:36 user nova-compute[71628]: INFO nova.compute.claims [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Claim successful on node user Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Allocating IP information in the background. 
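The inventory the report client keeps declaring unchanged (VCPU total 12 at allocation_ratio 4.0, MEMORY_MB 16023 with 512 reserved, DISK_GB 40) bounds what can be claimed on this node: placement treats (total - reserved) * allocation_ratio as the usable capacity per resource class. A small sketch of that arithmetic, using the figures copied from the entry above:

```python
# Figures copied from the 'Inventory has not changed for provider ...' entry above.
INVENTORY = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

def capacity(inv):
    # Effective capacity as placement computes it: (total - reserved) * allocation_ratio.
    return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

for rc, inv in INVENTORY.items():
    print(rc, capacity(inv))
# -> VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0
```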
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:34:36 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Successfully created port: 7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:34:36 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Creating image(s) Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "/opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "/opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "/opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.policy [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3fc1bd85d647d7b1eabca4bf49d42f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63c872fffe164507ab615963a791bfb9', 'project_domain_id': 'default', 
'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.164s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk 1073741824 
{{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk 1073741824" returned: 0 in 0.050s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.158s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Checking if we can resize image /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk. 
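The "Checking if we can resize image ... size=1073741824" probes here, and the matching "Cannot resize image ... to a smaller size." replies, reflect a grow-only guard: the overlay was just created at exactly the flavor's 1 GiB root size (root_gb=1 on m1.nano), so there is nothing to grow and shrinking is refused. A sketch of that check as it appears from these entries, not a copy of nova.virt.disk.api:

```python
import json
import subprocess

def virtual_size(path):
    """Read 'virtual-size' from the qemu-img info JSON, as the probes above do."""
    out = subprocess.run(
        ["qemu-img", "info", path, "--force-share", "--output=json"],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)["virtual-size"]

def can_resize_image(path, requested_bytes):
    """Grow-only guard: refuse when the disk is already at or above the
    requested size, which is what produces the DEBUG entry above."""
    if virtual_size(path) >= requested_bytes:
        print(f"Cannot resize image {path} to a smaller size.")
        return False
    return True

# e.g. can_resize_image("/tmp/disk", 1073741824) returns False right after
# the overlay was created with that exact size.
```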
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Cannot resize image /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.objects.instance [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'migration_context' on Instance uuid 337c511a-a2ed-484e-ab48-31618fa2755e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Ensure instance console log exists: /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Successfully updated port: 7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquired lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-changed-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Refreshing instance network info cache due to event network-changed-7be4b617-5ccc-44ca-96b4-0b5866efaabf. 
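The interleaved lock traffic above comes from two oslo.concurrency idioms: the 'Lock "..." acquired by "..." :: waited/held' lines match the lockutils.synchronized decorator (the inner wrapper at lockutils.py:404/409/423 in this log), while the bare 'Acquiring/Acquired/Releasing lock "refresh_cache-<uuid>"' lines match the lockutils.lock context manager (lockutils.py:312/315/333). A hedged sketch of both idioms; the names and function bodies are placeholders, not nova code:

```python
from oslo_concurrency import lockutils

# Decorator form: serializes callers on a named lock and, at DEBUG level,
# logs the 'acquired by ... :: waited/held' lines seen throughout this log.
@lockutils.synchronized("compute_resources")
def instance_claim_sketch():
    return "claimed"  # placeholder body

# Context-manager form: produces the plain Acquiring/Acquired/Releasing lines,
# as used here for the per-instance refresh_cache-<uuid> lock.
def refresh_cache_sketch(instance_uuid):
    with lockutils.lock("refresh_cache-%s" % instance_uuid):
        pass  # rebuild the network info cache under the lock

if __name__ == "__main__":
    instance_claim_sketch()
    refresh_cache_sketch("82155ce4-e6ec-4ca5-a5f1-0349af7a2678")
```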
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] Acquiring lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.neutron [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Updating instance_info_cache with network_info: [{"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Releasing lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Instance network_info: |[{"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] Acquired lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Refreshing network info cache for port 7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Start _get_guest_xml network_info=[{"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) 
_get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 
17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-2134912985',display_name='tempest-SnapshotDataIntegrityTests-server-2134912985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-2134912985',id=7,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDS5IfCIqcw54T22sSQxOb57X9YP7qwRlmDDHcPViP1FL1hJ7H/9H8+CO7VXtYU/NHQ8MQKj/82S68fnJu/F6xW2G/RRQM9yVEEfwAaawrP+Adsdzv3kxe8r/cs2d9bFlg==',key_name='tempest-SnapshotDataIntegrityTests-784024705',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='799065b9ead7462390f42db66b8db015',ramdisk_id='',reservation_id='r-wq36jbvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1490752994',owner_user_name='tempest-SnapshotDataIntegrityTests-1490752994-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:35Z,user_data=None,user_id='508ea3148bac4da2bb9e832a227deebe',uuid=82155ce4-e6ec-4ca5-a5f1-0349af7a2678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converting VIF {"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.objects.instance [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lazy-loading 'pci_devices' on Instance uuid 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] End _get_guest_xml xml= Apr 17 17:34:38 user nova-compute[71628]: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 Apr 17 17:34:38 user nova-compute[71628]: instance-00000007 Apr 17 17:34:38 user nova-compute[71628]: 131072 Apr 17 17:34:38 user nova-compute[71628]: 1 Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: tempest-SnapshotDataIntegrityTests-server-2134912985 Apr 17 17:34:38 user nova-compute[71628]: 2023-04-17 17:34:38 Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: 128 Apr 17 17:34:38 user nova-compute[71628]: 1 Apr 17 17:34:38 user nova-compute[71628]: 0 Apr 17 17:34:38 user nova-compute[71628]: 0 Apr 17 17:34:38 user nova-compute[71628]: 1 Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: tempest-SnapshotDataIntegrityTests-1490752994-project-member Apr 17 17:34:38 user nova-compute[71628]: tempest-SnapshotDataIntegrityTests-1490752994 Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: OpenStack Foundation Apr 17 17:34:38 user nova-compute[71628]: OpenStack Nova Apr 17 17:34:38 user nova-compute[71628]: 0.0.0 Apr 17 17:34:38 user nova-compute[71628]: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 Apr 17 17:34:38 user 
nova-compute[71628]: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 Apr 17 17:34:38 user nova-compute[71628]: Virtual Machine Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: hvm Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Nehalem Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: /dev/urandom Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: Apr 17 17:34:38 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-2134912985',display_name='tempest-SnapshotDataIntegrityTests-server-2134912985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-2134912985',id=7,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDS5IfCIqcw54T22sSQxOb57X9YP7qwRlmDDHcPViP1FL1hJ7H/9H8+CO7VXtYU/NHQ8MQKj/82S68fnJu/F6xW2G/RRQM9yVEEfwAaawrP+Adsdzv3kxe8r/cs2d9bFlg==',key_name='tempest-SnapshotDataIntegrityTests-784024705',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='799065b9ead7462390f42db66b8db015',ramdisk_id='',reservation_id='r-wq36jbvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1490752994',owner_user_name='tempest-SnapshotDataIntegrityTests-1490752994-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:35Z,user_data=None,user_id='508ea3148bac4da2bb9e832a227deebe',uuid=82155ce4-e6ec-4ca5-a5f1-0349af7a2678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converting VIF {"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG os_vif [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4b617-5c, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7be4b617-5c, col_values=(('external_ids', {'iface-id': '7be4b617-5ccc-44ca-96b4-0b5866efaabf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:19:bc', 'vm-uuid': '82155ce4-e6ec-4ca5-a5f1-0349af7a2678'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:38 user nova-compute[71628]: INFO os_vif [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] No VIF found with MAC fa:16:3e:8e:19:bc, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Updated VIF entry in instance network info cache for port 7be4b617-5ccc-44ca-96b4-0b5866efaabf. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Updating instance_info_cache with network_info: [{"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-50a7342a-11ac-4d4e-97ad-e897ac3b41b1 req-2ad62e27-f23b-48d0-871f-1d1c36c6869b service nova] Releasing lock "refresh_cache-82155ce4-e6ec-4ca5-a5f1-0349af7a2678" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:34:39 user nova-compute[71628]: INFO nova.compute.claims [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Claim successful on node user Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Successfully created port: 25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 
tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.527s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:39 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:34:39 user nova-compute[71628]: INFO nova.compute.claims [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Claim successful on node user Apr 17 17:34:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.policy [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c9e3172f6aeb401cbea2e81c86c614fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c589ed2b5c4abf9fab75e4c36dc3b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:34:40 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Creating image(s) Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "/opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "/opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "/opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Successfully updated port: 25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 
tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquired lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-changed-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Refreshing instance network info cache due to event network-changed-25797ef5-1350-4d57-bd16-5c59918ca955. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] Acquiring lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.167s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] No waiting events found dispatching network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:40 user nova-compute[71628]: WARNING nova.compute.manager [req-80b753e3-6fc4-408f-b1b6-26266a639e12 req-b8a9fb74-cff7-490a-acef-439f22b61622 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received unexpected event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf for instance with vm_state building and task_state spawning. 
Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.222s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.699s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 
47d30b1a-fc08-4cad-8a2e-003b43251518] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk 1073741824" returned: 0 in 0.066s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.291s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:34:40 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Start building block device mappings for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.167s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Checking if we can resize image /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:34:40 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Creating image(s) Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "/opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "/opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "/opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 
tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Cannot resize image /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.objects.instance [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'migration_context' on Instance uuid 71bc60a8-8430-4110-aa0a-0141b6cf2277 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Ensure instance console log exists: /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 
tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:40 user nova-compute[71628]: DEBUG nova.policy [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d573b61dc994a1fa6343b162ac67112', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6952e4623737462a8b8f31ada0786922', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Successfully created port: 
4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk 1073741824" returned: 0 in 0.073s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.232s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Releasing lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Instance network_info: |[{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] Acquired lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Refreshing network info cache for port 25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Start _get_guest_xml network_info=[{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
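Editor's note on the qemu-img calls captured above (a qcow2 overlay created against a cached base image, then a "qemu-img info --force-share --output=json" run under oslo's prlimit wrapper): the same pair of operations can be reproduced outside Nova with a short script. This is a minimal sketch with placeholder paths, not Nova's imagebackend code, and it omits the address-space/CPU limits the service applies.

    import json
    import subprocess

    # Placeholder paths standing in for the cached base image and the
    # per-instance disk referenced in the log entries above.
    BASE = "/opt/stack/data/nova/instances/_base/<base-image-hash>"
    DISK = "/opt/stack/data/nova/instances/<instance-uuid>/disk"
    SIZE_BYTES = 1073741824  # 1 GiB, as passed to "qemu-img create" in the log

    def create_overlay(base, disk, size_bytes):
        """Create a qcow2 overlay backed by a raw base image."""
        subprocess.run(
            ["qemu-img", "create", "-f", "qcow2",
             "-o", f"backing_file={base},backing_fmt=raw",
             disk, str(size_bytes)],
            check=True)

    def image_info(path):
        """Return qemu-img metadata as a dict (--force-share, JSON output)."""
        proc = subprocess.run(
            ["qemu-img", "info", path, "--force-share", "--output=json"],
            check=True, capture_output=True, text=True)
        return json.loads(proc.stdout)

In the log the "qemu-img info" invocation additionally runs under "python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30", which bounds the address space and CPU time of the helper process; the sketch above leaves that out.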
Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 
tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1097947059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1097947059',id=8,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-rsq1vb6x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:37Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=337c511a-a2ed-484e-ab48-31618fa2755e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": 
"fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.objects.instance [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'pci_devices' on Instance uuid 337c511a-a2ed-484e-ab48-31618fa2755e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] End _get_guest_xml xml= Apr 17 17:34:41 user nova-compute[71628]: 337c511a-a2ed-484e-ab48-31618fa2755e Apr 17 17:34:41 user nova-compute[71628]: instance-00000008 Apr 17 17:34:41 user nova-compute[71628]: 131072 Apr 17 17:34:41 user nova-compute[71628]: 1 Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: tempest-ServerBootFromVolumeStableRescueTest-server-1097947059 Apr 17 17:34:41 user nova-compute[71628]: 2023-04-17 17:34:41 Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: 128 Apr 17 17:34:41 user nova-compute[71628]: 1 Apr 17 17:34:41 user nova-compute[71628]: 0 Apr 17 17:34:41 user nova-compute[71628]: 0 Apr 17 17:34:41 user nova-compute[71628]: 1 Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user nova-compute[71628]: tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member Apr 17 17:34:41 user nova-compute[71628]: tempest-ServerBootFromVolumeStableRescueTest-1793110919 Apr 17 17:34:41 user nova-compute[71628]: Apr 17 17:34:41 user 
nova-compute[71628]: [remainder of the libvirt domain XML emitted by _get_guest_xml omitted: the XML markup was stripped in this capture, leaving only bare text fragments such as "OpenStack Foundation", "OpenStack Nova", "0.0.0", "337c511a-a2ed-484e-ab48-31618fa2755e", "Virtual Machine", "hvm", "Nehalem", and "/dev/urandom"] {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1097947059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1097947059',id=8,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-rsq1vb6x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:37Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=337c511a-a2ed-484e-ab48-31618fa2755e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": 
"fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG os_vif [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25797ef5-13, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:41 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25797ef5-13, col_values=(('external_ids', {'iface-id': '25797ef5-1350-4d57-bd16-5c59918ca955', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:1a:f5', 'vm-uuid': '337c511a-a2ed-484e-ab48-31618fa2755e'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Checking if we can resize image /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:41 user nova-compute[71628]: INFO os_vif [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] No VIF found with MAC fa:16:3e:1c:1a:f5, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Cannot resize image /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.objects.instance [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lazy-loading 'migration_context' on Instance uuid 47d30b1a-fc08-4cad-8a2e-003b43251518 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Ensure instance console log exists: /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:41 user nova-compute[71628]: 
DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Successfully updated port: 4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquired lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updated VIF entry in instance network info cache for port 25797ef5-1350-4d57-bd16-5c59918ca955. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b016cd16-d07c-401f-8d4b-b58b6f8a55d4 req-c7e63f05-4c73-49fb-87ce-1e2a16e7ce80 service nova] Releasing lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] VM Resumed (Lifecycle Event) Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Instance spawned successfully. 
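Editor's note on the port plug recorded earlier: os-vif drives it through ovsdbapp transactions (AddBridgeCommand with may_exist=True, AddPortCommand, and a DbSetCommand stamping the Interface's external_ids with iface-id, iface-status, attached-mac and vm-uuid). A rough ovs-vsctl equivalent is sketched below, only as an illustration of what those transactions amount to, not the code path os-vif actually takes; the arguments are placeholders.

    import subprocess

    def plug_ovs_port(bridge, devname, port_id, mac, vm_uuid):
        """Approximate the ovsdbapp transaction from the log with ovs-vsctl."""
        def run(*cmd):
            subprocess.run(cmd, check=True)

        # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
        run("ovs-vsctl", "--may-exist", "add-br", bridge,
            "--", "set", "Bridge", bridge, "datapath_type=system")
        # AddPortCommand(bridge=br-int, port=tapXXXX, may_exist=True)
        run("ovs-vsctl", "--may-exist", "add-port", bridge, devname)
        # DbSetCommand(table=Interface, col_values=('external_ids', {...}))
        run("ovs-vsctl", "set", "Interface", devname,
            "external_ids:iface-id=" + port_id,
            "external_ids:iface-status=active",
            "external_ids:attached-mac=" + mac,
            "external_ids:vm-uuid=" + vm_uuid)

With the values from the transaction above this would be called as plug_ovs_port("br-int", "tap25797ef5-13", "25797ef5-1350-4d57-bd16-5c59918ca955", "fa:16:3e:1c:1a:f5", "337c511a-a2ed-484e-ab48-31618fa2755e").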
Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 
tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Successfully created port: 653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] VM Started (Lifecycle Event) Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Took 7.40 seconds to spawn the instance on the hypervisor. 
Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updating instance_info_cache with network_info: [{"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Releasing lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Instance network_info: |[{"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Start _get_guest_xml network_info=[{"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
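Editor's note on the topology-selection trace seen earlier (flavor/image limits and preferences of 0:0:0, a fallback maximum of 65536 per dimension, and a single possible VirtCPUTopology(cores=1,sockets=1,threads=1) for one vCPU), which repeats for this guest in the entries that follow: a toy enumeration of the same idea is sketched below, assuming 0 means "unset"; it is an illustration only, not nova.virt.hardware.

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """List (sockets, cores, threads) triples whose product equals vcpus."""
        result = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            for cores in range(1, min(vcpus // sockets, max_cores) + 1):
                if (vcpus // sockets) % cores:
                    continue
                threads = vcpus // (sockets * cores)
                if threads <= max_threads:
                    result.append((sockets, cores, threads))
        return result

    # For the one-vCPU m1.nano flavor in the log this yields [(1, 1, 1)],
    # matching "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
    print(possible_topologies(1))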
Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2061242543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2061242543',id=9,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIpnj8oCZHmjWh4/gDcz84BMNxab+jfpVXMZimwXu11yrREHMuJOd47ZIl/xfean6CCKPsZ2ZPqMHgkScca7uI2jKhP6nny/rcjVXtc4nflfFs6mX5AGKoSPNpUMg1KLOA==',key_name='tempest-keypair-1405748905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-9gwmmyo5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=71bc60a8-8430-4110-aa0a-0141b6cf2277,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.objects.instance [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'pci_devices' on Instance uuid 71bc60a8-8430-4110-aa0a-0141b6cf2277 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] End _get_guest_xml xml= Apr 17 17:34:42 user nova-compute[71628]: 71bc60a8-8430-4110-aa0a-0141b6cf2277 Apr 17 17:34:42 user nova-compute[71628]: instance-00000009 Apr 17 17:34:42 user nova-compute[71628]: 131072 Apr 17 17:34:42 user nova-compute[71628]: 1 Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: tempest-AttachVolumeShelveTestJSON-server-2061242543 Apr 17 17:34:42 user nova-compute[71628]: 2023-04-17 17:34:42 Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: 128 Apr 17 17:34:42 user nova-compute[71628]: 1 Apr 17 17:34:42 user nova-compute[71628]: 0 Apr 17 17:34:42 user nova-compute[71628]: 0 Apr 17 17:34:42 user nova-compute[71628]: 1 Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: tempest-AttachVolumeShelveTestJSON-993962804-project-member Apr 17 17:34:42 user nova-compute[71628]: tempest-AttachVolumeShelveTestJSON-993962804 Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: OpenStack Foundation Apr 17 17:34:42 user nova-compute[71628]: OpenStack Nova Apr 17 17:34:42 user nova-compute[71628]: 0.0.0 
Apr 17 17:34:42 user nova-compute[71628]: 71bc60a8-8430-4110-aa0a-0141b6cf2277 Apr 17 17:34:42 user nova-compute[71628]: 71bc60a8-8430-4110-aa0a-0141b6cf2277 Apr 17 17:34:42 user nova-compute[71628]: Virtual Machine Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: hvm Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Nehalem Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: /dev/urandom Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: Apr 17 17:34:42 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2061242543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2061242543',id=9,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIpnj8oCZHmjWh4/gDcz84BMNxab+jfpVXMZimwXu11yrREHMuJOd47ZIl/xfean6CCKPsZ2ZPqMHgkScca7uI2jKhP6nny/rcjVXtc4nflfFs6mX5AGKoSPNpUMg1KLOA==',key_name='tempest-keypair-1405748905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-9gwmmyo5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=71bc60a8-8430-4110-aa0a-0141b6cf2277,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG os_vif [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4952b9cf-93, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4952b9cf-93, col_values=(('external_ids', {'iface-id': '4952b9cf-9376-4952-9f11-0a6d6f3355a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:1b:60', 'vm-uuid': '71bc60a8-8430-4110-aa0a-0141b6cf2277'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:42 user nova-compute[71628]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:42 user nova-compute[71628]: INFO nova.compute.manager [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Took 8.56 seconds to build instance. Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: INFO os_vif [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-6b4fdf14-e4b4-409c-94ea-dcf4b1c1b627 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.653s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] No VIF found with MAC fa:16:3e:43:1b:60, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] No waiting events found dispatching network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:42 user nova-compute[71628]: WARNING nova.compute.manager [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received unexpected event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf for instance with vm_state active and task_state None. Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-changed-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Refreshing instance network info cache due to event network-changed-4952b9cf-9376-4952-9f11-0a6d6f3355a5. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Acquiring lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Acquired lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Refreshing network info cache for port 4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Successfully updated port: 653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquired lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d 
req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-changed-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Refreshing instance network info cache due to event network-changed-653604b7-8213-4fd3-a733-26a32725aae2. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] Acquiring lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:34:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.neutron [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updated VIF entry in instance network info cache for port 4952b9cf-9376-4952-9f11-0a6d6f3355a5. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.neutron [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updating instance_info_cache with network_info: [{"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-115f426d-2158-4e97-8d10-dab4f64ccfac req-a0791d02-ad37-4578-ae65-bcbacc0459a3 service nova] Releasing lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.neutron 
[None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updating instance_info_cache with network_info: [{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Releasing lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Instance network_info: |[{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] Acquired lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.neutron [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d 
req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Refreshing network info cache for port 653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Start _get_guest_xml network_info=[{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:34:43 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:34:43 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
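The hardware.py entries that follow (like the matching ones earlier in this section) show nova deriving a guest CPU topology for the 1-vCPU m1.nano flavor: with no topology limits or preferences from flavor or image (0:0:0 everywhere, limits left at 65536), it enumerates the factorizations of the vCPU count and keeps sockets=1, cores=1, threads=1. A minimal sketch of that enumeration under those assumptions, using a hypothetical possible_topologies helper rather than nova's actual implementation:

    # Hypothetical helper, not nova's code: enumerate the sockets*cores*threads
    # factorizations that _get_possible_cpu_topologies reports in the log,
    # bounded by the 65536 default limits shown above.
    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append(VirtCPUTopology(sockets, cores, threads))
        return found

    # For the 1-vCPU m1.nano flavor there is a single candidate, matching
    # "Got 1 possible topologies ... VirtCPUTopology(cores=1,sockets=1,threads=1)".
    print(possible_topologies(1))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]

Since neither flavor nor image expresses a preferred topology here, the "Sorted desired topologies" step logged below reduces to returning that single candidate.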
Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 
17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-652405357',display_name='tempest-ServerActionsTestJSON-server-652405357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-652405357',id=10,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBKLOGr6Xl2ayt6JPa/BTov3dZz2x2DRGJJ0beWQ98scecwsWPA9PwlSpVMqk4AmV4xFimhLplkR3dpkkRnqF2vN+gOPfqvdsMSgfgxOtyYvO9m7kepkdN/F/4cbYypkVA==',key_name='tempest-keypair-1675299659',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6952e4623737462a8b8f31ada0786922',ramdisk_id='',reservation_id='r-dvlv99yl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1425889987',owner_user_name='tempest-ServerActionsTestJSON-1425889987-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8d573b61dc994a1fa6343b162ac67112',uuid=47d30b1a-fc08-4cad-8a2e-003b43251518,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converting VIF {"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.objects.instance [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lazy-loading 'pci_devices' on Instance uuid 47d30b1a-fc08-4cad-8a2e-003b43251518 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] End _get_guest_xml xml= Apr 17 17:34:43 user nova-compute[71628]: 47d30b1a-fc08-4cad-8a2e-003b43251518 Apr 17 17:34:43 user nova-compute[71628]: instance-0000000a Apr 17 17:34:43 user nova-compute[71628]: 131072 Apr 17 17:34:43 user nova-compute[71628]: 1 Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: tempest-ServerActionsTestJSON-server-652405357 Apr 17 17:34:43 user nova-compute[71628]: 2023-04-17 17:34:43 Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: 128 Apr 17 17:34:43 user nova-compute[71628]: 1 Apr 17 17:34:43 user nova-compute[71628]: 0 Apr 17 17:34:43 user nova-compute[71628]: 0 Apr 17 17:34:43 user nova-compute[71628]: 1 Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: tempest-ServerActionsTestJSON-1425889987-project-member Apr 17 17:34:43 user nova-compute[71628]: tempest-ServerActionsTestJSON-1425889987 Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 
17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: OpenStack Foundation Apr 17 17:34:43 user nova-compute[71628]: OpenStack Nova Apr 17 17:34:43 user nova-compute[71628]: 0.0.0 Apr 17 17:34:43 user nova-compute[71628]: 47d30b1a-fc08-4cad-8a2e-003b43251518 Apr 17 17:34:43 user nova-compute[71628]: 47d30b1a-fc08-4cad-8a2e-003b43251518 Apr 17 17:34:43 user nova-compute[71628]: Virtual Machine Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: hvm Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Nehalem Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: /dev/urandom Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: Apr 17 17:34:43 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-652405357',display_name='tempest-ServerActionsTestJSON-server-652405357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-652405357',id=10,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBKLOGr6Xl2ayt6JPa/BTov3dZz2x2DRGJJ0beWQ98scecwsWPA9PwlSpVMqk4AmV4xFimhLplkR3dpkkRnqF2vN+gOPfqvdsMSgfgxOtyYvO9m7kepkdN/F/4cbYypkVA==',key_name='tempest-keypair-1675299659',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6952e4623737462a8b8f31ada0786922',ramdisk_id='',reservation_id='r-dvlv99yl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1425889987',owner_user_name='tempest-ServerActionsTestJSON-1425889987-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:34:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8d573b61dc994a1fa6343b162ac67112',uuid=47d30b1a-fc08-4cad-8a2e-003b43251518,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 
17:34:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converting VIF {"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG os_vif [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tap653604b7-82, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap653604b7-82, col_values=(('external_ids', {'iface-id': '653604b7-8213-4fd3-a733-26a32725aae2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:a7:8c', 'vm-uuid': '47d30b1a-fc08-4cad-8a2e-003b43251518'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:43 user nova-compute[71628]: INFO os_vif [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:34:43 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] No VIF found with MAC fa:16:3e:02:a7:8c, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.network.neutron [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updated VIF entry in instance network info cache for port 653604b7-8213-4fd3-a733-26a32725aae2. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.network.neutron [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updating instance_info_cache with network_info: [{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-069c8bbf-be0c-4785-be69-6ce5613f7a9d req-fee8b33c-7254-49b0-965b-ad8caf338611 service nova] Releasing lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] 
[instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:44 user nova-compute[71628]: WARNING nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state building and task_state spawning. Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:44 user nova-compute[71628]: WARNING nova.compute.manager [req-7bbc5026-a513-4b82-b1cb-28b584771a08 req-87f8161c-d631-45bc-8850-0ea48eaf689c service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state building and task_state spawning. 
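The AddBridgeCommand / AddPortCommand / DbSetCommand transaction logged a few lines above is os-vif plugging the tap device into br-int and tagging the Interface row so OVN/Neutron can map it back to the port and instance. A rough sketch of the same operations through ovsdbapp follows; the values are copied from the log, while the connection setup and timeout are assumptions, not os-vif's exact wiring:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # OVSDB endpoint as seen in the reconnect/inactivity-probe lines later in this log
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # idempotent bridge/port creation, mirroring may_exist=True in the logged commands
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap653604b7-82', may_exist=True))
        # external_ids tie the Interface row to the Neutron port and the instance
        txn.add(api.db_set('Interface', 'tap653604b7-82',
                           ('external_ids', {
                               'iface-id': '653604b7-8213-4fd3-a733-26a32725aae2',
                               'iface-status': 'active',
                               'attached-mac': 'fa:16:3e:02:a7:8c',
                               'vm-uuid': '47d30b1a-fc08-4cad-8a2e-003b43251518'})))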
Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] No waiting events found dispatching network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:46 user nova-compute[71628]: WARNING 
nova.compute.manager [req-84d1fa0c-f104-450b-a3de-58f327d326a2 req-dd85c7b4-4451-44aa-9e96-70bfa8573ff9 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received unexpected event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 for instance with vm_state building and task_state spawning. Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] No waiting events found dispatching network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:46 user nova-compute[71628]: WARNING nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received unexpected event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 for instance with vm_state building and task_state spawning. 
Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] No waiting events found dispatching network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:46 user nova-compute[71628]: WARNING nova.compute.manager [req-d8386076-e065-4ab1-80d8-fd2f2403c12e req-c572a5c0-f585-41a6-814c-10e6488bbf50 service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received unexpected event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 for instance with vm_state building and task_state spawning. 
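The repeated "Received unexpected event network-vif-plugged-... for instance with vm_state building and task_state spawning" warnings above are events that arrive while no waiter is registered for them; during a normal spawn they are generally benign CI noise. A small, hypothetical helper for tallying them per instance from a saved copy of this journal (the file path is an assumption):

    import re
    from collections import Counter

    UNEXPECTED = re.compile(
        r'WARNING nova\.compute\.manager .* \[instance: ([0-9a-f-]{36})\] '
        r'Received unexpected event (network-vif-plugged-[0-9a-f-]{36})')

    def tally_unexpected_events(path='nova-compute.log'):
        counts = Counter()
        with open(path) as fh:
            for line in fh:
                match = UNEXPECTED.search(line)
                if match:
                    # key by (instance uuid, event name)
                    counts[match.group(1), match.group(2)] += 1
        return counts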
Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] VM Resumed (Lifecycle Event) Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Instance spawned successfully. 
Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None 
req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] VM Started (Lifecycle Event) Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Took 10.38 seconds to spawn the instance on the hypervisor. Apr 17 17:34:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:47 user nova-compute[71628]: INFO nova.compute.manager [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Took 11.17 seconds to build instance. 
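For reference, the image-property defaults the driver registered for instance 337c511a-a2ed-484e-ab48-31618fa2755e in the "Found default for ..." lines above, collected into one mapping (values are exactly as logged; None means no default was recorded):

    # hw_* image property defaults registered for the guest, as logged above
    REGISTERED_DEFAULTS = {
        'hw_cdrom_bus': 'ide',
        'hw_disk_bus': 'virtio',
        'hw_input_bus': None,
        'hw_pointer_model': None,
        'hw_video_model': 'virtio',
        'hw_vif_model': 'virtio',
    }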
Apr 17 17:34:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-33a9f7fc-32f3-411a-ab86-35cb631eb8b0 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.275s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] No waiting events found dispatching network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:34:48 user nova-compute[71628]: WARNING nova.compute.manager [req-9208df49-5220-4acd-9115-805cf204dacf req-0155ca80-49a4-4d40-b502-f36ca6295079 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received unexpected event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 for instance with vm_state building and task_state spawning. 
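The 'Acquiring lock "<uuid>-events" ... acquired ... released' triplets throughout this log come from oslo.concurrency's named locks: each instance gets its own "-events" lock, so popping events for one instance never blocks another. A minimal sketch of the pattern (not Nova's actual implementation; the registry and function names here are illustrative):

    from oslo_concurrency import lockutils

    # per-instance event registry, keyed by instance uuid then event name
    _events = {}

    def pop_instance_event(instance_uuid, event_name):
        # named lock mirrors the "<uuid>-events" names seen in the log;
        # the acquire/release timing is what the DEBUG lines above report
        with lockutils.lock('%s-events' % instance_uuid):
            # returning None corresponds to "No waiting events found dispatching ..."
            return _events.get(instance_uuid, {}).pop(event_name, None)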
Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] VM Resumed (Lifecycle Event) Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Instance spawned successfully. Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] VM Started (Lifecycle Event) Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) 
handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] VM Resumed (Lifecycle Event) Apr 17 17:34:48 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Instance spawned successfully. Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] During sync_power_state the instance has a pending task (spawning). Skip. 
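The numeric states in the "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" lines follow Nova's power-state codes. A small lookup for reading these logs, limited to the two values that actually appear here to avoid guessing at the rest:

    # numeric power states as used in the sync_power_state lines above
    POWER_STATE = {
        0: 'NOSTATE',   # the DB has no recorded power state yet (instance still building)
        1: 'RUNNING',   # libvirt reports the freshly spawned guest as running
    }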
Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] VM Started (Lifecycle Event) Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Took 8.33 seconds to spawn the instance on the hypervisor. 
Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Took 7.70 seconds to spawn the instance on the hypervisor. Apr 17 17:34:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Took 9.36 seconds to build instance. Apr 17 17:34:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-2913398a-cb59-4b3f-b6fd-52bdcdedba61 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.466s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:48 user nova-compute[71628]: INFO nova.compute.manager [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Took 8.78 seconds to build instance. 
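Each of the three overlapping builds above reports two figures: "Took N seconds to spawn the instance on the hypervisor" (libvirt guest creation) and the slightly larger "Took N seconds to build instance" (the whole locked build-and-run span, which the "held N s" lock lines roughly match). A small, hypothetical helper to pull those figures out of a saved copy of this journal:

    import re

    TIMING = re.compile(
        r'\[instance: ([0-9a-f-]{36})\] Took (\d+\.\d+) seconds to '
        r'(spawn the instance on the hypervisor|build instance)')

    def build_timings(path='nova-compute.log'):
        timings = {}
        with open(path) as fh:
            for line in fh:
                m = TIMING.search(line)
                if m:
                    uuid, seconds, what = m.groups()
                    key = 'spawn' if what.startswith('spawn') else 'build'
                    timings.setdefault(uuid, {})[key] = float(seconds)
        return timings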
Apr 17 17:34:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8d6747fe-3d18-4879-829a-27e81dfe8d80 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.916s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:34:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:34:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:35:23 user nova-compute[71628]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:35:23 user nova-compute[71628]: INFO nova.compute.claims [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Claim successful on node user Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Start building networks asynchronously for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:35:24 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.policy [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '33f713b19cdf41bc9d56ee7cea3722ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5413283bcdd4120a73a64d76459853a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:35:24 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Creating image(s) Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "/opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "/opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "/opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk 1073741824" returned: 0 in 0.047s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s 
{{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Successfully created port: 5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Cannot resize image /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.objects.instance [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'migration_context' on Instance uuid a3a06799-56ce-4121-93d7-e4f474afb487 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Ensure instance console log exists: /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:25 user nova-compute[71628]: INFO nova.compute.manager [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Terminating instance Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Successfully updated port: 5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquired lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-changed-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Refreshing instance network info cache due to event network-changed-5e63b915-b438-4230-9a55-9c4791efa048. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] Acquiring lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:35:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Instance destroyed successfully. 
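The Acquiring/acquired/released lock triplets that recur throughout these entries ("compute_resources", "refresh_cache-<uuid>", "vgpu_resources", the per-instance terminate locks) are emitted at DEBUG level by oslo.concurrency's lockutils helpers, not by Nova's own code. A minimal stand-alone sketch of the two forms involved, assuming oslo.concurrency is installed; the function names below are invented for illustration and are not Nova source:

    import logging

    from oslo_concurrency import lockutils

    # lockutils logs its acquire/release messages at DEBUG via stdlib logging.
    logging.basicConfig(level=logging.DEBUG)


    @lockutils.synchronized('compute_resources')
    def audit_resources():
        # While this body runs, the in-process "compute_resources" semaphore is
        # held; the decorator's wrapper emits the Lock "..." acquired by / 
        # "released" by lines with the waited/held timings seen in the log.
        pass


    def refresh_network_cache(instance_uuid):
        # The plain context-manager form produces the
        # Acquiring lock / Acquired lock / Releasing lock "refresh_cache-<uuid>" lines.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass


    audit_resources()
    refresh_network_cache('a3a06799-56ce-4121-93d7-e4f474afb487')

Both forms use an in-process semaphore by default (no external file lock), which matches the sub-millisecond hold times reported in the entries above.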
Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.objects.instance [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lazy-loading 'resources' on Instance uuid ddeef235-f0ed-411b-8bf5-9a880394bb36 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1484446540',display_name='tempest-DeleteServersTestJSON-server-1484446540',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1484446540',id=2,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:33:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='618ff48e86e344939d81482da314300e',ramdisk_id='',reservation_id='r-vh6hir9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1154435592',owner_user_name='tempest-DeleteServersTestJSON-1154435592-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:33:52Z,user_data=None,user_id='cbe09b575424462398089e0895c86828',uuid=ddeef235-f0ed-411b-8bf5-9a880394bb36,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converting VIF {"id": "8ab8313d-a088-414c-9d46-1d3385707c18", "address": "fa:16:3e:d0:98:cb", "network": {"id": "fdb5560b-6924-411b-86db-443bcb6ef7f2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-282302775-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "618ff48e86e344939d81482da314300e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ab8313d-a0", "ovs_interfaceid": "8ab8313d-a088-414c-9d46-1d3385707c18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG os_vif [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ab8313d-a0, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:35:26 user 
nova-compute[71628]: INFO os_vif [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:98:cb,bridge_name='br-int',has_traffic_filtering=True,id=8ab8313d-a088-414c-9d46-1d3385707c18,network=Network(fdb5560b-6924-411b-86db-443bcb6ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ab8313d-a0') Apr 17 17:35:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Deleting instance files /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36_del Apr 17 17:35:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Deletion of /opt/stack/data/nova/instances/ddeef235-f0ed-411b-8bf5-9a880394bb36_del complete Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71628) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 17 17:35:26 user nova-compute[71628]: INFO nova.virt.libvirt.host [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] UEFI support detected Apr 17 17:35:26 user nova-compute[71628]: INFO nova.compute.manager [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Took 0.94 seconds to destroy the instance on the hypervisor. Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.neutron [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Updating instance_info_cache with network_info: [{"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Releasing lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Instance network_info: |[{"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] Acquired lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.neutron [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Refreshing network info cache for port 5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Start _get_guest_xml network_info=[{"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 
'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:35:26 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:35:26 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 
tempest-ServersNegativeTestJSON-1842710030-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1441565732',display_name='tempest-ServersNegativeTestJSON-server-1441565732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1441565732',id=11,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-r0yxbsl1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:35:25Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=a3a06799-56ce-4121-93d7-e4f474afb487,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": 
"19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.objects.instance [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'pci_devices' on Instance uuid a3a06799-56ce-4121-93d7-e4f474afb487 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] End _get_guest_xml xml= Apr 17 17:35:26 user nova-compute[71628]: a3a06799-56ce-4121-93d7-e4f474afb487 Apr 17 17:35:26 user nova-compute[71628]: instance-0000000b Apr 17 17:35:26 user nova-compute[71628]: 131072 Apr 17 17:35:26 user nova-compute[71628]: 1 Apr 17 17:35:26 user nova-compute[71628]: Apr 17 17:35:26 user nova-compute[71628]: Apr 17 
17:35:26 user nova-compute[71628]: [guest domain XML body elided: the XML markup was stripped during log extraction; the surviving element text on these entries reads, in order: tempest-ServersNegativeTestJSON-server-1441565732, 2023-04-17 17:35:26, 128, 1, 0, 0, 1, tempest-ServersNegativeTestJSON-1842710030-project-member, tempest-ServersNegativeTestJSON-1842710030, OpenStack Foundation, OpenStack Nova, 0.0.0, a3a06799-56ce-4121-93d7-e4f474afb487, a3a06799-56ce-4121-93d7-e4f474afb487, Virtual Machine, hvm, Nehalem, /dev/urandom] Apr 17 17:35:26 user
nova-compute[71628]: Apr 17 17:35:26 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1441565732',display_name='tempest-ServersNegativeTestJSON-server-1441565732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1441565732',id=11,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-r0yxbsl1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:35:25Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=a3a06799-56ce-4121-93d7-e4f474afb487,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG os_vif [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e63b915-b4, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5e63b915-b4, col_values=(('external_ids', {'iface-id': '5e63b915-b438-4230-9a55-9c4791efa048', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:2c:43', 'vm-uuid': 'a3a06799-56ce-4121-93d7-e4f474afb487'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:35:26 user nova-compute[71628]: INFO os_vif [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] No VIF found with MAC fa:16:3e:ab:2c:43, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG 
nova.network.neutron [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Updated VIF entry in instance network info cache for port 5e63b915-b438-4230-9a55-9c4791efa048. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.network.neutron [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Updating instance_info_cache with network_info: [{"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9d5abda7-dc7d-428e-94ba-dcffde3e8039 req-fc7d4ed7-bfc3-4c35-b729-7eaa04758266 service nova] Releasing lock "refresh_cache-a3a06799-56ce-4121-93d7-e4f474afb487" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Error from libvirt while getting description of instance-00000002: [Error Code 42] Domain not found: no domain with matching uuid 'ddeef235-f0ed-411b-8bf5-9a880394bb36' (instance-00000002): libvirt.libvirtError: Domain not found: no domain with matching uuid 'ddeef235-f0ed-411b-8bf5-9a880394bb36' (instance-00000002) Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Took 0.98 seconds to deallocate network for instance. Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-a4b5e148-38b6-433c-a1f8-c5c457b9071a req-82d05612-93c2-4ca6-89b4-adeb08b4c48f service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-vif-deleted-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.compute.manager 
[req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-vif-unplugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] No waiting events found dispatching network-vif-unplugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:27 user nova-compute[71628]: WARNING nova.compute.manager [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received unexpected event network-vif-unplugged-8ab8313d-a088-414c-9d46-1d3385707c18 for instance with vm_state deleted and task_state None. 
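Side note on the VIF plug earlier in this stretch: the AddBridgeCommand / AddPortCommand / DbSetCommand transaction logged by ovsdbapp is how os-vif attaches tap5e63b915-b4 to br-int and stamps the Interface row with the Neutron port id, MAC and instance UUID in external_ids. The snippet below is only a rough command-line equivalent sketched with ovs-vsctl through subprocess; os-vif itself drives OVSDB natively via ovsdbapp as the log shows, so treat this as an illustration, not Nova's code path.

# Illustrative only: a rough ovs-vsctl equivalent of the AddPortCommand +
# DbSetCommand transaction in the log above. os-vif talks to OVSDB through
# ovsdbapp rather than shelling out to ovs-vsctl.
import subprocess

def plug_ovs_port(bridge, dev, iface_id, mac, vm_uuid):
    """Add a tap device to the bridge and tag it with the Neutron port metadata."""
    cmd = [
        "ovs-vsctl",
        "--may-exist", "add-port", bridge, dev,
        "--", "set", "Interface", dev,
        f"external_ids:iface-id={iface_id}",
        "external_ids:iface-status=active",
        f"external_ids:attached-mac={mac}",
        f"external_ids:vm-uuid={vm_uuid}",
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Values taken from the log entries above.
    plug_ovs_port(
        bridge="br-int",
        dev="tap5e63b915-b4",
        iface_id="5e63b915-b438-4230-9a55-9c4791efa048",
        mac="fa:16:3e:ab:2c:43",
        vm_uuid="a3a06799-56ce-4121-93d7-e4f474afb487",
    )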
Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Acquiring lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] No waiting events found dispatching network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:27 user nova-compute[71628]: WARNING nova.compute.manager [req-5e3743b1-a6ba-4000-af83-ab5e2717fb12 req-5290d607-d9bf-46b1-ae8f-6a80709026f1 service nova] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Received unexpected event network-vif-plugged-8ab8313d-a088-414c-9d46-1d3385707c18 for instance with vm_state deleted and task_state None. 
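The "Received event ..." / "No waiting events found dispatching ..." / "Received unexpected event ..." sequences above are Nova's external-event handshake with Neutron: the compute manager registers a named per-instance event before an operation and pops it when the notification arrives; when nothing is registered (here the instance is already in vm_state deleted), the event is logged as unexpected and dropped. The following is a heavily simplified latch along those lines, written for illustration only and not Nova's implementation.

# Minimal sketch of a per-instance event latch, loosely modelled on the
# pop_instance_event / "No waiting events found" pattern in the log above.
# Names and behaviour are simplified; this is not Nova's code.
import threading
from collections import defaultdict

class InstanceEventLatch:
    def __init__(self):
        self._lock = threading.Lock()
        # {instance_uuid: {event_name: threading.Event}}
        self._waiting = defaultdict(dict)

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before starting the operation."""
        ev = threading.Event()
        with self._lock:
            self._waiting[instance_uuid][event_name] = ev
        return ev

    def pop(self, instance_uuid, event_name):
        """Deliver an incoming event; report whether anyone was waiting for it."""
        with self._lock:
            ev = self._waiting[instance_uuid].pop(event_name, None)
        if ev is None:
            print(f"unexpected event {event_name} for {instance_uuid}")
            return False
        ev.set()
        return True

latch = InstanceEventLatch()
waiter = latch.prepare("a3a06799-56ce-4121-93d7-e4f474afb487",
                       "network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048")
latch.pop("a3a06799-56ce-4121-93d7-e4f474afb487",
          "network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048")
waiter.wait(timeout=1)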
Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG nova.compute.manager [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-changed-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG nova.compute.manager [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Refreshing instance network info cache due to event network-changed-f9b1ac87-92c9-4ca2-9721-54337c3c8811. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] Acquiring lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] Acquired lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Refreshing network info cache for port f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.630s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:28 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Deleted allocations for instance ddeef235-f0ed-411b-8bf5-9a880394bb36 Apr 17 17:35:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0610bf8b-3607-47d2-8bca-f04acbb91982 tempest-DeleteServersTestJSON-1154435592 tempest-DeleteServersTestJSON-1154435592-project-member] Lock "ddeef235-f0ed-411b-8bf5-9a880394bb36" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.967s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 
17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG nova.network.neutron [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updated VIF entry in instance network info cache for port f9b1ac87-92c9-4ca2-9721-54337c3c8811. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG nova.network.neutron [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updating instance_info_cache with network_info: [{"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.43", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-47e02686-f43e-4db3-b37d-2bdb2748777c req-ddbed735-a4cb-4a96-a714-a8de24c20379 service nova] Releasing lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] VM Resumed (Lifecycle Event) Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-changed-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Refreshing instance network info cache due to event network-changed-725be64e-c050-49d6-a87d-5cb5b04e86c0. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] Acquiring lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] Acquired lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.network.neutron [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Refreshing network info cache for port 725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] No waiting events found dispatching network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:30 user nova-compute[71628]: WARNING nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received unexpected event network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 for instance with vm_state building and task_state spawning. 
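The recurring "qemu-img info ... --force-share --output=json" commands interleaved through this stretch are the periodic disk-usage audit sizing each instance's image, run under an oslo_concurrency prlimit wrapper that caps address space and CPU time. A plain-subprocess stand-in for the same measurement, without the prlimit wrapper or the oslo helpers, might look like the sketch below.

# Sketch of the disk-sizing call seen throughout the audit above: run
# "qemu-img info --force-share --output=json" and read the reported sizes.
# Nova goes through oslo_concurrency with a prlimit wrapper; this is a
# simplified stand-in for illustration only.
import json
import subprocess

def disk_sizes(path):
    out = subprocess.run(
        ["qemu-img", "info", path, "--force-share", "--output=json"],
        check=True, capture_output=True, text=True,
    ).stdout
    info = json.loads(out)
    # qemu-img reports sizes in bytes.
    return info["virtual-size"], info.get("actual-size")

if __name__ == "__main__":
    # Path taken from the log; adjust for your own environment.
    print(disk_sizes(
        "/opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545/disk"))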
Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] No waiting events found dispatching network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:30 user nova-compute[71628]: WARNING nova.compute.manager [req-38f8286c-4668-4140-8bff-8a520bf51d66 req-f6638781-a2ca-4ffb-8086-1c5e197977bd service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received unexpected event network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 for instance with vm_state building and task_state spawning. Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:30 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:35:30 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
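The repeated warning just above, "This host appears to have multiple sockets per NUMA node", is raised while the libvirt driver inspects host topology during the same audit pass. A minimal way to check that condition yourself, assuming the libvirt-python bindings and a local libvirt daemon (this is not Nova's own code path), is sketched here.

# Rough sketch: read the host topology from libvirt capabilities and count
# distinct socket ids per NUMA cell, the condition behind the warning above.
# Assumes libvirt-python and a reachable local libvirt daemon.
import xml.etree.ElementTree as ET
import libvirt

conn = libvirt.open("qemu:///system")
caps = ET.fromstring(conn.getCapabilities())

for cell in caps.findall(".//topology/cells/cell"):
    sockets = {cpu.get("socket_id") for cpu in cell.findall("./cpus/cpu")}
    print(f"NUMA cell {cell.get('id')}: {len(sockets)} socket(s)")
conn.close()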
Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=7801MB free_disk=26.410659790039062GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Instance spawned successfully. Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Terminating instance Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None 
req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] VM Started (Lifecycle Event) Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] During sync_power_state the instance has a pending task (spawning). Skip. 
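The run of "Found default for hw_* of ..." entries above shows the libvirt driver registering the bus/model defaults it picked for instance a3a06799-... so they stay stable for the rest of the instance's life; the same values surface later in this log as image_hw_* keys inside an instance's system_metadata. A minimal Python sketch of that mapping (illustration only, not Nova's code; the helper name and dict shapes are invented here):

# Illustrative sketch only -- not Nova's implementation. It mirrors the
# "Found default for hw_* of ..." DEBUG lines above: each detected default
# is recorded under an image_hw_* key, matching the system_metadata dump
# seen later in this log (image_hw_cdrom_bus='ide', image_hw_disk_bus='virtio', ...).

REGISTERED_PROPS = (
    "hw_cdrom_bus", "hw_disk_bus", "hw_input_bus",
    "hw_pointer_model", "hw_video_model", "hw_vif_model",
)

def register_defaults(detected: dict) -> dict:
    """Map detected per-instance defaults to image_hw_* metadata keys."""
    sysmeta = {}
    for prop in REGISTERED_PROPS:
        value = detected.get(prop)      # None means "no explicit default found"
        sysmeta[f"image_{prop}"] = value
    return sysmeta

# Values taken from the DEBUG lines above for instance a3a06799-...:
print(register_defaults({
    "hw_cdrom_bus": "ide",
    "hw_disk_bus": "virtio",
    "hw_video_model": "virtio",
    "hw_vif_model": "virtio",
}))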
Apr 17 17:35:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Took 6.38 seconds to spawn the instance on the hypervisor. Apr 17 17:35:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:30 user nova-compute[71628]: INFO nova.compute.manager [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] instance snapshotting Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance e4d62df0-41e5-4351-a4de-5c0d88a9ab5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance f2ab32f4-ce85-49d6-bf7d-a9219789a545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 0711a965-58ba-4238-aa35-b7f3d762c97d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance f8891b6c-e3ef-450f-883b-dbfbdb74695b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 71bc60a8-8430-4110-aa0a-0141b6cf2277 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance a3a06799-56ce-4121-93d7-e4f474afb487 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 10 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1792MB phys_disk=40GB used_disk=10GB total_vcpus=12 used_vcpus=10 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:35:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Beginning live snapshot process Apr 17 17:35:31 user nova-compute[71628]: INFO nova.compute.manager [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Took 7.21 seconds to build instance. Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cbb885aa-e1d1-43fd-bd59-57836b793672 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.313s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json -f qcow2" returned: 0 in 0.142s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 
tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.691s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json -f qcow2" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Instance destroyed successfully. 
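The provider inventory and "Final resource view" entries above carry enough data to recompute what Placement can still hand out on this host. A small Python sketch, assuming the usual Placement capacity rule of (total - reserved) * allocation_ratio and plugging in the values logged for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058:

# Rough sketch: remaining capacity per resource class, from the inventory
# logged above. The "used" figures come from the "Final resource view" line
# (used_vcpus=10, used_ram=1792MB, used_disk=10GB).

inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}
used = {"VCPU": 10, "MEMORY_MB": 1792, "DISK_GB": 10}

for rc, inv in inventory.items():
    capacity = int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    print(f"{rc}: capacity={capacity} used={used[rc]} free={capacity - used[rc]}")
# VCPU: capacity=48 used=10 free=38
# MEMORY_MB: capacity=15511 used=1792 free=13719
# DISK_GB: capacity=40 used=10 free=30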
Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.objects.instance [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'resources' on Instance uuid f2ab32f4-ce85-49d6-bf7d-a9219789a545 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1291197301',display_name='tempest-AttachVolumeNegativeTest-server-1291197301',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1291197301',id=4,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETy3qN91fAUY59vYTnM+st5lBmsgrYGghrdiDmNZuBShFM/gMR4GlDzNicctF6dMRMbYWda4SIaaWAx7hCS/iUHMA0EEUO+HKkLWyI2QTVq0VaormimlIiLwEnxEYg/qQ==',key_name='tempest-keypair-328622868',keypairs=,launch_index=0,launched_at=2023-04-17T17:33:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-uj0nocan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:33:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=f2ab32f4-ce85-49d6-bf7d-a9219789a545,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG os_vif [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap725be64e-c0, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.network.neutron [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updated VIF entry in instance network info cache for port 725be64e-c050-49d6-a87d-5cb5b04e86c0. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.network.neutron [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updating instance_info_cache with network_info: [{"id": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "address": "fa:16:3e:d7:2e:09", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap725be64e-c0", "ovs_interfaceid": "725be64e-c050-49d6-a87d-5cb5b04e86c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:35:31 user nova-compute[71628]: INFO os_vif [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2e:09,bridge_name='br-int',has_traffic_filtering=True,id=725be64e-c050-49d6-a87d-5cb5b04e86c0,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap725be64e-c0') Apr 17 17:35:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Deleting instance files /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545_del Apr 17 17:35:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 
f2ab32f4-ce85-49d6-bf7d-a9219789a545] Deletion of /opt/stack/data/nova/instances/f2ab32f4-ce85-49d6-bf7d-a9219789a545_del complete Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-34a935e9-f5d2-4b43-8bcc-556864f32b4d req-7098fd0e-0666-4c41-8164-7b75887dd7d1 service nova] Releasing lock "refresh_cache-f2ab32f4-ce85-49d6-bf7d-a9219789a545" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e.delta 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e.delta 1073741824" returned: 0 in 0.102s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Quiescing instance not available: QEMU guest agent is not enabled. Apr 17 17:35:31 user nova-compute[71628]: INFO nova.compute.manager [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Took 1.01 seconds to destroy the instance on the hypervisor. Apr 17 17:35:31 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:35:31 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-unplugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] No waiting events found dispatching network-vif-unplugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-unplugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Acquiring lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] No waiting events found dispatching network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:32 user nova-compute[71628]: WARNING nova.compute.manager [req-a75c8505-8516-46ba-b7c4-e21927840bc7 req-f9f891b9-e93f-4501-aefe-543abca16376 service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received unexpected event network-vif-plugged-725be64e-c050-49d6-a87d-5cb5b04e86c0 for instance with vm_state active and task_state deleting. 
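The 'Acquiring lock ... by ...', 'acquired ... :: waited', and '"released" ... :: held' triplets that bracket the event handling above are emitted by the wrapper ("inner") that oslo.concurrency's synchronized decorator places around a callable. A minimal sketch of that pattern (placeholder lock name; with DEBUG logging enabled it should emit similar messages):

# Minimal sketch of the locking pattern behind the
# 'Acquiring lock "<uuid>-events" by "..."' DEBUG lines above.
# The lock name below is a placeholder; Nova derives names such as
# "f2ab32f4-...-events" per instance.

import logging
from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)

@lockutils.synchronized("example-instance-uuid-events")
def _pop_event():
    # Critical section: one caller per lock name at a time.
    return "popped"

_pop_event()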
Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:32 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Took 0.77 seconds to deallocate network for instance. Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-354c506e-fb1d-4422-8394-76993a1fd420 req-14ed3e75-fb45-4d94-9f54-5a1379f4b25a service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Received event network-vif-deleted-725be64e-c050-49d6-a87d-5cb5b04e86c0 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:32 user nova-compute[71628]: INFO nova.compute.manager [req-354c506e-fb1d-4422-8394-76993a1fd420 req-14ed3e75-fb45-4d94-9f54-5a1379f4b25a service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Neutron deleted interface 725be64e-c050-49d6-a87d-5cb5b04e86c0; detaching it from the instance and deleting it from the info cache Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.network.neutron [req-354c506e-fb1d-4422-8394-76993a1fd420 req-14ed3e75-fb45-4d94-9f54-5a1379f4b25a service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Forcefully refreshing 
network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-354c506e-fb1d-4422-8394-76993a1fd420 req-14ed3e75-fb45-4d94-9f54-5a1379f4b25a service nova] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Detach interface failed, port_id=725be64e-c050-49d6-a87d-5cb5b04e86c0, reason: Instance f2ab32f4-ce85-49d6-bf7d-a9219789a545 could not be found. {{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:35:32 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.365s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:35:33 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 
tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 17 17:35:33 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Deleted allocations for instance f2ab32f4-ce85-49d6-bf7d-a9219789a545 Apr 17 17:35:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updating instance_info_cache with network_info: [{"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.43", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 
tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e.delta /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a0648d7a-e012-4223-86fa-6d795f533325 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "f2ab32f4-ce85-49d6-bf7d-a9219789a545" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.416s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e.delta /opt/stack/data/nova/instances/snapshots/tmpu6g_9f_4/bde0519f40af496a9227c606af92590e" returned: 0 in 0.292s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:33 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Snapshot extracted, beginning image upload Apr 17 17:35:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Snapshot image upload complete Apr 17 17:35:35 user nova-compute[71628]: INFO nova.compute.manager [None req-c8aba0b5-7517-45b7-83df-d9758117cc41 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Took 4.75 seconds to snapshot the instance on the hypervisor. 
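The live snapshot sequence above boils down to two qemu-img invocations before the image upload: create a qcow2 delta backed by the cached base image, then flatten the delta into a standalone qcow2. A sketch reproducing those two steps with subprocess (paths and size are placeholders; the flags are the ones logged above):

# Sketch of the two qemu-img steps visible in the snapshot sequence above:
# (1) create a qcow2 delta backed by the cached base image,
# (2) flatten the delta into a standalone qcow2 ready for upload.
# Paths are placeholders; 1073741824 (1 GiB) matches the logged command.

import subprocess

base = "/opt/stack/data/nova/instances/_base/<base-image-hash>"  # placeholder
delta = "/tmp/snapshot.delta"
output = "/tmp/snapshot.qcow2"
size_bytes = "1073741824"

subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={base},backing_fmt=raw", delta, size_bytes],
    check=True)

subprocess.run(
    ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
     delta, output],
    check=True)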
Apr 17 17:35:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:41 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Apr 17 17:35:41 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] VM Stopped (Lifecycle Event)
Apr 17 17:35:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b3c8f9a1-793a-4ac1-9a4d-f8398c34ce81 None None] [instance: ddeef235-f0ed-411b-8bf5-9a880394bb36] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 17 17:35:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:46 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}}
Apr 17 17:35:46 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] VM Stopped (Lifecycle Event)
Apr 17 17:35:46 user nova-compute[71628]: DEBUG nova.compute.manager [None req-4ac2723d-5b97-43d4-96f5-81ea3a16a952 None None] [instance: f2ab32f4-ce85-49d6-bf7d-a9219789a545] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
Apr 17 17:35:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 17:35:49 user nova-compute[71628]: DEBUG nova.compute.manager [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-changed-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}}
Apr 17 17:35:49 user nova-compute[71628]: DEBUG nova.compute.manager [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Refreshing instance network info cache due to event network-changed-358fa886-02f3-433a-a1af-d4d2bff8be35.
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:35:49 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] Acquiring lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:49 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] Acquired lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:49 user nova-compute[71628]: DEBUG nova.network.neutron [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Refreshing network info cache for port 358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:35:50 user nova-compute[71628]: DEBUG nova.network.neutron [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updated VIF entry in instance network info cache for port 358fa886-02f3-433a-a1af-d4d2bff8be35. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:35:50 user nova-compute[71628]: DEBUG nova.network.neutron [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updating instance_info_cache with network_info: [{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-38dc2b16-96e7-4b40-b082-78e600cc7a53 req-11d28b5a-a3a4-4dd3-8010-0ac31bd13565 service nova] Releasing lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:51 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:35:52 user nova-compute[71628]: INFO nova.compute.claims [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Claim successful on node user Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:35:52 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:35:52 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.policy [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb9f6038c3d94f4b8176f52308996012', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd1e8586846543c88d468bb6b705d4a6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:35:53 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Creating image(s) Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "/opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "/opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock 
"/opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:53 user nova-compute[71628]: 
DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk 1073741824" returned: 0 in 0.047s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.188s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.153s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Checking if we can resize image /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Successfully created port: 5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Cannot resize image /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'migration_context' on Instance uuid 35fe8580-9a68-44c2-8b86-9c28144bd2f1 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Ensure instance console log exists: /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Successfully updated port: 5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 
tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquired lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.compute.manager [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-changed-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.compute.manager [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Refreshing instance network info cache due to event network-changed-5bf971a6-cc65-49ba-a2d2-4bb6ac641771. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] Acquiring lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Updating instance_info_cache with network_info: [{"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Releasing lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Instance network_info: |[{"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] Acquired lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.neutron [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Refreshing network info cache for port 5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Start _get_guest_xml network_info=[{"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:35:54 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:35:54 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-996763478',display_name='tempest-VolumesAdminNegativeTest-server-996763478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-996763478',id=12,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-4t0vnoog',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:35:53Z,user_data=None,user_id='cb9f6038c3d94f4b8176f52308996012',uuid=35fe8580-9a68-44c2-8b86-9c28144bd2f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": 
"f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'pci_devices' on Instance uuid 35fe8580-9a68-44c2-8b86-9c28144bd2f1 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] End _get_guest_xml xml= Apr 17 17:35:54 user nova-compute[71628]: 35fe8580-9a68-44c2-8b86-9c28144bd2f1 Apr 17 17:35:54 user nova-compute[71628]: instance-0000000c Apr 17 17:35:54 user nova-compute[71628]: 131072 Apr 17 17:35:54 user nova-compute[71628]: 1 Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-server-996763478 Apr 17 17:35:54 user nova-compute[71628]: 2023-04-17 17:35:54 Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: 128 Apr 17 17:35:54 user nova-compute[71628]: 1 Apr 17 17:35:54 user nova-compute[71628]: 0 Apr 17 17:35:54 user nova-compute[71628]: 0 Apr 17 17:35:54 user nova-compute[71628]: 1 Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-1858597906-project-member Apr 17 17:35:54 user nova-compute[71628]: tempest-VolumesAdminNegativeTest-1858597906 Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 
17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: OpenStack Foundation Apr 17 17:35:54 user nova-compute[71628]: OpenStack Nova Apr 17 17:35:54 user nova-compute[71628]: 0.0.0 Apr 17 17:35:54 user nova-compute[71628]: 35fe8580-9a68-44c2-8b86-9c28144bd2f1 Apr 17 17:35:54 user nova-compute[71628]: 35fe8580-9a68-44c2-8b86-9c28144bd2f1 Apr 17 17:35:54 user nova-compute[71628]: Virtual Machine Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: hvm Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Nehalem Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: /dev/urandom Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: Apr 17 17:35:54 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-996763478',display_name='tempest-VolumesAdminNegativeTest-server-996763478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-996763478',id=12,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-4t0vnoog',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:35:53Z,user_data=None,user_id='cb9f6038c3d94f4b8176f52308996012',uuid=35fe8580-9a68-44c2-8b86-9c28144bd2f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": 
"f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG os_vif [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bf971a6-cc, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5bf971a6-cc, col_values=(('external_ids', {'iface-id': '5bf971a6-cc65-49ba-a2d2-4bb6ac641771', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:11:e4', 'vm-uuid': '35fe8580-9a68-44c2-8b86-9c28144bd2f1'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:54 user nova-compute[71628]: INFO os_vif [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:35:54 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] No VIF found with MAC fa:16:3e:df:11:e4, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:35:55 user nova-compute[71628]: DEBUG nova.network.neutron [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Updated VIF entry in instance network info cache for port 5bf971a6-cc65-49ba-a2d2-4bb6ac641771. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:35:55 user nova-compute[71628]: DEBUG nova.network.neutron [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Updating instance_info_cache with network_info: [{"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:35:55 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-cd26c692-8d5b-4f5b-9d1b-77989de8b97c req-0e0af14f-a781-4a44-b7ea-4c5f3edcca50 service nova] Releasing lock "refresh_cache-35fe8580-9a68-44c2-8b86-9c28144bd2f1" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG nova.compute.manager [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:56 user nova-compute[71628]: DEBUG nova.compute.manager [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] No waiting events found dispatching network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:56 user nova-compute[71628]: WARNING nova.compute.manager [req-925bb752-7651-4f12-99eb-1d280a29a29e req-e5c14795-0731-418f-9014-cd2c044ee6a2 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received unexpected event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 for instance with vm_state building and task_state spawning. Apr 17 17:35:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] VM Resumed (Lifecycle Event) Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:35:58 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Instance spawned successfully. 
Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 
35fe8580-9a68-44c2-8b86-9c28144bd2f1] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] VM Started (Lifecycle Event) Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Took 5.23 seconds to spawn the instance on the hypervisor. 
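The "Found default for hw_*" entries above show the libvirt driver recording bus/model defaults (hw_cdrom_bus=ide, hw_disk_bus=virtio, hw_video_model=virtio, ...) for properties the image did not set. As an illustrative aside, such defaults can be overridden per image by setting hw_* Glance image properties; the sketch below uses the OpenStack CLI via subprocess, with the SCSI values and image UUID that appear elsewhere in this log, and is a hedged example rather than a required step.

```python
# Illustrative only: override the hw_* defaults recorded above by setting
# Glance image properties (values taken from the AttachSCSIVolume image
# referenced later in this log).
import subprocess

subprocess.run(
    [
        "openstack", "image", "set",
        "--property", "hw_disk_bus=scsi",
        "--property", "hw_scsi_model=virtio-scsi",
        "3e8f092e-58b0-4283-9790-92d661c52d35",  # image UUID seen in this log
    ],
    check=True,
)
```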
Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:58 user nova-compute[71628]: DEBUG nova.compute.manager [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] No waiting events found dispatching network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:35:58 user nova-compute[71628]: WARNING nova.compute.manager [req-f224ed26-30b6-4853-b1c7-d3afdcd97a89 req-a6f75a83-cb31-40f0-8028-fb20fe201b19 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received unexpected event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 for instance with vm_state building and task_state spawning. Apr 17 17:35:58 user nova-compute[71628]: INFO nova.compute.manager [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Took 6.09 seconds to build instance. 
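The "Took 5.23 seconds to spawn the instance on the hypervisor" and "Took 6.09 seconds to build instance" lines above are the useful per-instance timings in this stream. A small self-contained helper (illustrative, not part of Nova) that extracts them from a log file:

```python
# Pull "Took N.NN seconds to spawn/build" timings, keyed by instance UUID,
# out of a nova-compute log such as this one.
import re
import sys

PATTERN = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
    r"Took (?P<seconds>\d+\.\d+) seconds to "
    r"(?P<what>spawn the instance on the hypervisor|build instance)"
)


def extract_timings(lines):
    for line in lines:
        for m in PATTERN.finditer(line):
            yield m.group("uuid"), m.group("what"), float(m.group("seconds"))


if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        for uuid, what, seconds in extract_timings(f):
            print(f"{uuid}: {seconds:.2f}s to {what}")
```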
Apr 17 17:35:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b8d006cb-4032-4d3c-8346-706d2145423a tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.181s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:35:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG nova.compute.manager [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-changed-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG nova.compute.manager [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Refreshing instance network info cache due to event network-changed-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] Acquiring lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] Acquired lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG nova.network.neutron [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Refreshing network info cache for port d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG nova.network.neutron [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updated VIF entry in instance network info cache for port d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG nova.network.neutron [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updating instance_info_cache with network_info: [{"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:00 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1c442952-cd55-490b-bb77-33272147fabe req-c1c72fb8-5e94-4f86-8fcc-b4d7dbf0e797 service nova] Releasing lock "refresh_cache-f8891b6c-e3ef-450f-883b-dbfbdb74695b" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:01 user nova-compute[71628]: INFO nova.compute.manager [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Terminating instance Apr 17 17:36:01 user nova-compute[71628]: DEBUG nova.compute.manager [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.compute.manager [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-vif-unplugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:02 user 
nova-compute[71628]: DEBUG nova.compute.manager [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] No waiting events found dispatching network-vif-unplugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.compute.manager [req-df403f32-ba94-4a68-b402-56842727c46e req-54f27350-1eac-46e0-ab88-c05fec7b4b5d service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-vif-unplugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:02 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Instance destroyed successfully. Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.objects.instance [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lazy-loading 'resources' on Instance uuid f8891b6c-e3ef-450f-883b-dbfbdb74695b {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T17:34:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-681317961',display_name='tempest-AttachSCSIVolumeTestJSON-server-681317961',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-681317961',id=6,image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0rkyKp0V5SZOhxPHyUEtqECuAVMZHyYgaOG9wpVWBKD1Xh0wi6GZsBNwJBXSWnsHld+PZfVbQH52x7R9rVDPGGBwRCaULIlIQR6yfqca5udcRvzL8Ig5c9JL2pXdUfPw==',key_name='tempest-keypair-1523922560',keypairs=,launch_index=0,launched_at=2023-04-17T17:34:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a6cbba34e8f449c39da5f07463fc4696',ramdisk_id='',reservation_id='r-34ik0y8e',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e8f092e-58b0-4283-9790-92d661c52d35',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-173041572',owner_user_name='tempest-AttachSCSIVolumeTestJSON-173041572-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:34:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='40fcde98cd664f57a18b27bfa71111e6',uuid=f8891b6c-e3ef-450f-883b-dbfbdb74695b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converting VIF {"id": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "address": "fa:16:3e:44:ee:b2", "network": {"id": "9768e882-d09d-4110-8ef9-1b7a3d6797f1", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1567452796-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.240", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a6cbba34e8f449c39da5f07463fc4696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd1ceb2db-ff", "ovs_interfaceid": "d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG os_vif [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1ceb2db-ff, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:02 user nova-compute[71628]: INFO os_vif [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:ee:b2,bridge_name='br-int',has_traffic_filtering=True,id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b,network=Network(9768e882-d09d-4110-8ef9-1b7a3d6797f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd1ceb2db-ff') Apr 17 17:36:02 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Deleting instance files 
/opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b_del Apr 17 17:36:02 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Deletion of /opt/stack/data/nova/instances/f8891b6c-e3ef-450f-883b-dbfbdb74695b_del complete Apr 17 17:36:02 user nova-compute[71628]: INFO nova.compute.manager [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 17 17:36:02 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:36:02 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:03 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Took 0.82 seconds to deallocate network for instance. Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-28d86d37-219a-4088-ba8f-19a89afd3f5c req-c60c073e-c1da-4e99-a925-d5d403fa0a20 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event network-vif-deleted-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:03 user nova-compute[71628]: INFO nova.compute.manager [req-28d86d37-219a-4088-ba8f-19a89afd3f5c req-c60c073e-c1da-4e99-a925-d5d403fa0a20 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Neutron deleted interface d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b; detaching it from the instance and deleting it from the info cache Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.network.neutron [req-28d86d37-219a-4088-ba8f-19a89afd3f5c req-c60c073e-c1da-4e99-a925-d5d403fa0a20 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-28d86d37-219a-4088-ba8f-19a89afd3f5c req-c60c073e-c1da-4e99-a925-d5d403fa0a20 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Detach interface failed, port_id=d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b, reason: Instance f8891b6c-e3ef-450f-883b-dbfbdb74695b could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:36:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.367s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:03 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Deleted allocations for instance f8891b6c-e3ef-450f-883b-dbfbdb74695b Apr 17 17:36:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a96f462-e661-42e3-9483-410e58d718cc tempest-AttachSCSIVolumeTestJSON-173041572 tempest-AttachSCSIVolumeTestJSON-173041572-project-member] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.248s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:04 user nova-compute[71628]: DEBUG nova.compute.manager [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received event 
network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] Acquiring lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] Lock "f8891b6c-e3ef-450f-883b-dbfbdb74695b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:04 user nova-compute[71628]: DEBUG nova.compute.manager [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] No waiting events found dispatching network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:04 user nova-compute[71628]: WARNING nova.compute.manager [req-dea4a134-4a17-41a9-bc14-1ba8add2ac36 req-7d592cd2-3df6-498c-abb3-6b170c5986d5 service nova] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Received unexpected event network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b for instance with vm_state deleted and task_state None. 
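The sequence above ends with "No waiting events found dispatching network-vif-plugged-..." followed by the WARNING about an unexpected event for an instance whose vm_state is already deleted: Neutron delivered the plug notification after the terminate path had finished and no waiter was registered. A simplified sketch of that register/wait/dispatch pattern follows; it is an assumption-laden illustration of the behaviour visible in the log, not Nova's actual InstanceEvents implementation.

```python
# Simplified sketch of the prepare/pop/dispatch pattern visible in the log:
# external events are matched against previously registered waiters; if no
# waiter exists, the event is logged as unexpected and dropped.
import threading

_waiters = {}  # (instance_uuid, event_name) -> threading.Event


def prepare_for_event(instance_uuid, event_name):
    ev = threading.Event()
    _waiters[(instance_uuid, event_name)] = ev
    return ev


def pop_event(instance_uuid, event_name):
    return _waiters.pop((instance_uuid, event_name), None)


def external_event(instance_uuid, event_name):
    waiter = pop_event(instance_uuid, event_name)
    if waiter is None:
        print(f"WARNING: received unexpected event {event_name} "
              f"for instance {instance_uuid}")
    else:
        waiter.set()  # wakes whoever is blocked in waiter.wait(timeout)


# The late event at the end of the terminate sequence above:
external_event("f8891b6c-e3ef-450f-883b-dbfbdb74695b",
               "network-vif-plugged-d1ceb2db-ffee-4a4f-88f7-fd36b41ace5b")
```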
Apr 17 17:36:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:17 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:36:17 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] VM Stopped (Lifecycle Event) Apr 17 17:36:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:17 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a60c027a-a422-4bf5-8da2-864abceb3cbb None None] [instance: f8891b6c-e3ef-450f-883b-dbfbdb74695b] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:36:25 user nova-compute[71628]: INFO nova.compute.claims [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Claim successful on node user Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.402s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Start building networks asynchronously for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:25 user nova-compute[71628]: INFO nova.compute.manager [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Terminating instance Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Allocating IP information in the background. 
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:36:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:36:25 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:36:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Creating image(s) Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "/opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "/opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "/opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.policy [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cbcda57441d43e0bb8dfee4768df2a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70d10a379e4e420e9c66476ae0b10507', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.160s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk 1073741824" returned: 0 in 0.051s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-unplugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] No waiting events found dispatching network-vif-unplugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [req-e58ebe88-fc39-4913-9cdb-7a611284ff3c req-f5b0429d-0d24-41b8-9d0d-b71105340e8c service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-unplugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Checking if we can resize image /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Instance destroyed successfully. 
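The records above show the Qcow2 image backend preparing the root disk for instance 724dac7a-d0c4-47c5-9faf-c32e8cab0459: the cached base image is probed with `qemu-img info`, and the instance disk is then created as a qcow2 overlay whose backing file is that base image. Below is a minimal sketch of the same CLI sequence outside Nova, using plain `subprocess`; the paths and the 1073741824-byte size are copied from the log, while the variable names and the script itself are illustrative and are not Nova's imagebackend code.

    import json
    import os
    import subprocess

    # Values taken verbatim from the log records above.
    BASE = "/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062"
    OVERLAY = "/opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk"
    SIZE = "1073741824"  # 1 GiB root disk, matching the size argument in the log

    # Same locale pinning as the logged "env LC_ALL=C LANG=C qemu-img ..." calls.
    env = {**os.environ, "LC_ALL": "C", "LANG": "C"}

    # Probe the cached base image; --force-share avoids taking a write lock on an
    # image that other instances may still be using as a backing file.
    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", BASE, "--force-share", "--output=json"], env=env))
    print(info["format"], info["virtual-size"])

    # Create the instance disk as a qcow2 overlay on top of the raw base image.
    subprocess.check_call(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", "backing_file=%s,backing_fmt=raw" % BASE,
         OVERLAY, SIZE],
        env=env)

In the log the `qemu-img info` invocations are additionally wrapped in `/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30`, which caps the child process's address space and CPU time before the command runs; the sketch above omits that wrapper.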
Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.objects.instance [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lazy-loading 'resources' on Instance uuid 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-2134912985',display_name='tempest-SnapshotDataIntegrityTests-server-2134912985',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-2134912985',id=7,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDS5IfCIqcw54T22sSQxOb57X9YP7qwRlmDDHcPViP1FL1hJ7H/9H8+CO7VXtYU/NHQ8MQKj/82S68fnJu/F6xW2G/RRQM9yVEEfwAaawrP+Adsdzv3kxe8r/cs2d9bFlg==',key_name='tempest-SnapshotDataIntegrityTests-784024705',keypairs=,launch_index=0,launched_at=2023-04-17T17:34:42Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='799065b9ead7462390f42db66b8db015',ramdisk_id='',reservation_id='r-wq36jbvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-1490752994',owner_user_name='tempest-SnapshotDataIntegrityTests-1490752994-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:34:42Z,user_data=None,user_id='508ea3148bac4da2bb9e832a227deebe',uuid=82155ce4-e6ec-4ca5-a5f1-0349af7a2678,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converting VIF {"id": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "address": "fa:16:3e:8e:19:bc", "network": {"id": "f4d56e40-4b5d-4ba1-9502-325184023eaa", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-94423770-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "799065b9ead7462390f42db66b8db015", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7be4b617-5c", "ovs_interfaceid": "7be4b617-5ccc-44ca-96b4-0b5866efaabf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG os_vif [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 
17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4b617-5c, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:36:26 user nova-compute[71628]: INFO os_vif [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:19:bc,bridge_name='br-int',has_traffic_filtering=True,id=7be4b617-5ccc-44ca-96b4-0b5866efaabf,network=Network(f4d56e40-4b5d-4ba1-9502-325184023eaa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7be4b617-5c') Apr 17 17:36:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Deleting instance files /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678_del Apr 17 17:36:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Deletion of /opt/stack/data/nova/instances/82155ce4-e6ec-4ca5-a5f1-0349af7a2678_del complete Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Cannot resize image /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.objects.instance [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'migration_context' on Instance uuid 724dac7a-d0c4-47c5-9faf-c32e8cab0459 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Ensure instance console log exists: /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:26 user nova-compute[71628]: INFO nova.compute.manager [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Took 0.97 seconds to destroy the instance on the hypervisor. Apr 17 17:36:26 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:36:26 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-c7165b3f-f0e6-4c7b-8704-f0dda2da4a80 req-a83234f7-74e7-4bdb-9e45-b43cfc25c54e service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-deleted-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:27 user nova-compute[71628]: INFO nova.compute.manager [req-c7165b3f-f0e6-4c7b-8704-f0dda2da4a80 req-a83234f7-74e7-4bdb-9e45-b43cfc25c54e service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Neutron deleted interface 7be4b617-5ccc-44ca-96b4-0b5866efaabf; detaching it from the instance and deleting it from the info cache Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.network.neutron [req-c7165b3f-f0e6-4c7b-8704-f0dda2da4a80 req-a83234f7-74e7-4bdb-9e45-b43cfc25c54e service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:27 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Took 0.58 seconds to deallocate network for instance. Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-c7165b3f-f0e6-4c7b-8704-f0dda2da4a80 req-a83234f7-74e7-4bdb-9e45-b43cfc25c54e service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Detach interface failed, port_id=7be4b617-5ccc-44ca-96b4-0b5866efaabf, reason: Instance 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.187s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Successfully created port: 69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 
337c511a-a2ed-484e-ab48-31618fa2755e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:27 user nova-compute[71628]: INFO nova.compute.manager [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] instance snapshotting Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:27 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Beginning live snapshot process Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.517s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:28 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Deleted allocations for instance 82155ce4-e6ec-4ca5-a5f1-0349af7a2678 Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cfaa0019-8fb9-4966-97f2-b66d70cedc92 tempest-SnapshotDataIntegrityTests-1490752994 tempest-SnapshotDataIntegrityTests-1490752994-project-member] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.292s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json -f 
qcow2" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Successfully updated port: 69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquired lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json -f qcow2" returned: 0 in 0.145s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7.delta 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.manager 
[req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Acquiring lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Lock "82155ce4-e6ec-4ca5-a5f1-0349af7a2678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.manager [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] No waiting events found dispatching network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:28 user nova-compute[71628]: WARNING nova.compute.manager [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Received unexpected event network-vif-plugged-7be4b617-5ccc-44ca-96b4-0b5866efaabf for instance with vm_state deleted and task_state None. Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.manager [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-changed-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.manager [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Refreshing instance network info cache due to event network-changed-69672cf1-09b2-4035-8125-023e26e1c6f6. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Acquiring lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7.delta 1073741824" returned: 0 in 0.063s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Quiescing instance not available: QEMU guest agent is not enabled. Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 
724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updating instance_info_cache with network_info: [{"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Releasing lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Instance network_info: |[{"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Acquired lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Refreshing network info cache for port 
69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Start _get_guest_xml network_info=[{"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:36:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:36:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
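The `network_info` handed to `_get_guest_xml` in the record above is a list of VIF dictionaries, and comparing it with the earlier "Converted object VIFOpenVSwitch(...)" record shows which fields survive the os-vif conversion: `devname` becomes `vif_name`, `network.bridge` becomes `bridge_name`, the MAC `address` carries over, and `details.port_filter` becomes `has_traffic_filtering`. The short sketch below walks the same structure and pulls out those fields; the literal values are copied from the logged VIF for port 69672cf1-09b2-4035-8125-023e26e1c6f6, while the loop itself is only an example, not Nova code.

    # VIF entry trimmed down from the network_info logged above (values verbatim,
    # keys reduced to the ones used below).
    network_info = [{
        "id": "69672cf1-09b2-4035-8125-023e26e1c6f6",
        "address": "fa:16:3e:2b:d9:2c",
        "devname": "tap69672cf1-09",
        "type": "ovs",
        "details": {"port_filter": True, "connectivity": "l2"},
        "network": {
            "id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa",
            "bridge": "br-int",
            "meta": {"mtu": 1442, "tunneled": True},
            "subnets": [{
                "cidr": "10.0.0.0/28",
                "ips": [{"address": "10.0.0.6", "type": "fixed"}],
            }],
        },
    }]

    for vif in network_info:
        fixed_ips = [ip["address"]
                     for subnet in vif["network"]["subnets"]
                     for ip in subnet["ips"]
                     if ip["type"] == "fixed"]
        print("%s on %s: mac=%s ips=%s mtu=%s filtered=%s" % (
            vif["devname"], vif["network"]["bridge"], vif["address"],
            ",".join(fixed_ips), vif["network"]["meta"]["mtu"],
            vif["details"]["port_filter"]))

Running it prints the tap device, bridge, MAC, fixed IP, and MTU for the port (tap69672cf1-09 on br-int, fa:16:3e:2b:d9:2c, 10.0.0.6, 1442), the same kind of values that appear in the VIFOpenVSwitch object logged for port 7be4b617-5ccc-44ca-96b4-0b5866efaabf earlier in this section.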
Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:36:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:36:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1425938846',display_name='tempest-AttachVolumeNegativeTest-server-1425938846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1425938846',id=13,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPYGM7TCIEea2NrfYqx3cHOOTL3mi2XVT7f+gq/ZodBX91kDRxNKVeDpbp07ToQ/gejuEPAZmv04W2fC3xC4FAc2yfJETAIg24f1z/1RTRoS+gxfXI31WCaXi4xYCRgVA==',key_name='tempest-keypair-466913357',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-adk0pmn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:36:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=724dac7a-d0c4-47c5-9faf-c32e8cab0459,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.objects.instance [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'pci_devices' on Instance uuid 724dac7a-d0c4-47c5-9faf-c32e8cab0459 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json" returned: 0 in 0.179s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] End _get_guest_xml xml= Apr 17 17:36:29 user nova-compute[71628]: 724dac7a-d0c4-47c5-9faf-c32e8cab0459 Apr 17 17:36:29 user nova-compute[71628]: instance-0000000d Apr 17 17:36:29 user nova-compute[71628]: 131072 Apr 17 17:36:29 user nova-compute[71628]: 1 Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-server-1425938846 Apr 17 17:36:29 user nova-compute[71628]: 2023-04-17 17:36:28 Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: 128 Apr 17 17:36:29 user nova-compute[71628]: 1 Apr 17 17:36:29 user nova-compute[71628]: 0 Apr 17 17:36:29 user nova-compute[71628]: 0 Apr 17 17:36:29 
user nova-compute[71628]: 1 Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846-project-member Apr 17 17:36:29 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846 Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: OpenStack Foundation Apr 17 17:36:29 user nova-compute[71628]: OpenStack Nova Apr 17 17:36:29 user nova-compute[71628]: 0.0.0 Apr 17 17:36:29 user nova-compute[71628]: 724dac7a-d0c4-47c5-9faf-c32e8cab0459 Apr 17 17:36:29 user nova-compute[71628]: 724dac7a-d0c4-47c5-9faf-c32e8cab0459 Apr 17 17:36:29 user nova-compute[71628]: Virtual Machine Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: hvm Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Nehalem Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: /dev/urandom Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: Apr 17 17:36:29 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:36:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1425938846',display_name='tempest-AttachVolumeNegativeTest-server-1425938846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1425938846',id=13,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPYGM7TCIEea2NrfYqx3cHOOTL3mi2XVT7f+gq/ZodBX91kDRxNKVeDpbp07ToQ/gejuEPAZmv04W2fC3xC4FAc2yfJETAIg24f1z/1RTRoS+gxfXI31WCaXi4xYCRgVA==',key_name='tempest-keypair-466913357',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-adk0pmn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:36:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=724dac7a-d0c4-47c5-9faf-c32e8cab0459,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG os_vif [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69672cf1-09, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69672cf1-09, col_values=(('external_ids', {'iface-id': '69672cf1-09b2-4035-8125-023e26e1c6f6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:d9:2c', 'vm-uuid': '724dac7a-d0c4-47c5-9faf-c32e8cab0459'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:29 user nova-compute[71628]: INFO os_vif [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No VIF found with MAC fa:16:3e:2b:d9:2c, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.neutron [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updated VIF entry in instance network info cache for port 69672cf1-09b2-4035-8125-023e26e1c6f6. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG nova.network.neutron [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updating instance_info_cache with network_info: [{"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-009f0cea-5a98-40ac-ae16-7a3cbf15d9c0 req-b32b15a2-8c23-4f0a-9fb4-06132f38cef3 service nova] Releasing lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk 
--force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] VM Resumed (Lifecycle Event) Apr 17 17:36:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-changed-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Refreshing instance network info cache due to event network-changed-4952b9cf-9376-4952-9f11-0a6d6f3355a5. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] Acquiring lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] Acquired lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.network.neutron [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Refreshing network info cache for port 4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-changed-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Refreshing instance network info cache due to event network-changed-653604b7-8213-4fd3-a733-26a32725aae2. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] Acquiring lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] Acquired lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.network.neutron [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Refreshing network info cache for port 653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] No waiting events found dispatching network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:33 user nova-compute[71628]: WARNING nova.compute.manager [req-3c323fb8-6cb7-4ee5-8fd7-bea22508cff4 req-c6225e78-de08-4fb3-af55-fd19f29905d5 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received unexpected event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 for instance with vm_state building and task_state spawning. 
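The pop_instance_event entries above show the external-event pattern these warnings come from: a waiter is registered under the per-instance "<uuid>-events" lock before an operation, and an incoming Neutron event either signals the matching waiter or, if none is registered, produces the "No waiting events found" line followed by the "Received unexpected event" warning. A much-simplified sketch of that waiter/pop pattern (illustration only, not Nova's implementation):

    import threading

    class InstanceEventRegistry:
        """Simplified waiter registry mirroring the prepare/pop pattern
        visible in the pop_instance_event entries above (illustration only,
        not Nova's implementation)."""

        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}          # (instance_uuid, event_name) -> Event

        def prepare(self, instance_uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def dispatch(self, instance_uuid, event_name):
            with self._lock:
                ev = self._waiters.pop((instance_uuid, event_name), None)
            if ev is None:
                print("No waiting events found dispatching %s for %s"
                      % (event_name, instance_uuid))
                return False
            ev.set()
            return True

    if __name__ == "__main__":
        reg = InstanceEventRegistry()
        uuid = "724dac7a-d0c4-47c5-9faf-c32e8cab0459"
        waiter = reg.prepare(uuid, "network-vif-plugged")
        reg.dispatch(uuid, "network-vif-plugged")   # signals the waiter
        waiter.wait(timeout=1)
        reg.dispatch(uuid, "network-vif-plugged")   # no waiter left: "unexpected"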
Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] No waiting events found dispatching network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:33 user nova-compute[71628]: WARNING nova.compute.manager [req-2b43f0bd-5f30-4c04-8866-b68a9e2be7c6 req-e627d7f6-6a4b-4dcc-b9f0-c3ae720f366c service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received unexpected event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 for instance with vm_state building and task_state spawning. Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Skipping quiescing instance: QEMU guest agent is not enabled. 
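The two "COPY block job progress" entries above come from is_job_complete, which reports the job as finished once the current cursor catches up with the final cursor (0 -> 43778048, then 43778048 -> 43778048). A minimal polling loop over the same idea; get_job_info here is a stand-in callable, not the libvirt API, and the end != 0 guard is a simplification:

    import time

    def wait_for_block_job(get_job_info, interval=0.5, timeout=60):
        """Poll a copy block job until its current cursor reaches the final
        cursor, i.e. the comparison the is_job_complete entries above log."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            cur, end = get_job_info()
            print("COPY block job progress, current cursor: %d final cursor: %d"
                  % (cur, end))
            if end != 0 and cur >= end:
                return True
            time.sleep(interval)
        return False

    if __name__ == "__main__":
        # Fake job that reports 0/43778048 once, then completion.
        progress = iter([(0, 43778048), (43778048, 43778048)])
        print(wait_for_block_job(lambda: next(progress), interval=0.01))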
Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:36:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:36:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8035MB free_disk=26.3941650390625GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": 
"0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Instance spawned successfully. 
Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] During sync_power_state the instance has a pending task (spawning). Skip. 
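The "Attempting to register defaults" entry above, together with the "Found default for ... of ..." entries that follow, records which image properties the driver fills in because the image did not set them. A simplified sketch of that fill-only-missing-keys step (not Nova's _register_undefined_instance_details; the defaults dict simply mirrors the values logged below):

    # Defaults as reported by the "Found default for ... of ..." entries below.
    DEFAULTS = {
        "hw_cdrom_bus": "ide",
        "hw_disk_bus": "virtio",
        "hw_input_bus": None,
        "hw_pointer_model": None,
        "hw_video_model": "virtio",
        "hw_vif_model": "virtio",
    }

    def register_undefined(image_props, defaults=DEFAULTS):
        """Return image_props with any missing property filled from defaults
        (simplified illustration of the registration step logged above)."""
        merged = dict(image_props)
        for key, value in defaults.items():
            merged.setdefault(key, value)
        return merged

    if __name__ == "__main__":
        # The cirros image in this run only pins hw_machine_type / hw_rng_model
        # (see the instance system_metadata earlier in the log), so every key
        # in DEFAULTS ends up coming from the defaults.
        print(register_undefined({"hw_machine_type": "pc",
                                  "hw_rng_model": "virtio"}))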
Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] VM Started (Lifecycle Event) Apr 17 17:36:33 user nova-compute[71628]: INFO nova.compute.manager [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Terminating instance Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None 
req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:36:33 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:36:33 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:36:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7.delta /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance e4d62df0-41e5-4351-a4de-5c0d88a9ab5f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 0711a965-58ba-4238-aa35-b7f3d762c97d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 71bc60a8-8430-4110-aa0a-0141b6cf2277 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance a3a06799-56ce-4121-93d7-e4f474afb487 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 35fe8580-9a68-44c2-8b86-9c28144bd2f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 724dac7a-d0c4-47c5-9faf-c32e8cab0459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 9 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1664MB phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:36:34 user nova-compute[71628]: INFO nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Took 7.92 seconds to spawn the instance on the hypervisor. Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:34 user nova-compute[71628]: INFO nova.compute.manager [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Took 8.79 seconds to build instance. 
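The resource tracker's "Final resource view" line above packs the host's capacity and usage into a single key=value string. A small, hypothetical helper (not part of Nova) for pulling those numbers out of the log, written against the exact format shown:

```python
import re

# Example line copied verbatim from the log above.
LINE = ("Final resource view: name=user phys_ram=16023MB used_ram=1664MB "
        "phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[]")

PATTERN = re.compile(
    r"phys_ram=(?P<phys_ram>\d+)MB used_ram=(?P<used_ram>\d+)MB "
    r"phys_disk=(?P<phys_disk>\d+)GB used_disk=(?P<used_disk>\d+)GB "
    r"total_vcpus=(?P<total_vcpus>\d+) used_vcpus=(?P<used_vcpus>\d+)")

m = PATTERN.search(LINE)
if m:
    # Convert every captured field to an int, e.g.
    # {'phys_ram': 16023, 'used_ram': 1664, ..., 'used_vcpus': 9}
    view = {k: int(v) for k, v in m.groupdict().items()}
    print(view)
```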
Apr 17 17:36:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7.delta /opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/6c358e218f334eb6a6d9e1af61dfa0e7" returned: 0 in 0.280s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:36:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Snapshot extracted, beginning image upload Apr 17 17:36:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a2f17701-8990-4ec3-9d51-92b0c5a6b746 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.894s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.669s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.network.neutron [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updated VIF entry in instance network info cache for port 653604b7-8213-4fd3-a733-26a32725aae2. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.network.neutron [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updating instance_info_cache with network_info: [{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9ed01576-9fd2-4d50-a137-4922efc7c7a6 req-b0426219-8704-4302-b230-7d1ceb1f9208 service nova] Releasing lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Instance destroyed successfully. 
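The snapshot path above runs `qemu-img convert` through oslo.concurrency's processutils wrapper, which is what emits the "Running cmd (subprocess)" and "CMD ... returned" pairs in this log. A hedged sketch of the same invocation, with the paths copied from the logged command; this is illustrative, not Nova's snapshot code:

```python
from oslo_concurrency import processutils

delta = ("/opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/"
         "6c358e218f334eb6a6d9e1af61dfa0e7.delta")
target = ("/opt/stack/data/nova/instances/snapshots/tmp6vl28h1u/"
          "6c358e218f334eb6a6d9e1af61dfa0e7")

# -t none selects cache mode "none" (direct I/O); -f/-O give the source and
# output image formats. processutils.execute() returns (stdout, stderr) and
# raises ProcessExecutionError on a non-zero exit code.
out, err = processutils.execute(
    "qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
    delta, target)
```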
Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.objects.instance [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'resources' on Instance uuid 71bc60a8-8430-4110-aa0a-0141b6cf2277 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2061242543',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2061242543',id=9,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIpnj8oCZHmjWh4/gDcz84BMNxab+jfpVXMZimwXu11yrREHMuJOd47ZIl/xfean6CCKPsZ2ZPqMHgkScca7uI2jKhP6nny/rcjVXtc4nflfFs6mX5AGKoSPNpUMg1KLOA==',key_name='tempest-keypair-1405748905',keypairs=,launch_index=0,launched_at=2023-04-17T17:34:48Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-9gwmmyo5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:34:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=71bc60a8-8430-4110-aa0a-0141b6cf2277,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG os_vif [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, 
port=tap4952b9cf-93, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:34 user nova-compute[71628]: INFO os_vif [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:1b:60,bridge_name='br-int',has_traffic_filtering=True,id=4952b9cf-9376-4952-9f11-0a6d6f3355a5,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4952b9cf-93') Apr 17 17:36:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Deleting instance files /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277_del Apr 17 17:36:34 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Deletion of /opt/stack/data/nova/instances/71bc60a8-8430-4110-aa0a-0141b6cf2277_del complete Apr 17 17:36:34 user nova-compute[71628]: INFO nova.compute.manager [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 17 17:36:34 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:36:34 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.network.neutron [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updated VIF entry in instance network info cache for port 4952b9cf-9376-4952-9f11-0a6d6f3355a5. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.network.neutron [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updating instance_info_cache with network_info: [{"id": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "address": "fa:16:3e:43:1b:60", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4952b9cf-93", "ovs_interfaceid": "4952b9cf-9376-4952-9f11-0a6d6f3355a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b5dca164-85a3-444d-ad42-e24524dc4ca3 req-c10b392c-7eff-4c7c-915a-18dfcf7a4b9d service nova] Releasing lock "refresh_cache-71bc60a8-8430-4110-aa0a-0141b6cf2277" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-unplugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.compute.manager 
[req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] No waiting events found dispatching network-vif-unplugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-unplugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Acquiring lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] No waiting events found dispatching network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:36:35 user nova-compute[71628]: WARNING nova.compute.manager [req-ea5dc44d-eff5-4938-977c-792e20adb5e1 req-fcbdac95-741f-49d1-a727-5f3136830ffc service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received unexpected event network-vif-plugged-4952b9cf-9376-4952-9f11-0a6d6f3355a5 for instance with vm_state active and task_state deleting. Apr 17 17:36:36 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:36 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Took 1.32 seconds to deallocate network for instance. 
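Unplugging the VIF above ends with an ovsdbapp transaction, DelPortCommand(port=tap4952b9cf-93, bridge=br-int, if_exists=True). For reference, a sketch of the equivalent manual operation; ovsdbapp speaks to ovsdb-server over the OVSDB protocol rather than shelling out, so this CLI form is only an inspection aid, not what os-vif executes:

```python
import subprocess


def del_port(bridge: str, port: str) -> None:
    # --if-exists makes the delete a no-op when the port is already gone,
    # matching if_exists=True in the logged transaction.
    subprocess.run(["ovs-vsctl", "--if-exists", "del-port", bridge, port],
                   check=True)


del_port("br-int", "tap4952b9cf-93")  # names taken from the log above
```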
Apr 17 17:36:36 user nova-compute[71628]: DEBUG nova.compute.manager [req-ab227435-e025-4ec0-acb5-7f9a68c071cb req-e703b2c3-aad4-43c6-bfee-d098b53612bb service nova] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Received event network-vif-deleted-4952b9cf-9376-4952-9f11-0a6d6f3355a5 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:36:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:36:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:36:36 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:36:36 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:36:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.354s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:36 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Deleted allocations for instance 71bc60a8-8430-4110-aa0a-0141b6cf2277 Apr 17 17:36:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a4c5f46b-f743-4712-80fb-e4a4e6e556a9 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "71bc60a8-8430-4110-aa0a-0141b6cf2277" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.729s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:36:36 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Snapshot image upload complete Apr 17 17:36:36 user nova-compute[71628]: INFO nova.compute.manager [None req-33c3ef12-583a-4b4b-91a7-08ae7901dbd3 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Took 8.97 seconds to snapshot the instance on the hypervisor. Apr 17 17:36:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:36:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:36:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:36:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:36:38 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updating instance_info_cache with network_info: [{"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": 
null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:36:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-0711a965-58ba-4238-aa35-b7f3d762c97d" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:36:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:36:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:36:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:41 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:36:41 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] VM Stopped (Lifecycle Event) Apr 17 17:36:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-abc21aad-8371-40ff-9674-56d7a8f2b3bf None None] [instance: 82155ce4-e6ec-4ca5-a5f1-0349af7a2678] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:49 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:36:49 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] VM Stopped (Lifecycle Event) Apr 17 17:36:49 user nova-compute[71628]: DEBUG nova.compute.manager [None req-1b158641-603a-44fd-bfb8-f2f09eb6f6c3 None None] [instance: 71bc60a8-8430-4110-aa0a-0141b6cf2277] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:36:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 
17:36:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:36:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 
17:37:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:13 user nova-compute[71628]: INFO nova.compute.manager [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Terminating instance Apr 17 17:37:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-vif-unplugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] No waiting events found dispatching network-vif-unplugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:14 user 
nova-compute[71628]: DEBUG nova.compute.manager [req-5e9c2338-ff87-4e74-b2c9-eda246b17259 req-87e4bf8a-8757-4867-8bdd-ea2094cfe685 service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-vif-unplugged-5e63b915-b438-4230-9a55-9c4791efa048 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Instance destroyed successfully. Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'resources' on Instance uuid a3a06799-56ce-4121-93d7-e4f474afb487 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1441565732',display_name='tempest-ServersNegativeTestJSON-server-1441565732',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1441565732',id=11,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:35:30Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-r0yxbsl1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,u
pdated_at=2023-04-17T17:35:31Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=a3a06799-56ce-4121-93d7-e4f474afb487,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "5e63b915-b438-4230-9a55-9c4791efa048", "address": "fa:16:3e:ab:2c:43", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e63b915-b4", "ovs_interfaceid": "5e63b915-b438-4230-9a55-9c4791efa048", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG os_vif [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e63b915-b4, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:14 user nova-compute[71628]: INFO os_vif [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:2c:43,bridge_name='br-int',has_traffic_filtering=True,id=5e63b915-b438-4230-9a55-9c4791efa048,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e63b915-b4') Apr 17 17:37:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Deleting instance files /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487_del Apr 17 17:37:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Deletion of /opt/stack/data/nova/instances/a3a06799-56ce-4121-93d7-e4f474afb487_del complete Apr 17 17:37:14 user nova-compute[71628]: INFO nova.compute.manager [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 17 17:37:14 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:37:14 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:15 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Took 0.47 seconds to deallocate network for instance. Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-666bf00d-1a8d-4e47-91a0-c75dd729a396 req-556872ab-143e-46c0-a85a-5cfb827540de service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event network-vif-deleted-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:15 user nova-compute[71628]: INFO nova.compute.manager [req-666bf00d-1a8d-4e47-91a0-c75dd729a396 req-556872ab-143e-46c0-a85a-5cfb827540de service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Neutron deleted interface 5e63b915-b438-4230-9a55-9c4791efa048; detaching it from the instance and deleting it from the info cache Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-666bf00d-1a8d-4e47-91a0-c75dd729a396 req-556872ab-143e-46c0-a85a-5cfb827540de service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-666bf00d-1a8d-4e47-91a0-c75dd729a396 req-556872ab-143e-46c0-a85a-5cfb827540de service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Detach interface failed, port_id=5e63b915-b438-4230-9a55-9c4791efa048, reason: Instance a3a06799-56ce-4121-93d7-e4f474afb487 could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.318s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:15 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Deleted allocations for instance a3a06799-56ce-4121-93d7-e4f474afb487 Apr 17 17:37:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-9014bc44-ef4d-4895-ac20-bd990ec61562 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "a3a06799-56ce-4121-93d7-e4f474afb487" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.635s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received event 
network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] Acquiring lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] Lock "a3a06799-56ce-4121-93d7-e4f474afb487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] No waiting events found dispatching network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:16 user nova-compute[71628]: WARNING nova.compute.manager [req-7ff6e69a-4e43-4bc5-be81-f934ae530d71 req-b4fc7772-5260-445e-bdee-ba2e6c99aeed service nova] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Received unexpected event network-vif-plugged-5e63b915-b438-4230-9a55-9c4791efa048 for instance with vm_state deleted and task_state None. 
Apr 17 17:37:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:22 user nova-compute[71628]: INFO nova.compute.manager [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Terminating instance Apr 17 17:37:22 user nova-compute[71628]: DEBUG nova.compute.manager [None 
req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-unplugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] No waiting events found dispatching network-vif-unplugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-3b2c7c9b-5807-4686-817b-01b43d23a41f req-254c4978-02d6-4660-bdbe-52e428116047 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-unplugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:37:23 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Instance destroyed successfully. Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.objects.instance [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lazy-loading 'resources' on Instance uuid e4d62df0-41e5-4351-a4de-5c0d88a9ab5f {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-208750553',display_name='tempest-ServerStableDeviceRescueTest-server-208750553',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-208750553',id=3,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAq+BzuxOK8loeWCX7+gm1vtBvHjKX/RB2/Ksbqc7d8Sd4w/uR+fvKlIbAKgPdXXHXmWGfsY644kMqiq2d9A+lV14DR/jIn5cY2HIczrH02adOJG295uKxk0lnYHrPw5Hw==',key_name='tempest-keypair-1425694538',keypairs=,launch_index=0,launched_at=2023-04-17T17:33:52Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='2b5443ac3e3e45888d6a42642e53c687',ramdisk_id='',reservation_id='r-zdz897w0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1249806725',owner_user_name='tempest-ServerStableDeviceRescueTest-1249806725-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:35:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a6e712adada44971a7fcac5fe1881883',uuid=e4d62df0-41e5-4351-a4de-5c0d88a9ab5f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": 
"ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.43", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converting VIF {"id": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "address": "fa:16:3e:22:b7:3b", "network": {"id": "ac40266d-4256-4260-b1ae-353bf8431bd0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1669869925-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.43", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2b5443ac3e3e45888d6a42642e53c687", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9b1ac87-92", "ovs_interfaceid": "f9b1ac87-92c9-4ca2-9721-54337c3c8811", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG os_vif [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9b1ac87-92, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:37:23 user nova-compute[71628]: INFO os_vif [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:b7:3b,bridge_name='br-int',has_traffic_filtering=True,id=f9b1ac87-92c9-4ca2-9721-54337c3c8811,network=Network(ac40266d-4256-4260-b1ae-353bf8431bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9b1ac87-92') Apr 17 17:37:23 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Deleting instance files /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f_del Apr 17 17:37:23 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Deletion of /opt/stack/data/nova/instances/e4d62df0-41e5-4351-a4de-5c0d88a9ab5f_del complete Apr 17 17:37:23 user nova-compute[71628]: INFO nova.compute.manager [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 17 17:37:23 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:37:23 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:37:24 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:25 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Took 1.37 seconds to deallocate network for instance. 
Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] Acquiring lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] No waiting events found dispatching network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:25 user nova-compute[71628]: WARNING nova.compute.manager [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received unexpected event network-vif-plugged-f9b1ac87-92c9-4ca2-9721-54337c3c8811 for instance with vm_state deleted and task_state None. 
Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-520467ce-1de3-4ed6-b4e4-cdab7785d44d req-b9b381ea-95a0-4490-98cf-82383d59d071 service nova] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Received event network-vif-deleted-f9b1ac87-92c9-4ca2-9721-54337c3c8811 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.251s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:25 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Deleted allocations for instance e4d62df0-41e5-4351-a4de-5c0d88a9ab5f Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-8a6c4a78-0f09-447a-a46d-86ae81813361 tempest-ServerStableDeviceRescueTest-1249806725 tempest-ServerStableDeviceRescueTest-1249806725-project-member] Lock "e4d62df0-41e5-4351-a4de-5c0d88a9ab5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.697s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e" acquired 
by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:25 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:37:25 user nova-compute[71628]: INFO nova.compute.claims [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Claim successful on node user Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:37:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.policy [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c9e3172f6aeb401cbea2e81c86c614fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c589ed2b5c4abf9fab75e4c36dc3b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:37:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Creating image(s) Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "/opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "/opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "/opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk 1073741824" returned: 0 in 0.045s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Cannot resize image /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.objects.instance [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'migration_context' on Instance uuid dceda472-fdb2-481b-8be3-10a3411b793e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Ensure instance console log exists: /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Successfully created port: cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Successfully updated port: cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquired lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-changed-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Refreshing instance network info cache due to event network-changed-cd1abbbd-2cd8-431f-bd32-4824d370714c. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] Acquiring lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:37:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updating instance_info_cache with network_info: [{"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Releasing lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Instance network_info: |[{"id": 
"cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] Acquired lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Refreshing network info cache for port cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Start _get_guest_xml network_info=[{"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:37:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:37:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 
tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1617736297',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1617736297',id=14,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB7oj3RfhOOZsiMYhbgELLph77i7sLLfkOvTH2ZYOx1HfgFf0nbxf2T7jQ+RnBva7hLR6sz80T8q12VwQM0sZjdDcA1VNWsYvyjABWtgWWgYzEZTZqbMPJqWLlnOgR0W6w==',key_name='tempest-keypair-501140789',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-4ux0ipqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:37:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=dceda472-fdb2-481b-8be3-10a3411b793e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:37:28 user 
nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.objects.instance [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'pci_devices' on Instance uuid dceda472-fdb2-481b-8be3-10a3411b793e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] End _get_guest_xml xml=
[guest domain XML not reproduced: the angle-bracketed markup was stripped when this log was captured, leaving only bare text nodes interleaved with repeated syslog prefixes; the values that survive include the instance UUID dceda472-fdb2-481b-8be3-10a3411b793e, libvirt domain name instance-0000000e, memory 131072 KiB, 1 vCPU, display name tempest-AttachVolumeShelveTestJSON-server-1617736297, owner tempest-AttachVolumeShelveTestJSON-993962804 / tempest-AttachVolumeShelveTestJSON-993962804-project-member, sysinfo vendor OpenStack Foundation, product OpenStack Nova, version 0.0.0, Virtual Machine, os type hvm, CPU model Nehalem, and RNG backend /dev/urandom]
{{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1617736297',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1617736297',id=14,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB7oj3RfhOOZsiMYhbgELLph77i7sLLfkOvTH2ZYOx1HfgFf0nbxf2T7jQ+RnBva7hLR6sz80T8q12VwQM0sZjdDcA1VNWsYvyjABWtgWWgYzEZTZqbMPJqWLlnOgR0W6w==',key_name='tempest-keypair-501140789',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-4ux0ipqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:37:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=dceda472-fdb2-481b-8be3-10a3411b793e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:37:28 user 
nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG os_vif [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd1abbbd-2c, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd1abbbd-2c, col_values=(('external_ids', {'iface-id': 'cd1abbbd-2cd8-431f-bd32-4824d370714c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:d5:c7', 'vm-uuid': 'dceda472-fdb2-481b-8be3-10a3411b793e'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:28 user nova-compute[71628]: INFO os_vif [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] No VIF found with MAC fa:16:3e:c6:d5:c7, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updated VIF entry in instance network info cache for port cd1abbbd-2cd8-431f-bd32-4824d370714c. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updating instance_info_cache with network_info: [{"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-91e16e40-cd39-4b4b-9936-ea1d6b89e082 req-517aef25-b63a-4207-bb87-91dc341c6ed2 service nova] Releasing lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) 
emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:37:29 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] VM Stopped (Lifecycle Event) Apr 17 17:37:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-916bce51-06d4-43ce-b36a-a8127742c840 None None] [instance: a3a06799-56ce-4121-93d7-e4f474afb487] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] No waiting events found dispatching network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:29 user nova-compute[71628]: WARNING nova.compute.manager [req-1e24af5b-0451-488b-b829-3d79fdcb1b49 req-05d007e4-bbb4-4bca-93b6-1ebfe5a11fc6 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received unexpected event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c for instance with vm_state building and task_state spawning. 
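The pop_instance_event lock choreography just above (acquire the "<uuid>-events" lock, pop, release, then "No waiting events found ... Received unexpected event") is easier to follow as a small model. The sketch below only illustrates that waiter-registry pattern and is not nova's actual code; EventDispatcher, prepare_for and dispatch are made-up names.

    import threading
    from collections import defaultdict

    class EventDispatcher:
        """Toy model of a waiter registry guarded by a per-instance lock."""

        def __init__(self):
            self._locks = defaultdict(threading.Lock)   # one lock per instance UUID
            self._waiters = defaultdict(dict)           # uuid -> {event name: Event}

        def prepare_for(self, uuid, event_name):
            # The spawning thread registers interest before triggering the action
            # (e.g. plugging the VIF) that will eventually produce the event.
            with self._locks[uuid]:
                waiter = threading.Event()
                self._waiters[uuid][event_name] = waiter
                return waiter

        def dispatch(self, uuid, event_name):
            # The external notification pops a matching waiter under the per-instance
            # lock; with no waiter registered, it is logged as unexpected and dropped.
            with self._locks[uuid]:
                waiter = self._waiters[uuid].pop(event_name, None)
            if waiter is None:
                print("WARNING: unexpected event %s for instance %s" % (event_name, uuid))
                return False
            waiter.set()
            return True

    uuid = "dceda472-fdb2-481b-8be3-10a3411b793e"
    event = "network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c"

    dispatcher = EventDispatcher()
    dispatcher.dispatch(uuid, event)            # no waiter yet -> "unexpected event", as logged
    waiter = dispatcher.prepare_for(uuid, event)
    dispatcher.dispatch(uuid, event)            # this time the registered waiter is woken
    print(waiter.is_set())                      # True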
Apr 17 17:37:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:37:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] VM Resumed (Lifecycle Event) Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:37:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Instance spawned successfully. 
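The ovsdbapp transaction a few entries back (AddBridgeCommand, then AddPortCommand and DbSetCommand on the Interface row) is what "Successfully plugged vif" summarizes. As a rough equivalent only, the same state could be set up with the ovs-vsctl calls sketched below; this is not the code path os-vif actually takes (it speaks OVSDB directly), and the datapath_type=system setting from the log is omitted. Port name, MAC, VM UUID and iface-id are copied from the entries above.

    import subprocess

    def run(cmd):
        # Echo and execute one command; check=True raises if ovs-vsctl fails.
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    port = "tapcd1abbbd-2c"
    external_ids = {
        "iface-id": "cd1abbbd-2cd8-431f-bd32-4824d370714c",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:c6:d5:c7",
        "vm-uuid": "dceda472-fdb2-481b-8be3-10a3411b793e",
    }

    # AddBridgeCommand(may_exist=True): create br-int only if it is missing.
    run(["ovs-vsctl", "--may-exist", "add-br", "br-int"])
    # AddPortCommand(may_exist=True): attach the tap device to br-int idempotently.
    run(["ovs-vsctl", "--may-exist", "add-port", "br-int", port])
    # DbSetCommand on the Interface row: record the Neutron port binding metadata.
    # Values are quoted so ovs-vsctl treats MACs and UUIDs as plain strings.
    run(["ovs-vsctl", "set", "Interface", port]
        + ['external_ids:{}="{}"'.format(k, v) for k, v in external_ids.items()])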
Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] 
[instance: dceda472-fdb2-481b-8be3-10a3411b793e] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:37:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:37:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] VM Started (Lifecycle Event) Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:37:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] During sync_power_state the instance has a pending task (spawning). Skip. 
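The repeated "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." pairs above come down to one guard: the lifecycle-driven sync does nothing while a task_state is set, even though the database still records power_state 0 and the hypervisor already reports 1. A condensed sketch of that decision, using a made-up helper name rather than nova's internals:

    # Numeric power states as they appear in the log: 0 = NOSTATE, 1 = RUNNING.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state, vm_state):
        """Describe what the lifecycle-driven sync step would do (sketch only)."""
        if task_state is not None:
            # "During sync_power_state the instance has a pending task (spawning).
            # Skip." -- whatever task is in flight owns the instance right now.
            return "skip: pending task ({})".format(task_state)
        if db_power_state != vm_power_state:
            # With no task in flight, the mismatch is written back to the database
            # and, depending on vm_state, may trigger corrective action on the guest.
            return "record power_state {} -> {} (vm_state={})".format(
                db_power_state, vm_power_state, vm_state)
        return "in sync: nothing to do"

    # The case logged for dceda472-...: building/spawning, DB says 0, VM says 1.
    print(sync_power_state(NOSTATE, RUNNING, task_state="spawning", vm_state="building"))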
Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] No waiting events found dispatching network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:31 user nova-compute[71628]: WARNING nova.compute.manager [req-43ae0e58-6462-4d1b-8448-ff394e00260f req-55c8b985-0c99-434b-92af-3dcf0f5ca46f service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received unexpected event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c for instance with vm_state building and task_state spawning. Apr 17 17:37:31 user nova-compute[71628]: INFO nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Took 5.38 seconds to spawn the instance on the hypervisor. 
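The resource audit started earlier ("Auditing locally available compute resources", followed by the long run of qemu-img probes below) sizes every instance disk with qemu-img info, run under oslo.concurrency's prlimit wrapper so a hostile or corrupt image cannot take more than 1 GiB of address space or 30 seconds of CPU. A sketch of a single probe using the exact command line the log records; how the JSON output is consumed afterwards is an assumption for illustration.

    import json
    import subprocess

    disk = "/opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk"

    # Same command line the compute service logs: prlimit caps the child process
    # at 1 GiB of address space and 30 s of CPU time before qemu-img runs.
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", disk, "--force-share", "--output=json",
    ]

    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    info = json.loads(out)
    # qemu-img reports both the virtual size and the bytes actually allocated,
    # which is what matters for qcow2 images that grow on demand.
    print(info["virtual-size"], info.get("actual-size"))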
Apr 17 17:37:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: INFO nova.compute.manager [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Took 6.11 seconds to build instance. Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d99ada2e-0790-4569-a0e4-097394be5640 tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.239s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 
user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG 
oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:37:33 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:37:34 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:37:34 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8386MB free_disk=26.440433502197266GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 0711a965-58ba-4238-aa35-b7f3d762c97d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 35fe8580-9a68-44c2-8b86-9c28144bd2f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 724dac7a-d0c4-47c5-9faf-c32e8cab0459 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance dceda472-fdb2-481b-8be3-10a3411b793e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:37:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.400s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:37:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:37:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "7924fef7-eb7b-4919-b22d-d048efe4d4a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "7924fef7-eb7b-4919-b22d-d048efe4d4a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:37:38 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] VM Stopped (Lifecycle Event) Apr 17 17:37:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-ea773920-5713-4292-830a-49317663058f None None] [instance: e4d62df0-41e5-4351-a4de-5c0d88a9ab5f] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:37:38 user nova-compute[71628]: INFO nova.compute.claims [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Claim successful on node user Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Allocating IP information in the background. 
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:37:39 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.policy [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3fc1bd85d647d7b1eabca4bf49d42f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63c872fffe164507ab615963a791bfb9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:37:39 user nova-compute[71628]: INFO nova.virt.block_device [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Booting with blank volume at /dev/vda Apr 17 17:37:39 user nova-compute[71628]: WARNING nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Volume id: 135cdd04-bc6b-41af-959b-3aecd7001e19 finished being created but its status is error. Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
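Note: the WARNING/ERROR pair above records Cinder reporting the blank boot volume 135cdd04-bc6b-41af-959b-3aecd7001e19 in "error" state, and the traceback that follows walks the call path from _prep_block_device down to _await_block_device_map_created. Below is a rough, hypothetical sketch of the kind of wait loop being exercised (not Nova's actual code; wait_for_volume and get_status are illustrative names), where an "error" status aborts immediately, which is consistent with the "0 seconds or 1 attempts" wording in the message:

    import time

    class VolumeNotCreated(Exception):
        pass

    def wait_for_volume(volume_id, get_status, max_attempts=60, interval=1.0):
        """Poll a volume until it is 'available'; bail out early on 'error'."""
        start = time.time()
        for attempt in range(1, max_attempts + 1):
            status = get_status(volume_id)
            if status == 'available':
                return
            if status == 'error':
                # An 'error' volume will never become available, so give up
                # right away instead of burning the remaining attempts.
                break
            time.sleep(interval)
        raise VolumeNotCreated(
            f"Volume {volume_id} did not finish being created even after we "
            f"waited {int(time.time() - start)} seconds or {attempt} attempts. "
            f"And its status is {status}.")

    # Stubbed status source that always reports 'error', mirroring the log above.
    try:
        wait_for_volume('135cdd04-bc6b-41af-959b-3aecd7001e19', lambda _vid: 'error')
    except VolumeNotCreated as exc:
        print(exc)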
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Traceback (most recent call last):
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] driver_block_device.attach_block_devices(
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] _log_and_attach(device)
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] bdm.attach(*attach_args, **attach_kwargs)
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] self.volume_id, self.attachment_id = self._create_volume(
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] self._call_wait_func(context, wait_func, volume_api, vol['id'])
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] with excutils.save_and_reraise_exception():
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] self.force_reraise()
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] raise self.value
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] wait_func(context, volume_id)
Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in
_await_block_device_map_created Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] nova.exception.VolumeNotCreated: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 17 17:37:39 user nova-compute[71628]: ERROR nova.compute.manager [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Apr 17 17:37:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Successfully created port: e045bd47-7428-478d-9a28-a44df42293b1 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Successfully updated port: e045bd47-7428-478d-9a28-a44df42293b1 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquired lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Received event network-changed-e045bd47-7428-478d-9a28-a44df42293b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.compute.manager [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Refreshing instance network info cache due to event network-changed-e045bd47-7428-478d-9a28-a44df42293b1. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] Acquiring lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Updating instance_info_cache with network_info: [{"id": "e045bd47-7428-478d-9a28-a44df42293b1", "address": "fa:16:3e:a1:74:bc", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape045bd47-74", "ovs_interfaceid": "e045bd47-7428-478d-9a28-a44df42293b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Releasing lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Instance network_info: |[{"id": "e045bd47-7428-478d-9a28-a44df42293b1", "address": "fa:16:3e:a1:74:bc", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape045bd47-74", "ovs_interfaceid": "e045bd47-7428-478d-9a28-a44df42293b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] Acquired lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.network.neutron [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Refreshing network info cache for port e045bd47-7428-478d-9a28-a44df42293b1 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG nova.compute.claims [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Aborting claim: {{(pid=71628) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.332s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Build of instance 7924fef7-eb7b-4919-b22d-d048efe4d4a8 aborted: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.utils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Build of instance 7924fef7-eb7b-4919-b22d-d048efe4d4a8 aborted: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71628) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 17 17:37:41 user nova-compute[71628]: ERROR nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Build of instance 7924fef7-eb7b-4919-b22d-d048efe4d4a8 aborted: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 7924fef7-eb7b-4919-b22d-d048efe4d4a8 aborted: Volume 135cdd04-bc6b-41af-959b-3aecd7001e19 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
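Note: the inventory payload logged repeatedly above ('VCPU' total 12 with allocation_ratio 4.0, 'MEMORY_MB' total 16023 with 512 reserved, 'DISK_GB' total 40) is what the report client compares against placement. A small sketch of the usual way to read it, assuming placement's capacity formula of (total - reserved) * allocation_ratio per resource class:

    # Reading the inventory dict from the log above; schedulable capacity per
    # resource class is (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable capacity = {capacity}")
    # VCPU: 48.0, MEMORY_MB: 15511.0, DISK_GB: 40.0 -- ample headroom, which is
    # why 7 of the host's 12 vCPUs can already be allocated ("Total usable
    # vcpus: 12, total allocated vcpus: 7" above) without failing claims.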
Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Unplugging VIFs for instance {{(pid=71628) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:37:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-543378232',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-543378232',id=15,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-vu0somlh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:37:39Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=7924fef7-eb7b-4919-b22d-d048efe4d4a8,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e045bd47-7428-478d-9a28-a44df42293b1", "address": "fa:16:3e:a1:74:bc", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tape045bd47-74", "ovs_interfaceid": "e045bd47-7428-478d-9a28-a44df42293b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "e045bd47-7428-478d-9a28-a44df42293b1", "address": "fa:16:3e:a1:74:bc", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape045bd47-74", "ovs_interfaceid": "e045bd47-7428-478d-9a28-a44df42293b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:74:bc,bridge_name='br-int',has_traffic_filtering=True,id=e045bd47-7428-478d-9a28-a44df42293b1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape045bd47-74') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG os_vif [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:74:bc,bridge_name='br-int',has_traffic_filtering=True,id=e045bd47-7428-478d-9a28-a44df42293b1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape045bd47-74') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape045bd47-74, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:37:41 user nova-compute[71628]: INFO os_vif [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:74:bc,bridge_name='br-int',has_traffic_filtering=True,id=e045bd47-7428-478d-9a28-a44df42293b1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape045bd47-74') Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Unplugged VIFs for instance {{(pid=71628) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Updated VIF entry in instance network info cache for port e045bd47-7428-478d-9a28-a44df42293b1. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Updating instance_info_cache with network_info: [{"id": "e045bd47-7428-478d-9a28-a44df42293b1", "address": "fa:16:3e:a1:74:bc", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape045bd47-74", "ovs_interfaceid": "e045bd47-7428-478d-9a28-a44df42293b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9851d057-7918-4c5d-9d90-cf2e4987b145 req-b795a4a7-89dd-4f5e-a83b-e1642e90ba86 service nova] Releasing lock "refresh_cache-7924fef7-eb7b-4919-b22d-d048efe4d4a8" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:37:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:41 user nova-compute[71628]: INFO nova.compute.manager [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 7924fef7-eb7b-4919-b22d-d048efe4d4a8] Took 0.50 seconds to deallocate network for instance. 
Apr 17 17:37:41 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Deleted allocations for instance 7924fef7-eb7b-4919-b22d-d048efe4d4a8 Apr 17 17:37:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-a654f3a5-5bca-448e-a45c-26b4e794065d tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "7924fef7-eb7b-4919-b22d-d048efe4d4a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.285s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 2098-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c 
tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:43 user nova-compute[71628]: INFO nova.compute.manager [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Terminating instance Apr 17 17:37:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:37:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-unplugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] No waiting 
events found dispatching network-vif-unplugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.compute.manager [req-e7fcfb40-c39a-4961-98a0-309393d68ca0 req-5b18e2d4-0688-4550-85ac-43e12a953022 service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-unplugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:44 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Instance destroyed successfully. Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.objects.instance [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'resources' on Instance uuid 35fe8580-9a68-44c2-8b86-9c28144bd2f1 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:35:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-996763478',display_name='tempest-VolumesAdminNegativeTest-server-996763478',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-996763478',id=12,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:35:58Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-4t0vnoog',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_na
me='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:35:58Z,user_data=None,user_id='cb9f6038c3d94f4b8176f52308996012',uuid=35fe8580-9a68-44c2-8b86-9c28144bd2f1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "address": "fa:16:3e:df:11:e4", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bf971a6-cc", "ovs_interfaceid": "5bf971a6-cc65-49ba-a2d2-4bb6ac641771", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG os_vif [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 
tempest-VolumesAdminNegativeTest-1858597906-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bf971a6-cc, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:37:44 user nova-compute[71628]: INFO os_vif [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:11:e4,bridge_name='br-int',has_traffic_filtering=True,id=5bf971a6-cc65-49ba-a2d2-4bb6ac641771,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5bf971a6-cc') Apr 17 17:37:44 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Deleting instance files /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1_del Apr 17 17:37:44 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Deletion of /opt/stack/data/nova/instances/35fe8580-9a68-44c2-8b86-9c28144bd2f1_del complete Apr 17 17:37:44 user nova-compute[71628]: INFO nova.compute.manager [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 17 17:37:44 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:37:44 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:37:45 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Took 0.50 seconds to deallocate network for instance. Apr 17 17:37:45 user nova-compute[71628]: DEBUG nova.compute.manager [req-8170e32e-cfcb-4149-9d5f-8f6b88f48f4a req-6753e259-6de2-4bca-9784-df7ab9b505bb service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-deleted-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:37:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c 
tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.233s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:45 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Deleted allocations for instance 35fe8580-9a68-44c2-8b86-9c28144bd2f1 Apr 17 17:37:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1c6ccc69-7185-4fd8-bb32-bcb37fac645c tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.556s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:37:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] Acquiring lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:37:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:37:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] Lock "35fe8580-9a68-44c2-8b86-9c28144bd2f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:37:46 user nova-compute[71628]: DEBUG nova.compute.manager [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] No waiting events found dispatching network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:37:46 user nova-compute[71628]: WARNING nova.compute.manager [req-71555b74-d772-4b52-b900-85f4e60f9a68 req-3153d677-0693-4b6e-a1a6-7e922828e31e service nova] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Received unexpected event network-vif-plugged-5bf971a6-cc65-49ba-a2d2-4bb6ac641771 for instance with vm_state deleted and task_state None. 
Apr 17 17:37:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:37:59 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:37:59 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] VM Stopped (Lifecycle Event) Apr 17 17:37:59 user nova-compute[71628]: DEBUG nova.compute.manager [None req-c40bc81c-3d1b-4afb-bd35-db1b8baa9464 None None] [instance: 35fe8580-9a68-44c2-8b86-9c28144bd2f1] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:37:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:38:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-changed-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Refreshing instance network info cache due to event network-changed-69672cf1-09b2-4035-8125-023e26e1c6f6. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] Acquiring lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] Acquired lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG nova.network.neutron [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Refreshing network info cache for port 69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG nova.network.neutron [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updated VIF entry in instance network info cache for port 69672cf1-09b2-4035-8125-023e26e1c6f6. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG nova.network.neutron [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updating instance_info_cache with network_info: [{"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-706a5a7b-4973-46d6-a571-e44c740674be req-f945d7aa-5d5b-4510-982d-eaf04c9a71ce service nova] Releasing lock "refresh_cache-724dac7a-d0c4-47c5-9faf-c32e8cab0459" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:17 user nova-compute[71628]: INFO nova.compute.manager [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Terminating instance Apr 17 17:38:17 user nova-compute[71628]: DEBUG nova.compute.manager [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-unplugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] No waiting events found dispatching network-vif-unplugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-16bf91ad-83c6-4c78-8449-8240955e79ad req-aeca718a-dd36-40ec-b439-f34672afc971 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-unplugged-69672cf1-09b2-4035-8125-023e26e1c6f6 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:38:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Instance destroyed successfully. 
Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.objects.instance [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'resources' on Instance uuid 724dac7a-d0c4-47c5-9faf-c32e8cab0459 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:36:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1425938846',display_name='tempest-AttachVolumeNegativeTest-server-1425938846',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1425938846',id=13,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFPYGM7TCIEea2NrfYqx3cHOOTL3mi2XVT7f+gq/ZodBX91kDRxNKVeDpbp07ToQ/gejuEPAZmv04W2fC3xC4FAc2yfJETAIg24f1z/1RTRoS+gxfXI31WCaXi4xYCRgVA==',key_name='tempest-keypair-466913357',keypairs=,launch_index=0,launched_at=2023-04-17T17:36:34Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-adk0pmn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:36:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=724dac7a-d0c4-47c5-9faf-c32e8cab0459,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "69672cf1-09b2-4035-8125-023e26e1c6f6", "address": "fa:16:3e:2b:d9:2c", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69672cf1-09", "ovs_interfaceid": "69672cf1-09b2-4035-8125-023e26e1c6f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG os_vif [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69672cf1-09, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:18 user nova-compute[71628]: INFO os_vif [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:d9:2c,bridge_name='br-int',has_traffic_filtering=True,id=69672cf1-09b2-4035-8125-023e26e1c6f6,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69672cf1-09') Apr 17 17:38:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Deleting instance files /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459_del Apr 17 17:38:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Deletion of /opt/stack/data/nova/instances/724dac7a-d0c4-47c5-9faf-c32e8cab0459_del complete Apr 17 17:38:18 user nova-compute[71628]: INFO nova.compute.manager [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 17 17:38:18 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:38:18 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:38:19 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:19 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Took 0.98 seconds to deallocate network for instance. Apr 17 17:38:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:19 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:38:19 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:38:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.219s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:20 user nova-compute[71628]: INFO nova.scheduler.client.report [None 
req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Deleted allocations for instance 724dac7a-d0c4-47c5-9faf-c32e8cab0459 Apr 17 17:38:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-74f41fec-0747-48aa-aec8-6c851442419c tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.229s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] Acquiring lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] Lock "724dac7a-d0c4-47c5-9faf-c32e8cab0459-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] No waiting events found dispatching network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:20 user nova-compute[71628]: WARNING nova.compute.manager [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received unexpected event network-vif-plugged-69672cf1-09b2-4035-8125-023e26e1c6f6 for instance with vm_state deleted and task_state None. 
Apr 17 17:38:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-bc2074a1-2726-4471-b5c1-cdcd569539b6 req-dd0e7802-7693-4fd0-9d3b-bcb5b5a368f3 service nova] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Received event network-vif-deleted-69672cf1-09b2-4035-8125-023e26e1c6f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:26 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:38:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share 
--output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:29 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:38:30 user nova-compute[71628]: INFO nova.compute.claims [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Claim successful on node user Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:38:30 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:38:30 user 
nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.772s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:38:31 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:38:31 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
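[Editor's note] The resource-audit entries above repeatedly run "qemu-img info ... --force-share --output=json" under oslo_concurrency.prlimit, capping the child process at 1 GiB of address space and 30 s of CPU time. A hedged sketch of the same invocation through oslo.concurrency's processutils follows; the helper name and the example path are placeholders, not Nova's implementation.

from oslo_concurrency import processutils

# Resource limits matching the --as=1073741824 --cpu=30 values in the log.
QEMU_IMG_LIMITS = processutils.ProcessLimits(address_space=1024 ** 3,
                                             cpu_time=30)

def qemu_img_info_json(path):
    # Runs qemu-img info in JSON mode with a stable C locale, as logged above.
    out, _err = processutils.execute(
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS,
        env_variables={'LC_ALL': 'C', 'LANG': 'C'})
    return out

if __name__ == '__main__':
    # Placeholder path; any image readable by qemu-img works here.
    print(qemu_img_info_json('/opt/stack/data/nova/instances/example/disk'))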
Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8503MB free_disk=26.45663833618164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:38:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 0711a965-58ba-4238-aa35-b7f3d762c97d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance dceda472-fdb2-481b-8be3-10a3411b793e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 9149e658-c82c-4562-be77-ce741c7cd48e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.policy [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3fc1bd85d647d7b1eabca4bf49d42f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63c872fffe164507ab615963a791bfb9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:38:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Creating image(s) Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "/opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "/opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "/opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on 
inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.159s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk 1073741824" returned: 0 in 0.077s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.212s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Cannot resize image /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.objects.instance [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'migration_context' on Instance uuid 9149e658-c82c-4562-be77-ce741c7cd48e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Ensure instance console log exists: /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:31 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:31 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Successfully created port: 3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:38:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Successfully updated port: 3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquired lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-changed-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} 
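[Editor's note] The "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw ... 1073741824" invocation logged a few entries above shows the copy-on-write layout Nova uses for the instance disk: a 1 GiB qcow2 overlay backed by the cached base image under instances/_base. The sketch below reproduces that command shape for illustration only (paths and size copied from the log; the helper is hypothetical, not Nova's imagebackend code).

from oslo_concurrency import processutils

BASE = '/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062'
DISK = '/opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk'

def create_qcow2_overlay(base, overlay, size_bytes):
    # Creates a qcow2 overlay whose reads fall through to the raw base image.
    processutils.execute(
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % base,
        overlay, str(size_bytes),
        env_variables={'LC_ALL': 'C', 'LANG': 'C'})

if __name__ == '__main__':
    create_qcow2_overlay(BASE, DISK, 1073741824)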
Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Refreshing instance network info cache due to event network-changed-3b0bc315-e7d4-4753-b2aa-490ef430bec1. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] Acquiring lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Updating instance_info_cache with network_info: [{"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] 
[instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Skipping network cache update for instance because it is Building. {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Releasing lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Instance network_info: |[{"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] Acquired lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Refreshing network info cache for port 3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Start _get_guest_xml network_info=[{"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:38:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:38:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 
tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:38:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1334242229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1334242229',id=16,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-r3edgp4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:38:31Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=9149e658-c82c-4562-be77-ce741c7cd48e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": 
"fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.objects.instance [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'pci_devices' on Instance uuid 9149e658-c82c-4562-be77-ce741c7cd48e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 
9149e658-c82c-4562-be77-ce741c7cd48e] End _get_guest_xml xml= [guest domain XML not preserved in this capture: the element markup was stripped on export, leaving only bare values such as domain name instance-00000010, 131072 KiB memory, 1 vCPU, the Nova metadata block for tempest-ServerBootFromVolumeStableRescueTest-server-1334242229 (flavor m1.nano: 128 MB RAM, 1 vCPU, 1 GB root disk), SMBIOS strings OpenStack Foundation / OpenStack Nova / 0.0.0, instance UUID 9149e658-c82c-4562-be77-ce741c7cd48e, machine type hvm, CPU model Nehalem, and RNG backend /dev/urandom] {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:38:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1334242229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1334242229',id=16,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-r3edgp4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:38:31Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=9149e658-c82c-4562-be77-ce741c7cd48e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id":
"63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG os_vif [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) 
do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b0bc315-e7, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b0bc315-e7, col_values=(('external_ids', {'iface-id': '3b0bc315-e7d4-4753-b2aa-490ef430bec1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:2b:d3', 'vm-uuid': '9149e658-c82c-4562-be77-ce741c7cd48e'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:33 user nova-compute[71628]: INFO os_vif [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] No VIF found with MAC fa:16:3e:71:2b:d3, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:38:33 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] VM Stopped (Lifecycle Event) Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-a1226889-bc64-45e2-836d-c76c9439f446 None None] [instance: 724dac7a-d0c4-47c5-9faf-c32e8cab0459] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Updated VIF entry in instance network info cache for port 3b0bc315-e7d4-4753-b2aa-490ef430bec1. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG nova.network.neutron [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Updating instance_info_cache with network_info: [{"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a9dca7-bd10-4046-b35a-0b97b133f974 req-4cf4dfbf-270f-409f-a182-c9ac04ca2234 service nova] Releasing lock "refresh_cache-9149e658-c82c-4562-be77-ce741c7cd48e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updating instance_info_cache with network_info: [{"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": 
"br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-b5fd68bf-3827-41f7-9ffa-ce1060e95f58" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:34 user nova-compute[71628]: INFO nova.compute.manager [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Terminating instance Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.compute.manager [req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-unplugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils 
[req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.compute.manager [req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] No waiting events found dispatching network-vif-unplugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG nova.compute.manager [req-cf679936-285c-485d-9751-57c96634e310 req-b13e18fe-6e23-429c-becc-82622e86a497 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-unplugged-358fa886-02f3-433a-a1af-d4d2bff8be35 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.compute.manager [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:35 user nova-compute[71628]: WARNING nova.compute.manager [req-1fb81a58-a628-42f4-8497-661a17e665ed req-b9ef2811-2f78-4262-8e12-8103ce77336c service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state building and task_state spawning. Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Instance destroyed successfully. 
Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.objects.instance [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lazy-loading 'resources' on Instance uuid 0711a965-58ba-4238-aa35-b7f3d762c97d {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-796323267',display_name='tempest-VolumesAdminNegativeTest-server-796323267',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-796323267',id=5,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNA3nzT/eQwwUVq7FVq+WQky0pPjQAJUFSsfHG4AY4cBLpdgzedNQS6Cc0CHRBOmzmW0iPUkSyxk2SpIdk+jLjZmz+UsqyxxI97a2YS5M9WcvyIhUz4nuSC3800u6FkZg==',key_name='tempest-keypair-1999371266',keypairs=,launch_index=0,launched_at=2023-04-17T17:34:10Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='bd1e8586846543c88d468bb6b705d4a6',ramdisk_id='',reservation_id='r-rzww6pwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-1858597906',owner_user_name='tempest-VolumesAdminNegativeTest-1858597906-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:34:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cb9f6038c3d94f4b8176f52308996012',uuid=0711a965-58ba-4238-aa35-b7f3d762c97d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converting VIF {"id": "358fa886-02f3-433a-a1af-d4d2bff8be35", "address": "fa:16:3e:a1:28:88", "network": {"id": "f1e38cf4-11c1-4f1d-a1f2-15d65da31617", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1171459644-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bd1e8586846543c88d468bb6b705d4a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap358fa886-02", "ovs_interfaceid": "358fa886-02f3-433a-a1af-d4d2bff8be35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG os_vif [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap358fa886-02, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:38:35 user nova-compute[71628]: INFO os_vif [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:28:88,bridge_name='br-int',has_traffic_filtering=True,id=358fa886-02f3-433a-a1af-d4d2bff8be35,network=Network(f1e38cf4-11c1-4f1d-a1f2-15d65da31617),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap358fa886-02') Apr 17 17:38:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Deleting instance files /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d_del Apr 17 17:38:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Deletion of /opt/stack/data/nova/instances/0711a965-58ba-4238-aa35-b7f3d762c97d_del complete Apr 17 17:38:35 user nova-compute[71628]: INFO nova.compute.manager [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Took 1.17 seconds to destroy the instance on the hypervisor. Apr 17 17:38:35 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:38:35 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:38:36 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Took 0.85 seconds to deallocate network for instance. Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:36 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Deleted allocations for instance 0711a965-58ba-4238-aa35-b7f3d762c97d Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] Acquiring lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] No waiting events found dispatching network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:36 user nova-compute[71628]: WARNING nova.compute.manager [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received unexpected event network-vif-plugged-358fa886-02f3-433a-a1af-d4d2bff8be35 for instance with vm_state deleted and task_state None. 
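Annotation: the inventory payload reported above for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 lists, per resource class, total, reserved, min_unit, max_unit, step_size and allocation_ratio. Placement derives schedulable capacity as (total - reserved) * allocation_ratio, so this node exposes 48 VCPU, 15511 MB of RAM and 40 GB of disk to the scheduler. A minimal sketch of that arithmetic follows; effective_capacity() is an illustrative helper, not a Nova or placement API.

# Sketch: how the inventory logged above translates into schedulable capacity.
# The dict mirrors the payload reported by nova.scheduler.client.report;
# effective_capacity() is an illustrative helper, not a Nova/placement API.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

def effective_capacity(inv):
    """Capacity the scheduler may allocate: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(effective_capacity(inventory))
# -> {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}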
Apr 17 17:38:36 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef3ff463-2940-4a83-9850-b7399282e1e5 req-2ca19e39-a38c-4b55-a2ef-c3bad92fc151 service nova] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Received event network-vif-deleted-358fa886-02f3-433a-a1af-d4d2bff8be35 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-81975eda-eaa3-4d0c-a4fa-d3a7278f04a3 tempest-VolumesAdminNegativeTest-1858597906 tempest-VolumesAdminNegativeTest-1858597906-project-member] Lock "0711a965-58ba-4238-aa35-b7f3d762c97d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.447s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:38:37 user nova-compute[71628]: WARNING nova.compute.manager [req-4b3ce02d-7407-4ab9-9699-c0b1e5734fb1 req-ab09b59c-f0a5-49b8-9321-2f88e05422b3 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state building and task_state spawning. 
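Annotation: the "Acquiring lock / Lock acquired / Lock released" triplets around names such as "compute_resources" and "<uuid>-events" come from oslo.concurrency, which Nova uses to serialize critical sections and which logs how long each lock was waited on and held. A rough sketch of the same pattern under that assumption; the function names below are illustrative, only lockutils itself is real.

# Sketch of the oslo.concurrency pattern behind the
# "Acquiring lock ... / Lock ... acquired ... / Lock ... released" lines.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_usage():
    # Only one green thread per worker mutates the resource tracker at a time.
    pass

def pop_event(instance_uuid):
    # Per-instance lock name, mirroring the "<uuid>-events" locks in the log.
    with lockutils.lock(f'{instance_uuid}-events'):
        pass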
Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] VM Resumed (Lifecycle Event) Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Instance spawned successfully. Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] During sync_power_state the instance has a pending task (spawning). Skip. 
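Annotation: the sync_power_state entries just above compare the database value (0) with what libvirt reports (1). Those integers are Nova's power-state constants; as best we recall from nova.compute.power_state, 0 is NOSTATE and 1 is RUNNING, which is why the handler only notes the pending 'spawning' task and skips reconciliation. Treat the mapping below as an assumption, reproduced from memory.

# Assumed mapping of Nova power-state integers (nova.compute.power_state,
# reproduced from memory, not verified against this tree).
POWER_STATES = {
    0: 'NOSTATE',    # DB power_state before the guest is up
    1: 'RUNNING',    # what libvirt reports once the domain starts
    3: 'PAUSED',
    4: 'SHUTDOWN',
    6: 'CRASHED',
    7: 'SUSPENDED',
}

db_state, vm_state = 0, 1
print(f"DB={POWER_STATES[db_state]} vs VM={POWER_STATES[vm_state]}")  # NOSTATE vs RUNNING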
Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] VM Started (Lifecycle Event) Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Synchronizing instance power state after lifecycle event 
"Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Took 6.21 seconds to spawn the instance on the hypervisor. Apr 17 17:38:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:38:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:37 user nova-compute[71628]: INFO nova.compute.manager [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Took 7.30 seconds to build instance. Apr 17 17:38:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-18bf80be-2659-4ab4-9bd5-66ee42785e6e tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.401s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:38:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:50 user 
nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:38:50 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] VM Stopped (Lifecycle Event) Apr 17 17:38:50 user nova-compute[71628]: DEBUG nova.compute.manager [None req-6b31de13-54ae-4551-9b0f-12f977f07fe0 None None] [instance: 0711a965-58ba-4238-aa35-b7f3d762c97d] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:38:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:38:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:00 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:39:12 user nova-compute[71628]: INFO nova.compute.claims [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Claim successful on node user Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:39:12 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:39:12 user nova-compute[71628]: DEBUG nova.policy [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cbcda57441d43e0bb8dfee4768df2a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70d10a379e4e420e9c66476ae0b10507', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:39:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Creating image(s) Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "/opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "/opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "/opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.186s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk 1073741824" returned: 0 in 0.048s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s 
{{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Checking if we can resize image /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Successfully created port: 32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Cannot resize image /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.objects.instance [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'migration_context' on Instance uuid 30e45106-0ac1-4580-9945-e90c1a410e21 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Ensure instance console log exists: /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Successfully updated port: 32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 
tempest-AttachVolumeNegativeTest-469494846-project-member] Acquired lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-changed-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Refreshing instance network info cache due to event network-changed-32bf69f3-b016-42ff-967d-6d437b60953a. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] Acquiring lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updating instance_info_cache with network_info: [{"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:39:14 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Releasing lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Instance network_info: |[{"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] Acquired lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Refreshing network info cache for port 32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Start _get_guest_xml network_info=[{"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", 
"ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:39:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:39:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1911310289',display_name='tempest-AttachVolumeNegativeTest-server-1911310289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1911310289',id=17,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAo9N9SBuJhclT8juo+8b12wFHt6NAGkiNJ7ehKBJpONVCEdE5A5Z07CTu/29gUxG4MbfJbnV7zB79yx1xwbUeMcly3EFdfbDUvERXf8MjB5TvVw2q1v1JpXpPJwCgl4Q==',key_name='tempest-keypair-1774007176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-3tds31ez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:39:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=30e45106-0ac1-4580-9945-e90c1a410e21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'pci_devices' on Instance uuid 30e45106-0ac1-4580-9945-e90c1a410e21 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] End _get_guest_xml xml= Apr 17 17:39:14 user nova-compute[71628]: 30e45106-0ac1-4580-9945-e90c1a410e21 Apr 17 17:39:14 user nova-compute[71628]: instance-00000011 Apr 17 17:39:14 user nova-compute[71628]: 131072 Apr 17 17:39:14 user nova-compute[71628]: 1 Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-server-1911310289 Apr 17 17:39:14 user nova-compute[71628]: 2023-04-17 17:39:14 Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: 128 Apr 17 17:39:14 user nova-compute[71628]: 1 Apr 17 17:39:14 user nova-compute[71628]: 0 Apr 17 17:39:14 user nova-compute[71628]: 0 Apr 17 17:39:14 user nova-compute[71628]: 1 Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846-project-member Apr 17 17:39:14 user nova-compute[71628]: tempest-AttachVolumeNegativeTest-469494846 Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: OpenStack Foundation Apr 17 17:39:14 user nova-compute[71628]: OpenStack Nova Apr 17 17:39:14 user nova-compute[71628]: 0.0.0 Apr 17 17:39:14 user 
nova-compute[71628]: 30e45106-0ac1-4580-9945-e90c1a410e21 Apr 17 17:39:14 user nova-compute[71628]: 30e45106-0ac1-4580-9945-e90c1a410e21 Apr 17 17:39:14 user nova-compute[71628]: Virtual Machine Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: hvm Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Nehalem Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: /dev/urandom Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: Apr 17 17:39:14 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1911310289',display_name='tempest-AttachVolumeNegativeTest-server-1911310289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1911310289',id=17,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAo9N9SBuJhclT8juo+8b12wFHt6NAGkiNJ7ehKBJpONVCEdE5A5Z07CTu/29gUxG4MbfJbnV7zB79yx1xwbUeMcly3EFdfbDUvERXf8MjB5TvVw2q1v1JpXpPJwCgl4Q==',key_name='tempest-keypair-1774007176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-3tds31ez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:39:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=30e45106-0ac1-4580-9945-e90c1a410e21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG os_vif [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32bf69f3-b0, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap32bf69f3-b0, col_values=(('external_ids', {'iface-id': '32bf69f3-b016-42ff-967d-6d437b60953a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:b3:d0', 'vm-uuid': '30e45106-0ac1-4580-9945-e90c1a410e21'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:14 user nova-compute[71628]: INFO os_vif [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:39:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] No VIF found with MAC fa:16:3e:08:b3:d0, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:39:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updated VIF entry in instance network info cache for port 32bf69f3-b016-42ff-967d-6d437b60953a. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:39:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updating instance_info_cache with network_info: [{"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:39:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-19fb9f29-68d6-44b5-a490-c1eebf25493c req-1bbb1b9a-0c90-4fd2-8d35-63db93e73346 service nova] Releasing lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:16 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] No waiting events found dispatching network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:39:16 user nova-compute[71628]: WARNING nova.compute.manager [req-f81551ac-6ef7-4ded-9979-8d65dd48ec28 req-66cf5d46-0755-4256-b802-88e131d085c1 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received unexpected event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a for instance with vm_state building and task_state spawning. Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-changed-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Refreshing instance network info cache due to event network-changed-cd1abbbd-2cd8-431f-bd32-4824d370714c. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] Acquiring lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] Acquired lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Refreshing network info cache for port cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updated VIF entry in instance network info cache for port cd1abbbd-2cd8-431f-bd32-4824d370714c. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG nova.network.neutron [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updating instance_info_cache with network_info: [{"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.4", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:39:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-eeeac4c1-49bb-406c-b2e2-40867af399a6 req-756ba61c-2873-40de-aa0b-17d0c9596092 service nova] Releasing lock "refresh_cache-dceda472-fdb2-481b-8be3-10a3411b793e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.driver [None 
req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] VM Resumed (Lifecycle Event) Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Instance spawned successfully. Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 
tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] VM Started (Lifecycle Event) Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Took 5.40 seconds to spawn the instance on the hypervisor. 
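The plug sequence recorded above (os_vif "Plugging vif ..." followed by the ovsdbapp AddBridgeCommand, AddPortCommand and DbSetCommand transactions) amounts to adding the tap device to br-int and stamping its Interface row with the Neutron port id, MAC and instance uuid in external_ids. Below is a minimal sketch of the same OVSDB transaction driven through ovsdbapp directly; the database socket path is an assumption, and the port name and external_ids values are copied from the log records, so read it as an illustration of the library pattern rather than Nova's own code.

# Sketch only: reproduce the plug-time OVSDB transaction seen in the log.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Assumption: local ovsdb-server socket; adjust for your deployment.
OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'

idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=5))

with api.transaction(check_error=True) as txn:
    # AddBridgeCommand / AddPortCommand / DbSetCommand, as logged above.
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tap32bf69f3-b0', may_exist=True))
    txn.add(api.db_set(
        'Interface', 'tap32bf69f3-b0',
        ('external_ids', {
            'iface-id': '32bf69f3-b016-42ff-967d-6d437b60953a',
            'iface-status': 'active',
            'attached-mac': 'fa:16:3e:08:b3:d0',
            'vm-uuid': '30e45106-0ac1-4580-9945-e90c1a410e21',
        })))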
Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] No waiting events found dispatching network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:39:18 user nova-compute[71628]: WARNING nova.compute.manager [req-833694f2-842d-4b81-9aa5-bdac7f8cdda3 req-0739d907-a9e9-4da1-b6b0-60eee51f818d service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received unexpected event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a for instance with vm_state building and task_state spawning. Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Took 6.10 seconds to build instance. 
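The "No waiting events found dispatching network-vif-plugged-... / Received unexpected event" warnings above come from the external-event waiter pattern: a waiter keyed on (instance, event name) is registered before the VIF is plugged, the first network-vif-plugged notification from Neutron pops and satisfies it, and any duplicate that arrives later finds nothing left to pop. The following is a deliberately simplified, self-contained sketch of that pattern (plain threading in place of Nova's eventlet-based machinery), not the actual nova.compute.manager implementation.

# Simplified illustration of register/pop semantics for external instance events.
import threading
from collections import defaultdict

class SimpleInstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = defaultdict(dict)   # instance uuid -> {event name: Event}

    def prepare(self, instance_uuid, event_name):
        ev = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = ev
        return ev

    def pop(self, instance_uuid, event_name):
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

events = SimpleInstanceEvents()
uuid = '30e45106-0ac1-4580-9945-e90c1a410e21'
name = 'network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a'
waiter = events.prepare(uuid, name)

def on_external_event(instance_uuid, event_name):
    ev = events.pop(instance_uuid, event_name)
    if ev is None:
        # Matches the WARNING lines in the log: no waiter left for this event.
        print(f'Received unexpected event {event_name} for instance {instance_uuid}')
    else:
        ev.set()

on_external_event(uuid, name)   # first notification satisfies the waiter
waiter.wait(timeout=1)
on_external_event(uuid, name)   # duplicate arrives later -> "unexpected event"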
Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-dbfc201d-c003-48a7-834d-43247589516e tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.191s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:18 user nova-compute[71628]: INFO nova.compute.manager [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Terminating instance Apr 17 17:39:18 user nova-compute[71628]: DEBUG nova.compute.manager [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-unplugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] No waiting events found dispatching network-vif-unplugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-unplugged-cd1abbbd-2cd8-431f-bd32-4824d370714c for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Acquiring lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] Lock "dceda472-fdb2-481b-8be3-10a3411b793e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] No waiting events found dispatching network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:39:19 user nova-compute[71628]: WARNING nova.compute.manager [req-e6de7afb-8069-46d1-aab3-43a7fed3f2f0 req-087e788b-39cf-4853-b350-e28037f4ad21 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received unexpected event network-vif-plugged-cd1abbbd-2cd8-431f-bd32-4824d370714c for instance with vm_state active and task_state deleting. Apr 17 17:39:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Instance destroyed successfully. 
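The teardown that follows mirrors the plug path: the cached network_info entry for port cd1abbbd-2cd8-431f-bd32-4824d370714c is converted back into an os-vif VIFOpenVSwitch object and handed to the 'ovs' plugin for unplugging. A rough sketch of that library call is below, using the documented os_vif entry points (initialize()/unplug()) with field values copied from the surrounding records; the real conversion is performed by Nova's nova_to_osvif_vif(), not hand-built like this, and the instance name simply reuses the display name from the log.

# Sketch only: hand-build the VIFOpenVSwitch seen in the unplug records and
# pass it to the os-vif 'ovs' plugin.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()

net = network.Network(id='e0742a03-1fe4-4912-8730-b7fd4fdc4bf3', bridge='br-int')
profile = vif.VIFPortProfileOpenVSwitch(
    interface_id='cd1abbbd-2cd8-431f-bd32-4824d370714c')
ovs_vif = vif.VIFOpenVSwitch(
    id='cd1abbbd-2cd8-431f-bd32-4824d370714c',
    address='fa:16:3e:c6:d5:c7',
    bridge_name='br-int',
    has_traffic_filtering=True,
    plugin='ovs',            # selects the 'ovs' os-vif plugin
    network=net,
    port_profile=profile,
    vif_name='tapcd1abbbd-2c')
inst = instance_info.InstanceInfo(
    uuid='dceda472-fdb2-481b-8be3-10a3411b793e',
    name='tempest-AttachVolumeShelveTestJSON-server-1617736297')

# Ends up removing the tap port from br-int, which the DelPortCommand
# transaction further down in the log shows happening.
os_vif.unplug(ovs_vif, inst)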
Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.objects.instance [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lazy-loading 'resources' on Instance uuid dceda472-fdb2-481b-8be3-10a3411b793e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:37:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1617736297',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1617736297',id=14,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB7oj3RfhOOZsiMYhbgELLph77i7sLLfkOvTH2ZYOx1HfgFf0nbxf2T7jQ+RnBva7hLR6sz80T8q12VwQM0sZjdDcA1VNWsYvyjABWtgWWgYzEZTZqbMPJqWLlnOgR0W6w==',key_name='tempest-keypair-501140789',keypairs=,launch_index=0,launched_at=2023-04-17T17:37:31Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='76c589ed2b5c4abf9fab75e4c36dc3b7',ramdisk_id='',reservation_id='r-4ux0ipqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-993962804',owner_user_name='tempest-AttachVolumeShelveTestJSON-993962804-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:37:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c9e3172f6aeb401cbea2e81c86c614fd',uuid=dceda472-fdb2-481b-8be3-10a3411b793e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "172.24.4.4", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converting VIF {"id": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "address": "fa:16:3e:c6:d5:c7", "network": {"id": "e0742a03-1fe4-4912-8730-b7fd4fdc4bf3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1075481871-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.4", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "76c589ed2b5c4abf9fab75e4c36dc3b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd1abbbd-2c", "ovs_interfaceid": "cd1abbbd-2cd8-431f-bd32-4824d370714c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG os_vif [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 
17:39:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd1abbbd-2c, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:39:19 user nova-compute[71628]: INFO os_vif [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:d5:c7,bridge_name='br-int',has_traffic_filtering=True,id=cd1abbbd-2cd8-431f-bd32-4824d370714c,network=Network(e0742a03-1fe4-4912-8730-b7fd4fdc4bf3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd1abbbd-2c') Apr 17 17:39:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Deleting instance files /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e_del Apr 17 17:39:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Deletion of /opt/stack/data/nova/instances/dceda472-fdb2-481b-8be3-10a3411b793e_del complete Apr 17 17:39:19 user nova-compute[71628]: INFO nova.compute.manager [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 17 17:39:19 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:39:19 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:39:20 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Took 0.79 seconds to deallocate network for instance. Apr 17 17:39:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-b3619d55-6866-4b7a-ad19-b2f924535f73 req-3703bb02-4220-4087-abe5-61a2b7e2a7e0 service nova] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Received event network-vif-deleted-cd1abbbd-2cd8-431f-bd32-4824d370714c {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:39:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb 
tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.281s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:20 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Deleted allocations for instance dceda472-fdb2-481b-8be3-10a3411b793e Apr 17 17:39:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-31f682e5-c7cb-4478-8381-bad39f0b4ffb tempest-AttachVolumeShelveTestJSON-993962804 tempest-AttachVolumeShelveTestJSON-993962804-project-member] Lock "dceda472-fdb2-481b-8be3-10a3411b793e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.128s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] There are 0 instances to clean {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 17:39:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:28 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:28 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:30 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:39:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:39:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:39:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8448MB free_disk=26.443927764892578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": 
"0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 9149e658-c82c-4562-be77-ce741c7cd48e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 30e45106-0ac1-4580-9945-e90c1a410e21 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:39:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.424s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 
17:39:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances with incomplete migration {{(pid=71628) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 17:39:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:34 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:39:34 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] VM Stopped (Lifecycle Event) Apr 17 17:39:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-bde5c578-7b87-43e6-9ede-444b6626fd95 None None] [instance: dceda472-fdb2-481b-8be3-10a3411b793e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:39:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:45 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Triggering sync for uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Triggering sync for uuid 337c511a-a2ed-484e-ab48-31618fa2755e {{(pid=71628) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Triggering sync for uuid 47d30b1a-fc08-4cad-8a2e-003b43251518 {{(pid=71628) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Triggering sync for uuid 9149e658-c82c-4562-be77-ce741c7cd48e {{(pid=71628) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Triggering sync for uuid 30e45106-0ac1-4580-9945-e90c1a410e21 {{(pid=71628) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:53 user nova-compute[71628]: 
DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.056s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.065s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.069s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:53 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.070s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:39:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:39:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:40:12 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:40:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:40:21 user nova-compute[71628]: INFO nova.compute.manager [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] instance snapshotting Apr 17 17:40:21 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Beginning live snapshot process Apr 17 17:40:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json -f qcow2" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json -f qcow2 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:22 user 
nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json -f qcow2" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819.delta 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819.delta 1073741824" returned: 0 in 0.048s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:22 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Quiescing instance not available: QEMU guest agent is not enabled. 
Apr 17 17:40:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:40:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.guest [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71628) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 17:40:23 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 17 17:40:23 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:40:23 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819.delta /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:24 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819.delta /opt/stack/data/nova/instances/snapshots/tmpa1njbkk4/ce9ec35f1ab64c12a92a61dcb7547819" returned: 0 in 0.320s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:24 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Snapshot extracted, beginning image upload Apr 17 17:40:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] 
Snapshot image upload complete Apr 17 17:40:26 user nova-compute[71628]: INFO nova.compute.manager [None req-0bfce399-1fb3-488b-9eec-c8bb6fac92d1 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Took 4.58 seconds to snapshot the instance on the hypervisor. Apr 17 17:40:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:40:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.138s 
{{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None 
None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:40:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:40:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
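For reference, a minimal sketch (not nova's resource-tracker code) of how the JSON returned by the repeated "qemu-img info --force-share --output=json" calls in the audit entries above can be reduced to the virtual and allocated sizes a disk-usage audit needs; the path in the usage comment is a hypothetical placeholder.

# Illustrative sketch, not nova's resource tracker. The audit entries above
# run "qemu-img info --force-share --output=json" against each instance disk;
# this shows how that JSON can be summed into provisioned vs. on-disk bytes.
import json
import subprocess

def disk_usage(disk_paths):
    # Sum the provisioned (virtual-size) and allocated (actual-size) bytes
    # reported by qemu-img for each disk path.
    total_virtual = 0
    total_allocated = 0
    for path in disk_paths:
        result = subprocess.run(
            ["env", "LC_ALL=C", "LANG=C", "qemu-img", "info", path,
             "--force-share", "--output=json"],
            check=True, capture_output=True, text=True)
        info = json.loads(result.stdout)
        total_virtual += info["virtual-size"]
        total_allocated += info.get("actual-size", 0)
    return total_virtual, total_allocated

# Usage with a hypothetical path:
#   disk_usage(["/opt/stack/data/nova/instances/<uuid>/disk"])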
Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8665MB free_disk=26.392723083496094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 9149e658-c82c-4562-be77-ce741c7cd48e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 30e45106-0ac1-4580-9945-e90c1a410e21 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing inventories for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating ProviderTree inventory for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing aggregate associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, aggregates: None {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing trait associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, traits: 
COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:40:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.684s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:40:34 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updating instance_info_cache with network_info: [{"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-47d30b1a-fc08-4cad-8a2e-003b43251518" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:40:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:40:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:40:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-changed-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Refreshing instance network info cache due to event network-changed-32bf69f3-b016-42ff-967d-6d437b60953a. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:41:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] Acquiring lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:41:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] Acquired lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:41:03 user nova-compute[71628]: DEBUG nova.network.neutron [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Refreshing network info cache for port 32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:41:04 user nova-compute[71628]: DEBUG nova.network.neutron [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updated VIF entry in instance network info cache for port 32bf69f3-b016-42ff-967d-6d437b60953a. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:41:04 user nova-compute[71628]: DEBUG nova.network.neutron [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updating instance_info_cache with network_info: [{"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.91", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-e2478900-fb0a-4aaf-853c-79bb9af24c11 req-bcb7fd7d-beb7-4b8a-a6e7-f8d26010db2a service nova] Releasing lock "refresh_cache-30e45106-0ac1-4580-9945-e90c1a410e21" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:41:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils 
[None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:05 user nova-compute[71628]: INFO nova.compute.manager [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Terminating instance Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.compute.manager [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.compute.manager [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-unplugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.compute.manager [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] No waiting events found dispatching network-vif-unplugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.compute.manager [req-419943ed-3b95-464e-87e5-be82c6468d3a req-bdc47a46-c135-460c-bf4e-ed6a04f6818f service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-unplugged-32bf69f3-b016-42ff-967d-6d437b60953a for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Instance destroyed successfully. Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.objects.instance [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lazy-loading 'resources' on Instance uuid 30e45106-0ac1-4580-9945-e90c1a410e21 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:39:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1911310289',display_name='tempest-AttachVolumeNegativeTest-server-1911310289',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1911310289',id=17,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAo9N9SBuJhclT8juo+8b12wFHt6NAGkiNJ7ehKBJpONVCEdE5A5Z07CTu/29gUxG4MbfJbnV7zB79yx1xwbUeMcly3EFdfbDUvERXf8MjB5TvVw2q1v1JpXpPJwCgl4Q==',key_name='tempest-keypair-1774007176',keypairs=,launch_index=0,launched_at=2023-04-17T17:39:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='70d10a379e4e420e9c66476ae0b10507',ramdisk_id='',reservation_id='r-3tds31ez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-469494846',owner_user_name='tempest-AttachVolumeNegativeTest-469494846-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:39:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6cbcda57441d43e0bb8dfee4768df2a8',uuid=30e45106-0ac1-4580-9945-e90c1a410e21,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.91", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converting VIF {"id": "32bf69f3-b016-42ff-967d-6d437b60953a", "address": "fa:16:3e:08:b3:d0", "network": {"id": "af0f7e21-142e-450a-9674-ea24c1cbc9aa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1104956119-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.91", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70d10a379e4e420e9c66476ae0b10507", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap32bf69f3-b0", "ovs_interfaceid": "32bf69f3-b016-42ff-967d-6d437b60953a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG os_vif [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32bf69f3-b0, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:05 user nova-compute[71628]: INFO os_vif [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:08:b3:d0,bridge_name='br-int',has_traffic_filtering=True,id=32bf69f3-b016-42ff-967d-6d437b60953a,network=Network(af0f7e21-142e-450a-9674-ea24c1cbc9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32bf69f3-b0') Apr 17 17:41:05 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Deleting instance files /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21_del Apr 17 17:41:05 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Deletion of /opt/stack/data/nova/instances/30e45106-0ac1-4580-9945-e90c1a410e21_del complete Apr 17 17:41:05 user nova-compute[71628]: INFO nova.compute.manager [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Took 0.71 seconds to destroy the instance on the hypervisor. Apr 17 17:41:05 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:41:05 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:06 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 
30e45106-0ac1-4580-9945-e90c1a410e21] Took 0.72 seconds to deallocate network for instance. Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.compute.manager [req-26b04a0c-7ccb-47a2-a90c-daa66f226913 req-1d645320-af94-4b17-8027-a6d0cfca154e service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-deleted-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:06 user nova-compute[71628]: INFO nova.compute.manager [req-26b04a0c-7ccb-47a2-a90c-daa66f226913 req-1d645320-af94-4b17-8027-a6d0cfca154e service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Neutron deleted interface 32bf69f3-b016-42ff-967d-6d437b60953a; detaching it from the instance and deleting it from the info cache Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.network.neutron [req-26b04a0c-7ccb-47a2-a90c-daa66f226913 req-1d645320-af94-4b17-8027-a6d0cfca154e service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.compute.manager [req-26b04a0c-7ccb-47a2-a90c-daa66f226913 req-1d645320-af94-4b17-8027-a6d0cfca154e service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Detach interface failed, port_id=32bf69f3-b016-42ff-967d-6d437b60953a, reason: Instance 30e45106-0ac1-4580-9945-e90c1a410e21 could not be found. {{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:41:06 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:06 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Deleted allocations for instance 30e45106-0ac1-4580-9945-e90c1a410e21 Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-d7369489-9edb-4bd8-8cfd-5fd8988bedf6 tempest-AttachVolumeNegativeTest-469494846 tempest-AttachVolumeNegativeTest-469494846-project-member] Lock "30e45106-0ac1-4580-9945-e90c1a410e21" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.830s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] No waiting events found dispatching network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:41:07 user nova-compute[71628]: WARNING 
nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received unexpected event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a for instance with vm_state deleted and task_state None. Apr 17 17:41:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Acquiring lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] Lock "30e45106-0ac1-4580-9945-e90c1a410e21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] No waiting events found dispatching network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:41:07 user nova-compute[71628]: WARNING nova.compute.manager [req-4477cc00-4cd8-4ca1-8427-ddab4cc320b7 req-c7a9df87-a7b0-4389-bc11-dbc92ffeb4f9 service nova] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Received unexpected event network-vif-plugged-32bf69f3-b016-42ff-967d-6d437b60953a for instance with vm_state deleted and task_state None. 
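The two WARNING entries above are benign: Neutron delivered network-vif-plugged events for port 32bf69f3-b016-42ff-967d-6d437b60953a after the instance had already been deleted, so the compute manager finds no waiter and only logs the event. Earlier in the same teardown, os-vif removed that port through a single ovsdbapp transaction (DelPortCommand, bridge=br-int, if_exists=True) against the OVSDB endpoint that the reconnect messages below identify as tcp:127.0.0.1:6640. The snippet below is a minimal, standalone sketch of that one operation using ovsdbapp's Open_vSwitch API; the endpoint, bridge and port name are taken from the log, while the timeout and the script structure are assumptions, and this is not the code path Nova/os-vif actually run.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Values taken from the log above; adjust for your environment.
    OVSDB_ENDPOINT = 'tcp:127.0.0.1:6640'
    BRIDGE = 'br-int'
    PORT_NAME = 'tap32bf69f3-b0'

    # Connect the OVSDB IDL to the Open_vSwitch database and wrap it in the
    # high-level API object that exposes del_port() (DelPortCommand).
    idl = connection.OvsdbIdl.from_server(OVSDB_ENDPOINT, 'Open_vSwitch')
    conn = connection.Connection(idl=idl, timeout=5)
    ovs = impl_idl.OvsdbIdl(conn)

    # Same effect as the logged transaction: drop the tap port from br-int,
    # tolerating the case where it has already been removed.
    ovs.del_port(PORT_NAME, bridge=BRIDGE, if_exists=True).execute(check_error=True)
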
Apr 17 17:41:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:41:20 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] VM Stopped (Lifecycle Event) Apr 17 17:41:20 user nova-compute[71628]: DEBUG nova.compute.manager [None req-89ade828-1119-4739-ac3a-fd867ca0bf51 None None] [instance: 30e45106-0ac1-4580-9945-e90c1a410e21] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:25 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "5b441af0-f88c-4a64-a17a-662ea297a162" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "5b441af0-f88c-4a64-a17a-662ea297a162" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:41:28 user nova-compute[71628]: INFO nova.compute.claims [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Claim successful on node user Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Allocating IP information in the background. 
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:41:28 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:41:28 user nova-compute[71628]: INFO nova.virt.block_device [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Booting with volume-backed-image 82e42adf-a9f9-4d9b-9bd0-106a738b1690 at /dev/vda Apr 17 17:41:28 user nova-compute[71628]: DEBUG nova.policy [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3fc1bd85d647d7b1eabca4bf49d42f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63c872fffe164507ab615963a791bfb9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:41:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:41:29 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Successfully created port: 5aaacbb0-f013-49f3-9c52-854a0658f61b {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Successfully updated port: 5aaacbb0-f013-49f3-9c52-854a0658f61b {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquired lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Received event network-changed-5aaacbb0-f013-49f3-9c52-854a0658f61b {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Refreshing instance network info cache due to event network-changed-5aaacbb0-f013-49f3-9c52-854a0658f61b. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] Acquiring lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Updating instance_info_cache with network_info: [{"id": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "address": "fa:16:3e:84:87:50", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aaacbb0-f0", "ovs_interfaceid": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Releasing lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Instance network_info: |[{"id": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "address": "fa:16:3e:84:87:50", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aaacbb0-f0", "ovs_interfaceid": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] Acquired lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG nova.network.neutron [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Refreshing network info cache for port 5aaacbb0-f013-49f3-9c52-854a0658f61b {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:41:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG nova.network.neutron [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Updated VIF entry in instance network info cache for port 5aaacbb0-f013-49f3-9c52-854a0658f61b. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG nova.network.neutron [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Updating instance_info_cache with network_info: [{"id": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "address": "fa:16:3e:84:87:50", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aaacbb0-f0", "ovs_interfaceid": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-6b6c1390-cd81-4c05-a0a8-70dea65472d3 req-e0f91455-4d46-4fb8-9456-8fa4f96ce536 service nova] Releasing lock "refresh_cache-5b441af0-f88c-4a64-a17a-662ea297a162" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:31 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.133s 
{{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:41:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:41:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8745MB free_disk=26.410980224609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", 
"address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 9149e658-c82c-4562-be77-ce741c7cd48e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 5b441af0-f88c-4a64-a17a-662ea297a162 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:41:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:41:33 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:41:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:41:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:41:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.375s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:34 user nova-compute[71628]: WARNING nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Volume id: 07418bc0-d312-4571-ae5f-fabddf0dc439 finished being created but its status is error. 
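The update_available_resource audit above inspects each instance disk by running qemu-img info under oslo.concurrency's prlimit wrapper (--as=1073741824, --cpu=30) so a misbehaving image cannot hang or exhaust the compute service. The sketch below reproduces that invocation with processutils.execute; the disk path is copied from the log, and the 1 GiB / 30 s caps mirror the logged command line rather than any configuration of your own.

    from oslo_concurrency import processutils
    from oslo_utils import units

    # Resource caps matching the logged command line:
    # --as=1073741824 (1 GiB address space) and --cpu=30 (30 s CPU time).
    QEMU_IMG_LIMITS = processutils.ProcessLimits(cpu_time=30,
                                                 address_space=1 * units.Gi)

    DISK_PATH = '/opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58/disk'

    # Effectively runs: python -m oslo_concurrency.prlimit --as=... --cpu=30 --
    #   env LC_ALL=C LANG=C qemu-img info <disk> --force-share --output=json
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', DISK_PATH,
        '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    print(out)
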
Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Traceback (most recent call last): Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] driver_block_device.attach_block_devices( Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] _log_and_attach(device) Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] bdm.attach(*attach_args, **attach_kwargs) Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] self.volume_id, self.attachment_id = self._create_volume( Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] with excutils.save_and_reraise_exception(): Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] self.force_reraise() Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] raise self.value Apr 17 17:41:34 user 
nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] wait_func(context, volume_id) Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] nova.exception.VolumeNotCreated: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.claims [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Aborting claim: {{(pid=71628) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.256s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Build of instance 5b441af0-f88c-4a64-a17a-662ea297a162 aborted: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.utils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Build of instance 5b441af0-f88c-4a64-a17a-662ea297a162 aborted: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71628) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 17 17:41:34 user nova-compute[71628]: ERROR nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Build of instance 5b441af0-f88c-4a64-a17a-662ea297a162 aborted: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 5b441af0-f88c-4a64-a17a-662ea297a162 aborted: Volume 07418bc0-d312-4571-ae5f-fabddf0dc439 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
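The failure above is Nova timing out on Cinder: the volume backing instance 5b441af0 went to 'error', so after the configured wait (reported here as 5 seconds or 2 attempts, a budget normally set by the block_device_allocate_retries and block_device_allocate_retries_interval options in nova.conf) block-device setup raises VolumeNotCreated and the whole build is aborted with BuildAbortException. As a rough illustration of that pattern only, not Nova's actual _await_block_device_map_created, a minimal bounded poll might look like the sketch below; get_status and VolumeNotReady are placeholders, not Nova code.

```python
# Illustrative only: a bounded poll of a volume's status, mirroring the
# "waited N seconds or M attempts" behaviour in the traceback above.
import time


class VolumeNotReady(Exception):
    """Placeholder for the error the caller raises on timeout or failure."""


def wait_for_volume(get_status, volume_id, attempts=2, interval=3.0):
    """Poll until the volume is 'available'; fail fast on 'error' or timeout."""
    status = "unknown"
    for attempt in range(1, attempts + 1):
        status = get_status(volume_id)
        if status == "available":
            return attempt
        if status == "error":
            # This is the branch taken in the log: Cinder reported 'error'.
            raise VolumeNotReady(
                f"Volume {volume_id} entered 'error' after {attempt} attempt(s)")
        time.sleep(interval)
    raise VolumeNotReady(
        f"Volume {volume_id} still '{status}' after {attempts} attempts "
        f"(~{attempts * interval:.0f}s)")
```

In this run the volume went straight to 'error', so a loop of this shape bails out on an early attempt; that is what surfaces as VolumeNotCreated and then BuildAbortException in the records above.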
Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Unplugging VIFs for instance {{(pid=71628) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:41:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1636052485',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1636052485',id=18,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-ymdzb650',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:41:29Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=5b441af0-f88c-4a64-a17a-662ea297a162,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "address": "fa:16:3e:84:87:50", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap5aaacbb0-f0", "ovs_interfaceid": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting VIF {"id": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "address": "fa:16:3e:84:87:50", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aaacbb0-f0", "ovs_interfaceid": "5aaacbb0-f013-49f3-9c52-854a0658f61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:87:50,bridge_name='br-int',has_traffic_filtering=True,id=5aaacbb0-f013-49f3-9c52-854a0658f61b,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aaacbb0-f0') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG os_vif [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:87:50,bridge_name='br-int',has_traffic_filtering=True,id=5aaacbb0-f013-49f3-9c52-854a0658f61b,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aaacbb0-f0') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5aaacbb0-f0, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:41:34 user nova-compute[71628]: INFO os_vif [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:87:50,bridge_name='br-int',has_traffic_filtering=True,id=5aaacbb0-f013-49f3-9c52-854a0658f61b,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aaacbb0-f0') Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Unplugged VIFs for instance {{(pid=71628) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:41:34 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG nova.network.neutron [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:41:35 user nova-compute[71628]: INFO nova.compute.manager [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 5b441af0-f88c-4a64-a17a-662ea297a162] Took 0.65 seconds to deallocate network for instance. 
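With the build aborted, the cleanup path above unplugs the VIF, removes the tap port from br-int (the DelPortCommand transaction) and deallocates the Neutron port; the next record then deletes the instance's placement allocations. As a rough sketch of what that allocation cleanup amounts to on the wire, here is the Placement API call that Nova's report client wraps. The endpoint URL and token below are assumptions for illustration, not values taken from this log.

```python
# Rough illustration (not Nova's SchedulerReportClient): drop every allocation
# held by a consumer (the instance UUID) via the Placement API.
import requests

PLACEMENT = "http://controller:8778/placement"   # assumed endpoint for this deployment
TOKEN = "<keystone-token>"                       # placeholder auth token
INSTANCE = "5b441af0-f88c-4a64-a17a-662ea297a162"

resp = requests.delete(
    f"{PLACEMENT}/allocations/{INSTANCE}",
    headers={
        "X-Auth-Token": TOKEN,
        "OpenStack-API-Version": "placement 1.28",  # any modern microversion works for DELETE
    },
    timeout=10,
)
resp.raise_for_status()  # Placement answers 204 No Content on success
```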
Apr 17 17:41:35 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Deleted allocations for instance 5b441af0-f88c-4a64-a17a-662ea297a162 Apr 17 17:41:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-3fa74bee-25b5-4650-957f-4d06eb88ee7f tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "5b441af0-f88c-4a64-a17a-662ea297a162" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.188s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 1294-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:41:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:45 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:45 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:41:50 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:55 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:41:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:00 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 
tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:08 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:08 user nova-compute[71628]: INFO nova.compute.manager [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Terminating instance Apr 17 17:42:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.compute.manager [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-unplugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.compute.manager [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-unplugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.compute.manager [req-51a89c83-c839-4156-9875-48f8a4c57dab req-307ab286-5a01-4c10-9a98-5777decf7ae6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-unplugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Instance destroyed successfully. Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.objects.instance [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lazy-loading 'resources' on Instance uuid b5fd68bf-3827-41f7-9ffa-ce1060e95f58 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-720085354',display_name='tempest-ServersNegativeTestJSON-server-720085354',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-720085354',id=1,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:33:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b5413283bcdd4120a73a64d76459853a',ramdisk_id='',reservation_id='r-3uq27e85',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1842710030',owner_user_name='tempest-ServersNegativeTestJSON-1842710030-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:33:51Z,user_data=None,user_id='33f713b19cdf41bc9d56ee7cea3722ab',uuid=b5fd68bf-3827-41f7-9ffa-ce1060
e95f58,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converting VIF {"id": "dcd09a73-2587-46b6-95cc-57f1505c9993", "address": "fa:16:3e:46:8f:aa", "network": {"id": "19875ca0-5cb0-4629-aee7-43ab51e714bb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-24618206-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b5413283bcdd4120a73a64d76459853a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdcd09a73-25", "ovs_interfaceid": "dcd09a73-2587-46b6-95cc-57f1505c9993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG os_vif [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd09a73-25, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:09 user nova-compute[71628]: INFO os_vif [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:8f:aa,bridge_name='br-int',has_traffic_filtering=True,id=dcd09a73-2587-46b6-95cc-57f1505c9993,network=Network(19875ca0-5cb0-4629-aee7-43ab51e714bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdcd09a73-25') Apr 17 17:42:09 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Deleting instance files /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58_del Apr 17 17:42:09 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Deletion of /opt/stack/data/nova/instances/b5fd68bf-3827-41f7-9ffa-ce1060e95f58_del complete Apr 17 17:42:09 user nova-compute[71628]: INFO nova.compute.manager [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 17 17:42:09 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:42:09 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:42:10 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Took 0.99 seconds to deallocate network for instance. 
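The terminate sequence for instance b5fd68bf above follows the usual shape: destroy the domain on the hypervisor, delete the instance files under /opt/stack/data/nova/instances, unplug the VIF, deallocate the network, and then (in the records that follow) release the resource-tracker claim and the placement allocations. Purely as an illustration of the hypervisor-level step behind "Instance destroyed successfully", a bare libvirt-python equivalent might look like the sketch below; it deliberately ignores all of the volume, VIF and bookkeeping work Nova performs around it.

```python
# Minimal libvirt-level equivalent of the destroy step, not the LibvirtDriver code path.
import libvirt

UUID = "b5fd68bf-3827-41f7-9ffa-ce1060e95f58"

conn = libvirt.open("qemu:///system")
try:
    dom = conn.lookupByUUIDString(UUID)
    if dom.isActive():
        dom.destroy()  # hard power-off, as in the log's destroy step
    # Remove the persistent definition, including managed save / NVRAM leftovers.
    dom.undefineFlags(libvirt.VIR_DOMAIN_UNDEFINE_MANAGED_SAVE |
                      libvirt.VIR_DOMAIN_UNDEFINE_NVRAM)
finally:
    conn.close()
```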
Apr 17 17:42:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:42:10 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.188s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Deleted allocations for instance b5fd68bf-3827-41f7-9ffa-ce1060e95f58 Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-1cfa39eb-5158-403e-bfb6-1d089b355d27 tempest-ServersNegativeTestJSON-1842710030 tempest-ServersNegativeTestJSON-1842710030-project-member] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.195s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:11 user nova-compute[71628]: WARNING nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state deleted and task_state None. 
Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:11 user nova-compute[71628]: WARNING nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state deleted and task_state None. 
Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:11 user nova-compute[71628]: WARNING nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state deleted and task_state None. 
Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Acquiring lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] Lock "b5fd68bf-3827-41f7-9ffa-ce1060e95f58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] No waiting events found dispatching network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:11 user nova-compute[71628]: WARNING nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received unexpected event network-vif-plugged-dcd09a73-2587-46b6-95cc-57f1505c9993 for instance with vm_state deleted and task_state None. 
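The repeated "Received unexpected event network-vif-plugged ..." warnings above are benign here: Neutron keeps delivering port events through the external-events API after the instance has already been deleted, so when the compute manager pops its per-instance event waiters under the per-instance "-events" lock it finds nothing to dispatch and simply logs the event. The sketch below is a toy model of that pop-or-warn pattern, not Nova's InstanceEvents implementation.

```python
# Toy model of the pattern visible above: external events are matched against
# per-instance waiters under a lock; if nothing is waiting (e.g. the instance is
# already deleted), the event is just logged and dropped.
import threading
from collections import defaultdict

_lock = threading.Lock()
_waiters = defaultdict(dict)   # {instance_uuid: {event_name: threading.Event}}


def prepare_for_event(instance_uuid, event_name):
    """Register interest before triggering an operation that emits the event."""
    ev = threading.Event()
    with _lock:
        _waiters[instance_uuid][event_name] = ev
    return ev


def dispatch_external_event(instance_uuid, event_name):
    """Deliver an event from the network service; warn when nobody is waiting."""
    with _lock:
        ev = _waiters.get(instance_uuid, {}).pop(event_name, None)
    if ev is None:
        print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
    else:
        ev.set()
```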
Apr 17 17:42:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-1808c434-eb09-4fb9-9d97-f91a3ed95876 req-e537fff6-4f79-404e-8814-3516d692edd6 service nova] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Received event network-vif-deleted-dcd09a73-2587-46b6-95cc-57f1505c9993 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:19 user nova-compute[71628]: INFO nova.compute.manager [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Terminating instance Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 
tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-unplugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-unplugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-68e62ba3-6bad-4e0e-b6f9-6de1c3fd2616 req-dedfeb5d-42e9-4e22-8a06-38337f3be2dd service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-unplugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Instance destroyed successfully. Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.objects.instance [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'resources' on Instance uuid 9149e658-c82c-4562-be77-ce741c7cd48e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:38:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1334242229',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1334242229',id=16,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:38:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-r3edgp4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:40:26Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=9149e658-c82c-4562-be77-ce741c7cd48e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] 
Converting VIF {"id": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "address": "fa:16:3e:71:2b:d3", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0bc315-e7", "ovs_interfaceid": "3b0bc315-e7d4-4753-b2aa-490ef430bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG os_vif [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b0bc315-e7, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:42:19 user nova-compute[71628]: INFO os_vif [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:71:2b:d3,bridge_name='br-int',has_traffic_filtering=True,id=3b0bc315-e7d4-4753-b2aa-490ef430bec1,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0bc315-e7') Apr 17 17:42:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Deleting instance files /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e_del Apr 17 17:42:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Deletion of /opt/stack/data/nova/instances/9149e658-c82c-4562-be77-ce741c7cd48e_del complete Apr 17 17:42:19 user nova-compute[71628]: INFO nova.compute.manager [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 17 17:42:19 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:42:20 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Took 0.47 seconds to deallocate network for instance. 
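The unplug path above converts the Nova VIF dict into an os_vif VIFOpenVSwitch object, and the ovs plugin removes the tap device from br-int through the ovsdbapp transaction logged as DelPortCommand(..., if_exists=True). For reference, the sketch below shows an equivalent manual cleanup with ovs-vsctl; this is only an illustrative equivalent of that transaction, not what nova-compute itself executes.

    # Illustrative equivalent of the DelPortCommand transaction in the log:
    # remove the instance's tap port from br-int, ignoring it if already gone.
    import subprocess

    subprocess.run(
        ["ovs-vsctl", "--if-exists", "del-port", "br-int", "tap3b0bc315-e7"],
        check=True,
    )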
Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-4bfcaab5-aa77-43ee-908c-4e0364b754de req-054084f8-4d29-4ab0-9e41-507805766387 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-deleted-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:20 user nova-compute[71628]: INFO nova.compute.manager [req-4bfcaab5-aa77-43ee-908c-4e0364b754de req-054084f8-4d29-4ab0-9e41-507805766387 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Neutron deleted interface 3b0bc315-e7d4-4753-b2aa-490ef430bec1; detaching it from the instance and deleting it from the info cache Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.network.neutron [req-4bfcaab5-aa77-43ee-908c-4e0364b754de req-054084f8-4d29-4ab0-9e41-507805766387 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.compute.manager [req-4bfcaab5-aa77-43ee-908c-4e0364b754de req-054084f8-4d29-4ab0-9e41-507805766387 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Detach interface failed, port_id=3b0bc315-e7d4-4753-b2aa-490ef430bec1, reason: Instance 9149e658-c82c-4562-be77-ce741c7cd48e could not be found. {{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:42:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:20 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Deleted allocations for instance 9149e658-c82c-4562-be77-ce741c7cd48e Apr 17 17:42:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-cf701306-8e28-47d5-b6ca-49adadb27c39 tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "9149e658-c82c-4562-be77-ce741c7cd48e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.474s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:21 user nova-compute[71628]: WARNING nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 
service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state deleted and task_state None. Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:21 user nova-compute[71628]: WARNING nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state deleted and task_state None. 
Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:21 user nova-compute[71628]: WARNING nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state deleted and task_state None. 
Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Acquiring lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] Lock "9149e658-c82c-4562-be77-ce741c7cd48e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:21 user nova-compute[71628]: DEBUG nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] No waiting events found dispatching network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:42:21 user nova-compute[71628]: WARNING nova.compute.manager [req-5309ce08-db78-477e-aa96-d2da6923fc57 req-786d58d0-bff7-4035-b7a4-a5cc85566863 service nova] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Received unexpected event network-vif-plugged-3b0bc315-e7d4-4753-b2aa-490ef430bec1 for instance with vm_state deleted and task_state None. 
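Each Acquiring / acquired / "released" triple around _pop_event above comes from oslo.concurrency's in-process locks: the event dispatcher serializes access to the per-instance event table with a lock named "<instance uuid>-events". The sketch below shows that locking pattern in isolation, assuming an illustrative module-level _events dict rather than Nova's actual InstanceEvents class.

    # Minimal sketch of the per-instance "-events" lock pattern using
    # oslo.concurrency; the _events dict and pop logic are illustrative only.
    from oslo_concurrency import lockutils

    _events = {}  # {instance_uuid: {event_name: waiter}}

    def pop_instance_event(instance_uuid, event_name):
        with lockutils.lock(instance_uuid + "-events"):
            waiters = _events.get(instance_uuid) or {}
            # Returns None when no waiter is registered, which corresponds to
            # the "No waiting events found dispatching ..." DEBUG lines above.
            return waiters.pop(event_name, None)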
Apr 17 17:42:24 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:42:24 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] VM Stopped (Lifecycle Event) Apr 17 17:42:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:24 user nova-compute[71628]: DEBUG nova.compute.manager [None req-7becbe87-14b1-47db-95be-565156adce61 None None] [instance: b5fd68bf-3827-41f7-9ffa-ce1060e95f58] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:42:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71628) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:42:32 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:42:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:42:33 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
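The resource audit above shells out to qemu-img info with --force-share --output=json, wrapped in oslo.concurrency's prlimit helper (the --as=1073741824 --cpu=30 arguments cap address space at 1 GiB and CPU time at 30 s). The sketch below approximates the same call through oslo.concurrency with those caps; the disk path is one of the paths already logged above, and the snippet is an illustration rather than Nova's exact code path.

    # Sketch of the audited qemu-img call with the same resource caps seen in
    # the log (1 GiB address space, 30 s CPU time).
    import json
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1024 ** 3, cpu_time=30)
    out, _err = processutils.execute(
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info",
        "/opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e/disk",
        "--force-share", "--output=json",
        prlimit=limits,
    )
    info = json.loads(out)
    print(info.get("format"), info.get("virtual-size"))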
Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8922MB free_disk=26.467029571533203GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 337c511a-a2ed-484e-ab48-31618fa2755e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 47d30b1a-fc08-4cad-8a2e-003b43251518 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:42:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:42:34 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:42:34 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] VM Stopped (Lifecycle Event) Apr 17 17:42:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-08e38035-4fa0-41d3-aa61-da7c9f7b0778 None None] [instance: 9149e658-c82c-4562-be77-ce741c7cd48e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:42:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid 337c511a-a2ed-484e-ab48-31618fa2755e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [{"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} 
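The resource tracker pass logged at 17:42:33 above reports both the inventory it keeps in placement and the final resource view (used_ram=768MB, used_disk=2GB, used_vcpus=2 for the two instances still tracked). Schedulable capacity per resource class is (total - reserved) * allocation_ratio; a worked check of the logged figures:

    # Worked check of the capacity implied by the logged inventory and of the
    # used_ram value in the final resource view.
    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0

    # used_ram = reserved host memory + memory_mb of the two remaining
    # 128 MB instances: 512 + 2 * 128 = 768 MB, matching the log.
    print(512 + 2 * 128)      # 768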
Apr 17 17:42:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-337c511a-a2ed-484e-ab48-31618fa2755e" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:42:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:42:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:42:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:42:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:42:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:42:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:00 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:04 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 
tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:09 user nova-compute[71628]: INFO nova.compute.manager [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Terminating instance Apr 17 17:43:09 user nova-compute[71628]: DEBUG nova.compute.manager [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.compute.manager [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-unplugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.compute.manager [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-unplugged-25797ef5-1350-4d57-bd16-5c59918ca955 
{{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.compute.manager [req-62b236e7-deb1-43cf-859a-acde93c4d9bd req-e59bcfc4-9737-44c2-bac4-d17349b8bcf6 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-unplugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Instance destroyed successfully. Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.objects.instance [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lazy-loading 'resources' on Instance uuid 337c511a-a2ed-484e-ab48-31618fa2755e {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1097947059',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1097947059',id=8,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:34:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='63c872fffe164507ab615963a791bfb9',ramdisk_id='',reservation_id='r-rsq1vb6x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:36:37Z,user_data=None,user_id='4d3fc1bd85d647d7b1eabca4bf49d42f',uuid=337c511a-a2ed-484e-ab48-31618fa2755e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converting 
VIF {"id": "25797ef5-1350-4d57-bd16-5c59918ca955", "address": "fa:16:3e:1c:1a:f5", "network": {"id": "30e36505-103b-4c7d-8408-02de3c5258b5", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1865747356-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "63c872fffe164507ab615963a791bfb9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap25797ef5-13", "ovs_interfaceid": "25797ef5-1350-4d57-bd16-5c59918ca955", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG os_vif [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25797ef5-13, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:10 user nova-compute[71628]: INFO os_vif [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:1c:1a:f5,bridge_name='br-int',has_traffic_filtering=True,id=25797ef5-1350-4d57-bd16-5c59918ca955,network=Network(30e36505-103b-4c7d-8408-02de3c5258b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25797ef5-13') Apr 17 17:43:10 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Deleting instance files /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e_del Apr 17 17:43:10 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Deletion of /opt/stack/data/nova/instances/337c511a-a2ed-484e-ab48-31618fa2755e_del complete Apr 17 17:43:10 user nova-compute[71628]: INFO nova.compute.manager [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 17 17:43:10 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:43:11 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Took 0.52 seconds to deallocate network for instance. Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-cefb82e9-f10d-4076-bb71-e761b9505e35 req-0d633014-6cd1-4cac-a1a5-9bff28b42414 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-deleted-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:11 user nova-compute[71628]: INFO nova.compute.manager [req-cefb82e9-f10d-4076-bb71-e761b9505e35 req-0d633014-6cd1-4cac-a1a5-9bff28b42414 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Neutron deleted interface 25797ef5-1350-4d57-bd16-5c59918ca955; detaching it from the instance and deleting it from the info cache Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.network.neutron [req-cefb82e9-f10d-4076-bb71-e761b9505e35 req-0d633014-6cd1-4cac-a1a5-9bff28b42414 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.compute.manager [req-cefb82e9-f10d-4076-bb71-e761b9505e35 req-0d633014-6cd1-4cac-a1a5-9bff28b42414 service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Detach interface failed, port_id=25797ef5-1350-4d57-bd16-5c59918ca955, reason: Instance 337c511a-a2ed-484e-ab48-31618fa2755e could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:43:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:11 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Deleted allocations for instance 337c511a-a2ed-484e-ab48-31618fa2755e Apr 17 17:43:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-54b6e866-3408-47c1-9a9f-ae41d60ec6ec tempest-ServerBootFromVolumeStableRescueTest-1793110919 tempest-ServerBootFromVolumeStableRescueTest-1793110919-project-member] Lock "337c511a-a2ed-484e-ab48-31618fa2755e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.585s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG 
nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:12 user nova-compute[71628]: WARNING nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state deleted and task_state None. 
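For the inventory reported above for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, the capacity Placement can hand out per resource class follows the usual formula (total - reserved) * allocation_ratio. A small sketch of that arithmetic, using the figures from the log:

    # Inventory as reported by the resource tracker above.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        # Schedulable capacity = (total - reserved) * allocation_ratio
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0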
Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:12 user nova-compute[71628]: WARNING nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state deleted and task_state None. 
Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:12 user nova-compute[71628]: WARNING nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state deleted and task_state None. 
Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Acquiring lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] Lock "337c511a-a2ed-484e-ab48-31618fa2755e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] No waiting events found dispatching network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:12 user nova-compute[71628]: WARNING nova.compute.manager [req-09b1e279-48b5-4b9b-955d-a65fdcc97a5f req-6a611eec-8602-4445-a969-d3b81824257d service nova] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Received unexpected event network-vif-plugged-25797ef5-1350-4d57-bd16-5c59918ca955 for instance with vm_state deleted and task_state None. 
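The lock holders named in these records (for example pop_instance_event.._pop_event) are nested helper functions; their full Python qualified names carry a <locals> segment between the outer and inner name, which here shows up collapsed to a double dot. A rough standalone sketch, not Nova's actual code, of the pattern that produces the per-instance "-events" lock lines:

    from oslo_concurrency import lockutils

    def pop_instance_event(instance_uuid):
        def _pop_event():
            return None
        # Qualified names of nested helpers include "<locals>":
        print(_pop_event.__qualname__)  # pop_instance_event.<locals>._pop_event
        # Guard the helper with oslo.concurrency's lock decorator; its wrapper
        # ("inner" in lockutils.py) is what writes the Lock "...-events"
        # acquired / released DEBUG lines seen above.
        return lockutils.synchronized(instance_uuid + "-events")(_pop_event)()

    pop_instance_event("337c511a-a2ed-484e-ab48-31618fa2755e")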
Apr 17 17:43:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:43:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:25 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:43:25 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] VM Stopped (Lifecycle Event) Apr 17 17:43:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-80371e92-6455-489d-9a08-16a077bcf8d4 None None] [instance: 337c511a-a2ed-484e-ab48-31618fa2755e] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:43:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:29 user nova-compute[71628]: INFO nova.compute.manager [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Terminating instance Apr 17 17:43:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Start destroying the instance on the hypervisor. 
{{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-unplugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] No waiting events found dispatching network-vif-unplugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.compute.manager [req-8811c649-36ee-463f-b7e9-d8fdc0ace0a9 req-06aa98c3-5f63-4cce-b48b-ed78b1c3b4b0 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-unplugged-653604b7-8213-4fd3-a733-26a32725aae2 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:30 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Instance destroyed successfully. 
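The port removal during unplug (DelPortCommand on tap25797ef5-13 above, and on tap653604b7-82 just below) is an ovsdbapp transaction against the same OVSDB endpoint this process keeps probing, tcp:127.0.0.1:6640. A rough sketch using ovsdbapp's vsctl-style API; the connection setup follows the library's documented usage and is an assumption, not something taken from this deployment:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local ovsdb-server over the endpoint seen in the log.
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # Same shape as the logged transaction:
    # DelPortCommand(port=tap25797ef5-13, bridge=br-int, if_exists=True)
    api.del_port('tap25797ef5-13', bridge='br-int',
                 if_exists=True).execute(check_error=True)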
Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.objects.instance [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lazy-loading 'resources' on Instance uuid 47d30b1a-fc08-4cad-8a2e-003b43251518 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-652405357',display_name='tempest-ServerActionsTestJSON-server-652405357',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-652405357',id=10,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBKLOGr6Xl2ayt6JPa/BTov3dZz2x2DRGJJ0beWQ98scecwsWPA9PwlSpVMqk4AmV4xFimhLplkR3dpkkRnqF2vN+gOPfqvdsMSgfgxOtyYvO9m7kepkdN/F/4cbYypkVA==',key_name='tempest-keypair-1675299659',keypairs=,launch_index=0,launched_at=2023-04-17T17:34:48Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6952e4623737462a8b8f31ada0786922',ramdisk_id='',reservation_id='r-dvlv99yl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1425889987',owner_user_name='tempest-ServerActionsTestJSON-1425889987-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:34:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='8d573b61dc994a1fa6343b162ac67112',uuid=47d30b1a-fc08-4cad-8a2e-003b43251518,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converting VIF {"id": "653604b7-8213-4fd3-a733-26a32725aae2", "address": "fa:16:3e:02:a7:8c", "network": {"id": "c464cb4e-a191-4c7d-9110-f0fb81a3b9aa", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1783366923-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6952e4623737462a8b8f31ada0786922", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap653604b7-82", "ovs_interfaceid": "653604b7-8213-4fd3-a733-26a32725aae2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG os_vif [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:30 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap653604b7-82, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:30 user nova-compute[71628]: INFO os_vif [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:a7:8c,bridge_name='br-int',has_traffic_filtering=True,id=653604b7-8213-4fd3-a733-26a32725aae2,network=Network(c464cb4e-a191-4c7d-9110-f0fb81a3b9aa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653604b7-82') Apr 17 17:43:30 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Deleting instance files /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518_del Apr 17 17:43:30 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Deletion of /opt/stack/data/nova/instances/47d30b1a-fc08-4cad-8a2e-003b43251518_del complete Apr 17 17:43:30 user nova-compute[71628]: INFO nova.compute.manager [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 17 17:43:30 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:43:30 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:43:31 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Took 0.80 seconds to deallocate network for instance. Apr 17 17:43:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:43:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:31 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Deleted allocations for instance 47d30b1a-fc08-4cad-8a2e-003b43251518 Apr 17 17:43:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-7192e601-babb-473d-9416-d41e6006ec58 tempest-ServerActionsTestJSON-1425889987 tempest-ServerActionsTestJSON-1425889987-project-member] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.945s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] Acquiring lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] Lock "47d30b1a-fc08-4cad-8a2e-003b43251518-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] No waiting events found dispatching network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:32 user nova-compute[71628]: WARNING nova.compute.manager [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received unexpected event network-vif-plugged-653604b7-8213-4fd3-a733-26a32725aae2 for instance with vm_state deleted and task_state None. Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-934c51e2-76f2-41c7-9a8d-dceb53f1d15e req-f6830fc1-074b-4455-b942-852af1319038 service nova] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Received event network-vif-deleted-653604b7-8213-4fd3-a733-26a32725aae2 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:43:32 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=9075MB free_disk=26.531360626220703GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:43:39 user nova-compute[71628]: INFO nova.compute.claims [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Claim successful on node user Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Start building networks asynchronously for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:43:39 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:43:39 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Creating image(s) Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "/opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "/opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "/opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:39 user nova-compute[71628]: DEBUG nova.policy [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d29ba758b794e849b8cb94bc76c0247', 
'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e5c56c675ef40b8b6eab0d00b46014b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.part --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.part --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG nova.virt.images [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] 223d77c4-f5da-4195-8b24-d8276adb1d0d was qcow2, converting to raw {{(pid=71628) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.part /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.converted {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Successfully created port: 811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.part /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.converted" returned: 0 in 0.307s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.converted --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f.converted --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.971s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f,backing_fmt=raw /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f,backing_fmt=raw /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk 1073741824" returned: 0 in 0.047s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "46f0412d1009acd3f8cfb915d4e001c018e3f05f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Successfully updated port: 811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquired lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.compute.manager [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-changed-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.compute.manager [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Refreshing instance network info cache due to event network-changed-811d697c-53f4-4f43-9a86-000b1ae7fdba. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] Acquiring lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/46f0412d1009acd3f8cfb915d4e001c018e3f05f --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Cannot resize image /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.objects.instance [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'migration_context' on Instance uuid f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Ensure instance console log exists: /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updating instance_info_cache with network_info: [{"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Releasing lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Instance network_info: |[{"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] Acquired lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Refreshing network info cache for port 811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Start _get_guest_xml network_info=[{"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:43:37Z,direct_url=,disk_format='qcow2',id=223d77c4-f5da-4195-8b24-d8276adb1d0d,min_disk=0,min_ram=0,name='tempest-scenario-img--482898099',owner='3e5c56c675ef40b8b6eab0d00b46014b',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:43:38Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '223d77c4-f5da-4195-8b24-d8276adb1d0d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:43:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:43:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:43:37Z,direct_url=,disk_format='qcow2',id=223d77c4-f5da-4195-8b24-d8276adb1d0d,min_disk=0,min_ram=0,name='tempest-scenario-img--482898099',owner='3e5c56c675ef40b8b6eab0d00b46014b',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:43:38Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1095186947',display_name='tempest-TestMinimumBasicScenario-server-1095186947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1095186947',id=19,image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDnzFKSp1ZsarDI2o4gPxfblQcH4Owj82sVfODprSc1K69tFajrNPplrLI8Ghc3K95vu8FaOu8iqSCcvEi1hjzNJhR2sWScBqRXgRvEliVgr7HrUvAfkkXjRzK0qtyqnyA==',key_name='tempest-TestMinimumBasicScenario-1415594726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-e6gmc4k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:43:40Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": 
"811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.objects.instance [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'pci_devices' on Instance uuid f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] End _get_guest_xml xml= Apr 17 17:43:41 user nova-compute[71628]: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb Apr 17 17:43:41 user nova-compute[71628]: instance-00000013 Apr 17 17:43:41 user nova-compute[71628]: 131072 Apr 17 17:43:41 user nova-compute[71628]: 1 Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: tempest-TestMinimumBasicScenario-server-1095186947 Apr 17 17:43:41 user nova-compute[71628]: 2023-04-17 17:43:41 Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: 128 Apr 17 17:43:41 user nova-compute[71628]: 1 Apr 17 17:43:41 user nova-compute[71628]: 0 Apr 17 17:43:41 user nova-compute[71628]: 0 Apr 17 17:43:41 user nova-compute[71628]: 1 Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: tempest-TestMinimumBasicScenario-145353383-project-member Apr 17 17:43:41 user nova-compute[71628]: tempest-TestMinimumBasicScenario-145353383 Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: OpenStack Foundation Apr 17 17:43:41 user nova-compute[71628]: OpenStack Nova Apr 17 17:43:41 user nova-compute[71628]: 0.0.0 Apr 17 17:43:41 user nova-compute[71628]: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb Apr 17 17:43:41 user nova-compute[71628]: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb Apr 17 17:43:41 user nova-compute[71628]: Virtual Machine Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 
17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: hvm Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Nehalem Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: /dev/urandom Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: Apr 17 17:43:41 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1095186947',display_name='tempest-TestMinimumBasicScenario-server-1095186947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1095186947',id=19,image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDnzFKSp1ZsarDI2o4gPxfblQcH4Owj82sVfODprSc1K69tFajrNPplrLI8Ghc3K95vu8FaOu8iqSCcvEi1hjzNJhR2sWScBqRXgRvEliVgr7HrUvAfkkXjRzK0qtyqnyA==',key_name='tempest-TestMinimumBasicScenario-1415594726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-e6gmc4k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:43:40Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": 
"811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG os_vif [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap811d697c-53, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap811d697c-53, col_values=(('external_ids', {'iface-id': '811d697c-53f4-4f43-9a86-000b1ae7fdba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:d0:b4', 'vm-uuid': 'f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:41 user nova-compute[71628]: INFO os_vif [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] No VIF found with MAC fa:16:3e:01:d0:b4, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updated VIF entry in instance network info cache for port 811d697c-53f4-4f43-9a86-000b1ae7fdba. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG nova.network.neutron [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updating instance_info_cache with network_info: [{"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:43:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-f36ba876-7f8f-4da8-9cdb-ccdd38aa95d5 req-0600a4ef-3f6c-43b5-bb83-2f94814664b6 service nova] Releasing lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:43:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG nova.compute.manager [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG nova.compute.manager [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] No waiting events found dispatching network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:43 user nova-compute[71628]: WARNING nova.compute.manager [req-761e0202-9de6-4020-a257-a99ae3dea527 req-df519045-e0b9-4d8d-aea1-e9578af2b1fe service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received unexpected event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba for instance with vm_state building and task_state spawning. Apr 17 17:43:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:43:44 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] VM Resumed (Lifecycle Event) Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:43:44 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Instance spawned successfully. 
Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 
f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:43:44 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:43:44 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:43:44 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] VM Started (Lifecycle Event) Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:43:45 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:43:45 user nova-compute[71628]: INFO nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Took 5.40 seconds to spawn the instance on the hypervisor. Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:43:45 user nova-compute[71628]: INFO nova.compute.manager [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Took 5.92 seconds to build instance. 
Apr 17 17:43:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-86be0b74-4de6-48fc-b1da-167ad2f360b8 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.008s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] No waiting events found dispatching network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:43:45 user nova-compute[71628]: WARNING nova.compute.manager [req-3cc42a2b-ca53-4bd3-825a-2cfe2c66c055 req-136e58b4-598b-445b-ab60-0624b4c45726 service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received unexpected event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba for instance with vm_state active and task_state None. 
Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:43:45 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] VM Stopped (Lifecycle Event) Apr 17 17:43:45 user nova-compute[71628]: DEBUG nova.compute.manager [None req-c3017b9a-a515-4379-8e52-a80e414c30f2 None None] [instance: 47d30b1a-fc08-4cad-8a2e-003b43251518] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:43:46 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:51 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:56 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:43:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:01 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 
tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:11 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:11 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:44:12 user nova-compute[71628]: INFO nova.compute.claims [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Claim successful on node user Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:44:12 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.policy [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d22aee4776b4ae89ca19af5ce976d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c9cdf67684764421af28a1cd43efcf0b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:44:12 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Creating image(s) Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 
tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk 1073741824" returned: 0 in 0.052s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.205s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.178s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Checking if we can resize image /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:44:12 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Successfully created port: 558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Cannot resize image /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.objects.instance [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'migration_context' on Instance uuid 34582c99-56bf-44e5-adca-a9883318afa0 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:44:13 user nova-compute[71628]: INFO nova.compute.claims [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Claim successful on node user Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Ensure instance console log exists: /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Allocating IP information in the background. 
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:44:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.policy [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8d22aee4776b4ae89ca19af5ce976d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c9cdf67684764421af28a1cd43efcf0b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:44:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Creating image(s) Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk 1073741824" returned: 0 in 0.050s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:13 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Successfully updated port: 558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquired lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-changed-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Refreshing instance network info cache due to event network-changed-558be61b-7179-45ab-9796-160aa6bb3e86. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] Acquiring lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Checking if we can resize image /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Successfully created port: b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Cannot resize image /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'migration_context' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Ensure instance console log exists: /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updating instance_info_cache with network_info: [{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Releasing lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Instance network_info: |[{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] Acquired lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Refreshing network info cache for port 558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Start _get_guest_xml network_info=[{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:44:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 
tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:14 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Chose 
sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2116812969',display_name='tempest-ServerRescueNegativeTestJSON-server-2116812969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2116812969',id=20,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-gp7vg9vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:12Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=34582c99-56bf-44e5-adca-a9883318afa0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address":
"fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'pci_devices' on Instance uuid 34582c99-56bf-44e5-adca-a9883318afa0 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] End _get_guest_xml xml=
[libvirt domain XML elided: nova-compute printed the full guest XML here, but its markup was stripped when this log was captured, leaving only bare text values. Recoverable fields: instance uuid 34582c99-56bf-44e5-adca-a9883318afa0, libvirt name instance-00000014, memory 131072 KiB, 1 vCPU, display name tempest-ServerRescueNegativeTestJSON-server-2116812969, creation time 2023-04-17 17:44:14, flavor m1.nano (128 MB RAM, 1 GB root disk, 0 swap, 0 ephemeral, 1 vCPU), owner tempest-ServerRescueNegativeTestJSON-848169867-project-member / project tempest-ServerRescueNegativeTestJSON-848169867, sysinfo entries "OpenStack Foundation", "OpenStack Nova", 0.0.0, "Virtual Machine", OS type hvm, CPU model Nehalem, RNG backend /dev/urandom.]
{{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2116812969',display_name='tempest-ServerRescueNegativeTestJSON-server-2116812969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2116812969',id=20,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-gp7vg9vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:12Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=34582c99-56bf-44e5-adca-a9883318afa0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": 
"fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG os_vif [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap558be61b-71, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap558be61b-71, col_values=(('external_ids', {'iface-id': '558be61b-7179-45ab-9796-160aa6bb3e86', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:fa:96', 'vm-uuid': '34582c99-56bf-44e5-adca-a9883318afa0'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:14 user nova-compute[71628]: INFO os_vif [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No VIF found with MAC fa:16:3e:08:fa:96, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updated VIF entry in instance network info cache for port 558be61b-7179-45ab-9796-160aa6bb3e86. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updating instance_info_cache with network_info: [{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-470a3482-3213-487a-8c1f-5cc065c36d28 req-f12ed5d4-353e-4b15-8a0a-a4828118e207 service nova] Releasing lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Successfully updated port: b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquired lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] 
[instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-changed-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Refreshing instance network info cache due to event network-changed-b064deb8-b9d4-483a-9f33-beb3dbfd48af. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] Acquiring lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Releasing lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance network_info: |[{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": 
{"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] Acquired lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Refreshing network info cache for port b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start _get_guest_xml network_info=[{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:44:15 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:15 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 
tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2025827501',display_name='tempest-ServerRescueNegativeTestJSON-server-2025827501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2025827501',id=21,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-l5cux0tk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:14Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=335e8c98-e4f3-4486-8f21-b24096d97d71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": 
"fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.objects.instance [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'pci_devices' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] End _get_guest_xml xml= Apr 17 17:44:15 user nova-compute[71628]: 335e8c98-e4f3-4486-8f21-b24096d97d71 Apr 17 17:44:15 user nova-compute[71628]: instance-00000015 Apr 17 17:44:15 user nova-compute[71628]: 131072 Apr 17 17:44:15 user nova-compute[71628]: 1 Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: tempest-ServerRescueNegativeTestJSON-server-2025827501 Apr 17 17:44:15 user nova-compute[71628]: 2023-04-17 17:44:15 Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: 128 Apr 17 17:44:15 user nova-compute[71628]: 1 Apr 17 17:44:15 user nova-compute[71628]: 0 Apr 17 17:44:15 user nova-compute[71628]: 0 Apr 17 17:44:15 user nova-compute[71628]: 1 Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: tempest-ServerRescueNegativeTestJSON-848169867-project-member Apr 17 17:44:15 user nova-compute[71628]: tempest-ServerRescueNegativeTestJSON-848169867 Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user 
nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: OpenStack Foundation Apr 17 17:44:15 user nova-compute[71628]: OpenStack Nova Apr 17 17:44:15 user nova-compute[71628]: 0.0.0 Apr 17 17:44:15 user nova-compute[71628]: 335e8c98-e4f3-4486-8f21-b24096d97d71 Apr 17 17:44:15 user nova-compute[71628]: 335e8c98-e4f3-4486-8f21-b24096d97d71 Apr 17 17:44:15 user nova-compute[71628]: Virtual Machine Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: hvm Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Nehalem Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: /dev/urandom Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: Apr 17 17:44:15 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2025827501',display_name='tempest-ServerRescueNegativeTestJSON-server-2025827501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2025827501',id=21,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-l5cux0tk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:14Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=335e8c98-e4f3-4486-8f21-b24096d97d71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": 
"fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG os_vif [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb064deb8-b9, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb064deb8-b9, col_values=(('external_ids', {'iface-id': 'b064deb8-b9d4-483a-9f33-beb3dbfd48af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:67:b8', 'vm-uuid': '335e8c98-e4f3-4486-8f21-b24096d97d71'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:15 user nova-compute[71628]: INFO os_vif [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No VIF found with MAC fa:16:3e:1e:67:b8, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updated VIF entry in instance network info cache for port b064deb8-b9d4-483a-9f33-beb3dbfd48af. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG nova.network.neutron [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2336f2e8-1e64-40f1-b162-3b813dff9792 req-9020ad4b-d518-43c8-a14d-200f640ee298 service nova] Releasing lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:15 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] No waiting events found dispatching network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:16 user nova-compute[71628]: WARNING nova.compute.manager [req-aad88bfa-10d7-4f9e-ac26-bfc51696c4bc req-df3c4ed7-b319-4d06-b89b-39fd774a6b7e service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received unexpected event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 for instance with vm_state building and task_state spawning. Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:16 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:17 user nova-compute[71628]: WARNING nova.compute.manager [req-158e6a3d-9996-49bc-9875-e8433556766d req-e72235f1-8528-4085-91b7-603b3c0e7623 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state building and task_state spawning. 
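[The plug sequence logged above — a Nova VIF dict converted by nova.network.os_vif_util into a VIFOpenVSwitch object and then handed to os_vif.plug(), which performs the OVSDB work — maps onto the public os-vif API. A minimal sketch, assuming os-vif and its ovs plugin are installed and a local Open vSwitch is reachable; field values are copied from the log entries above, and the exact set of fields the plugin requires may differ:

    import os_vif
    from os_vif.objects.instance_info import InstanceInfo
    from os_vif.objects.network import Network
    from os_vif.objects.vif import VIFOpenVSwitch, VIFPortProfileOpenVSwitch

    # Load the os-vif plugins (linux_bridge, noop, ovs), as nova-compute does at startup.
    os_vif.initialize()

    # Values taken from the "Converted object VIFOpenVSwitch(...)" entry above.
    vif = VIFOpenVSwitch(
        id='b064deb8-b9d4-483a-9f33-beb3dbfd48af',
        address='fa:16:3e:1e:67:b8',
        vif_name='tapb064deb8-b9',
        bridge_name='br-int',
        has_traffic_filtering=True,
        preserve_on_delete=False,
        network=Network(id='fd8c8bf4-7a16-4afe-b04d-99b82336f56d', bridge='br-int'),
        port_profile=VIFPortProfileOpenVSwitch(
            interface_id='b064deb8-b9d4-483a-9f33-beb3dbfd48af'),
    )
    instance = InstanceInfo(uuid='335e8c98-e4f3-4486-8f21-b24096d97d71',
                            name='instance-00000015')

    # Equivalent of the "Plugging vif ..." / "Successfully plugged vif ..." entries:
    # the ovs plugin adds the tap port to br-int and sets its external_ids.
    os_vif.plug(vif, instance)
]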
Apr 17 17:44:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:18 user nova-compute[71628]: DEBUG nova.compute.manager [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] No waiting events found dispatching network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:18 user nova-compute[71628]: WARNING nova.compute.manager [req-57646c56-23f0-47cf-a4c6-4199ba9a22d1 req-175fa393-f1e3-4cda-9b30-9bbbe37c1065 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received unexpected event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 for instance with vm_state building and task_state spawning. 
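[The OVSDB commands that os-vif issues for each plug above (AddBridgeCommand, AddPortCommand and the DbSetCommand on the Interface record) go through ovsdbapp. A sketch of the same three commands using ovsdbapp's Open_vSwitch schema API; the socket path is an assumption, and the port name, MAC and UUIDs are those from the tapb064deb8-b9 transactions logged above:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/var/run/openvswitch/db.sock'  # assumed ovsdb-server endpoint

    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    # Mirrors "Running txn n=1 command(idx=0): AddBridgeCommand(... may_exist=True ...)",
    # which is a no-op ("Transaction caused no change") when br-int already exists.
    api.add_br('br-int', may_exist=True, datapath_type='system').execute(check_error=True)

    # Mirrors the AddPortCommand + DbSetCommand transaction on the Interface table.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_port('br-int', 'tapb064deb8-b9', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tapb064deb8-b9',
            ('external_ids', {'iface-id': 'b064deb8-b9d4-483a-9f33-beb3dbfd48af',
                              'iface-status': 'active',
                              'attached-mac': 'fa:16:3e:1e:67:b8',
                              'vm-uuid': '335e8c98-e4f3-4486-8f21-b24096d97d71'})))
]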
Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] VM Resumed (Lifecycle Event) Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Instance spawned successfully. Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None 
req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] VM Started (Lifecycle Event) Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] During sync_power_state the instance has a pending task (spawning). Skip. 
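[The "Synchronizing instance power state after lifecycle event" and "During sync_power_state the instance has a pending task (spawning). Skip." entries above reflect a guard: while a task owns the instance, lifecycle events are not allowed to rewrite the stored power state. An illustrative sketch of that guard, not Nova's actual code:

    # libvirt-style power states: 0 = NOSTATE, 1 = RUNNING, 4 = SHUTDOWN
    RUNNING = 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Return the power state to record, or None to skip the sync."""
        if task_state is not None:
            # A task (here: 'spawning') is in flight; let it finish rather than
            # racing it with data derived from a lifecycle event.
            return None
        if db_power_state != vm_power_state:
            return vm_power_state
        return None

    # The entries above: DB power_state 0, VM power_state 1, task_state 'spawning' -> skipped.
    assert sync_power_state(0, RUNNING, 'spawning') is None
]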
Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Took 6.83 seconds to spawn the instance on the hypervisor. Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Took 7.37 seconds to build instance. Apr 17 17:44:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-5b385e2d-088e-484b-9b89-991f8253d500 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.470s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] VM Resumed (Lifecycle Event) Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance spawned successfully. 
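The Acquiring / acquired / "released" lines with their waited/held timings (for example the per-instance lock held 7.470s around _locked_do_build_and_run_instance above) come from oslo.concurrency. A minimal sketch of that pattern using the decorator form; the lock name is the instance UUID from this log, and the function body is a placeholder rather than the real build path:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("34582c99-56bf-44e5-adca-a9883318afa0")
    def _locked_do_build_and_run_instance():
        # Work done here runs with the per-instance lock held; lockutils emits
        # the DEBUG "acquired ... waited" and '"released" ... held' lines that
        # appear throughout this log when the wrapper enters and exits.
        pass

    _locked_do_build_and_run_instance()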
Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 
tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] VM Started (Lifecycle Event) Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Took 6.16 seconds to spawn the instance on the hypervisor. 
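The per-instance timing entries ("Took 6.83 seconds to spawn the instance on the hypervisor.", "Took 7.37 seconds to build instance.", and the 6.16s/6.70s pair above) are easy to pull out of a capture like this one. An illustrative helper, not part of Nova; the nova-compute.log filename is an assumption:

    import re

    TIMING = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took "
                        r"(?P<secs>[\d.]+) seconds to "
                        r"(?P<what>spawn the instance on the hypervisor|build instance)")

    def spawn_timings(path="nova-compute.log"):
        # Yield (instance uuid, phase, seconds) for every timing line in the log.
        with open(path) as fh:
            for line in fh:
                m = TIMING.search(line)
                if m:
                    yield m.group("uuid"), m.group("what"), float(m.group("secs"))

    for uuid, phase, secs in spawn_timings():
        print(f"{uuid}: {secs:.2f}s to {phase}")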
Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:19 user nova-compute[71628]: WARNING nova.compute.manager [req-8b82845f-38e9-4fe5-946a-72278cb9770c req-4a2d3f5f-c06f-44b2-b34e-1c1a57a44c90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state building and task_state spawning. Apr 17 17:44:19 user nova-compute[71628]: INFO nova.compute.manager [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Took 6.70 seconds to build instance. 
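The network-vif-plugged handling above (pop_instance_event takes the per-instance "-events" lock, finds no waiting events, and logs the WARNING about an unexpected event) reflects a benign race: Neutron's notification arrived after the spawn path had already stopped waiting. A minimal sketch of that waiter/dispatch shape, assuming plain threading primitives rather than Nova's InstanceEvents class; all names below are illustrative:

    import threading

    _waiters = {}                     # (instance_uuid, event_name) -> threading.Event
    _waiters_lock = threading.Lock()  # stands in for the per-instance "-events" lock

    def prepare_for_event(instance_uuid, event_name):
        ev = threading.Event()
        with _waiters_lock:
            _waiters[(instance_uuid, event_name)] = ev
        return ev  # the spawn path blocks on ev.wait(timeout=...)

    def dispatch_external_event(instance_uuid, event_name):
        with _waiters_lock:
            ev = _waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # Nothing is waiting any more: the same situation the WARNING
            # "Received unexpected event network-vif-plugged-..." records above.
            print("Received unexpected event %s for instance %s"
                  % (event_name, instance_uuid))
            return
        ev.set()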
Apr 17 17:44:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0f22271a-ea12-464e-a2f9-6d39907b0d47 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.810s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:20 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:29 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 17:44:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] There are 0 instances to clean {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 17:44:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:32 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:34 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:34 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:44:35 user nova-compute[71628]: INFO nova.compute.claims [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Claim successful on node user Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:35 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:35 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8703MB free_disk=26.420635223388672GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", 
"product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.602s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.188s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:44:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 18c31df4-f883-4d7b-9ed1-1b99e77eb631 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.policy [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d4aee23bae743f19bdf6f991e044587', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cd3bfc1c102a47ff9efab5cb9a78021e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:44:35 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Creating image(s) Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "/opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "/opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "/opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:44:35 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.360s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.146s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:36 user nova-compute[71628]: 
DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk 1073741824" returned: 0 in 0.054s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Checking if we can resize image /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk. 
size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Cannot resize image /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.objects.instance [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'migration_context' on Instance uuid 18c31df4-f883-4d7b-9ed1-1b99e77eb631 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Ensure instance console log exists: /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 
0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:36 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Successfully created port: 112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Successfully updated port: 112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquired lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-changed-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG nova.compute.manager [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Refreshing instance network info cache due to event network-changed-112d73e4-daf7-4ba3-b282-34913fbb70b1. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] Acquiring lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:37 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Skipping network cache update for instance because it is Building. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updating instance_info_cache with network_info: [{"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Releasing lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Instance network_info: |[{"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": 
"tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] Acquired lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Refreshing network info cache for port 112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Start _get_guest_xml network_info=[{"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:44:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 
tempest-AttachVolumeTestJSON-2102743292-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-840531982',display_name='tempest-AttachVolumeTestJSON-server-840531982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-840531982',id=22,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE9znVyh2XvVgB/lD9Kekstd6of4EWCi4jOx1ZRC3TdN6cXmpezDHHLnBhMDPrdXw4ZtdwembcGb/X4HTEO4jnGIfp0e3mZJuw1pVWzt7I7Vt1a1IH8RGm/LXG+ZjoqIJg==',key_name='tempest-keypair-1415434273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-xmyeswa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=18c31df4-f883-4d7b-9ed1-1b99e77eb631,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 
17:44:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.objects.instance [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'pci_devices' on Instance uuid 18c31df4-f883-4d7b-9ed1-1b99e77eb631 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] End _get_guest_xml xml=
Apr 17 17:44:38 user nova-compute[71628]: [guest domain XML elided: the XML markup was stripped when this log was extracted, leaving only text nodes across the timestamped continuation lines; the surviving values include uuid 18c31df4-f883-4d7b-9ed1-1b99e77eb631, domain name instance-00000016, memory 131072 KiB (128 MiB), 1 vCPU, Nova metadata for tempest-AttachVolumeTestJSON-server-840531982 created 2023-04-17 17:44:38 by tempest-AttachVolumeTestJSON-2102743292-project-member / tempest-AttachVolumeTestJSON-2102743292, sysinfo "OpenStack Foundation" / "OpenStack Nova" / 0.0.0 / "Virtual Machine", machine type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-840531982',display_name='tempest-AttachVolumeTestJSON-server-840531982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-840531982',id=22,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE9znVyh2XvVgB/lD9Kekstd6of4EWCi4jOx1ZRC3TdN6cXmpezDHHLnBhMDPrdXw4ZtdwembcGb/X4HTEO4jnGIfp0e3mZJuw1pVWzt7I7Vt1a1IH8RGm/LXG+ZjoqIJg==',key_name='tempest-keypair-1415434273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-xmyeswa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=18c31df4-f883-4d7b-9ed1-1b99e77eb631,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:44:38 
user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG os_vif [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tap112d73e4-da, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap112d73e4-da, col_values=(('external_ids', {'iface-id': '112d73e4-daf7-4ba3-b282-34913fbb70b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:9c:9e', 'vm-uuid': '18c31df4-f883-4d7b-9ed1-1b99e77eb631'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:38 user nova-compute[71628]: INFO os_vif [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] No VIF found with MAC fa:16:3e:e8:9c:9e, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updating instance_info_cache with network_info: [{"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances with incomplete migration {{(pid=71628) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updated VIF entry in instance network info cache for port 112d73e4-daf7-4ba3-b282-34913fbb70b1. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG nova.network.neutron [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updating instance_info_cache with network_info: [{"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:44:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1fdf4260-94b0-4bcc-842e-8a2b49dadb6f req-d835584a-18b3-4a79-a019-d80e3f184dd7 service nova] Releasing lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG nova.compute.manager [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:39 user nova-compute[71628]: DEBUG nova.compute.manager [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] No waiting events found dispatching network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:39 user nova-compute[71628]: WARNING nova.compute.manager [req-3dbf258c-8d29-4123-a1ef-8d1c99e472f1 req-aa4666b7-deb9-4823-a631-765752c601d2 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received unexpected event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 for instance with vm_state building and task_state spawning. 
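The AddBridgeCommand/AddPortCommand/DbSetCommand entries logged above during VIF plugging are ovsdbapp transactions against the local OVSDB at tcp:127.0.0.1:6640. A rough sketch of that transaction shape follows, assuming ovsdbapp's Open_vSwitch schema API; it is illustrative only and not os-vif's actual code path.

    # Rough sketch, assuming ovsdbapp's Open_vSwitch schema API; mirrors the
    # AddBridgeCommand / AddPortCommand / DbSetCommand transaction in the log.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True))
        txn.add(api.add_port('br-int', 'tap112d73e4-da', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap112d73e4-da',
            ('external_ids', {
                'iface-id': '112d73e4-daf7-4ba3-b282-34913fbb70b1',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:e8:9c:9e',
                'vm-uuid': '18c31df4-f883-4d7b-9ed1-1b99e77eb631'})))

A transaction whose commands change nothing is reported as "Transaction caused no change", as seen above for the already-existing br-int bridge.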
Apr 17 17:44:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] VM Resumed (Lifecycle Event) Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Instance spawned successfully. 
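The spawn that just completed began with a bounded disk inspection: the "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ... --force-share --output=json" command logged at 17:44:36. A minimal sketch of issuing that kind of call through oslo.concurrency; the disk path is taken from the log, the rest is illustrative.

    # Minimal sketch, assuming oslo.concurrency is installed; reproduces the
    # prlimit-bounded "qemu-img info" invocation pattern seen in this log.
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1073741824,  # --as
                                        cpu_time=30)               # --cpu
    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk',
        '--force-share', '--output=json',
        prlimit=limits)
    print(out)  # JSON description of the image

With prlimit set, processutils re-executes the command under the oslo_concurrency.prlimit wrapper, which is why that wrapper appears verbatim in the logged command lines.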
Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Found default for 
hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] VM Started (Lifecycle Event) Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Took 5.97 seconds to spawn the instance on the hypervisor. Apr 17 17:44:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:44:41 user nova-compute[71628]: INFO nova.compute.manager [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Took 6.90 seconds to build instance. 
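The "Synchronizing instance power state" entries above compare the database value (0) with what the hypervisor reports (1). For reference, a hedged sketch of reading that state directly from libvirt; the connection URI is an assumption for a local qemu/KVM host like this one, and the comparison is illustrative rather than Nova's sync_power_state implementation.

    # Illustrative only; not Nova's _get_power_state/sync_power_state code.
    import libvirt

    conn = libvirt.open('qemu:///system')   # assumed local qemu/KVM URI
    dom = conn.lookupByUUIDString('18c31df4-f883-4d7b-9ed1-1b99e77eb631')
    state, reason = dom.state()
    # libvirt.VIR_DOMAIN_RUNNING == 1, matching "VM power_state: 1" above;
    # a DB power_state of 0 simply means no state had been recorded yet.
    print(state == libvirt.VIR_DOMAIN_RUNNING)
    conn.close()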
Apr 17 17:44:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-0e7090ad-4a98-47ef-8684-11a9f9da1433 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.044s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:44:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:44:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:44:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:44:42 user nova-compute[71628]: DEBUG nova.compute.manager [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] No waiting events found dispatching network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:44:42 user nova-compute[71628]: WARNING nova.compute.manager [req-781aceb3-d280-48a2-adb3-366fa6c51083 req-77653e83-d5a4-474e-b09a-6da4e338539a service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received unexpected event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 for instance with vm_state active and task_state None. 
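The Acquiring/acquired/released triplets around the "18c31df4-...-events" lock in the entries above come from oslo.concurrency's locking helpers guarding the per-instance event list; the waited/held timings are emitted by the same wrapper. A minimal sketch of that pattern, with an illustrative function name rather than Nova's _pop_event:

    # Minimal sketch of the oslo.concurrency per-instance lock pattern above;
    # pop_pending_event is an illustrative name, not Nova's implementation.
    from oslo_concurrency import lockutils

    EVENTS_LOCK = '18c31df4-f883-4d7b-9ed1-1b99e77eb631-events'

    @lockutils.synchronized(EVENTS_LOCK)
    def pop_pending_event(events, name):
        # critical section: remove and return a waiting event, if any
        return events.pop(name, None)

    # equivalent context-manager form
    with lockutils.lock(EVENTS_LOCK):
        pass  # inspect or mutate the per-instance event map here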
Apr 17 17:44:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:44:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:44:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:44:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:45:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:45:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:45:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:31 user nova-compute[71628]: INFO nova.compute.manager [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Terminating instance Apr 17 17:45:31 user nova-compute[71628]: DEBUG nova.compute.manager [None 
req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-unplugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] No waiting events found dispatching network-vif-unplugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:45:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-3cdcf6cf-2da8-4d7f-8939-8940f01ac85a req-ac2e3849-3417-4748-a9bc-a9b02aa453cc service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-unplugged-811d697c-53f4-4f43-9a86-000b1ae7fdba for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:32 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Instance destroyed successfully. Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.objects.instance [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'resources' on Instance uuid f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1095186947',display_name='tempest-TestMinimumBasicScenario-server-1095186947',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1095186947',id=19,image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDnzFKSp1ZsarDI2o4gPxfblQcH4Owj82sVfODprSc1K69tFajrNPplrLI8Ghc3K95vu8FaOu8iqSCcvEi1hjzNJhR2sWScBqRXgRvEliVgr7HrUvAfkkXjRzK0qtyqnyA==',key_name='tempest-TestMinimumBasicScenario-1415594726',keypairs=,launch_index=0,launched_at=2023-04-17T17:43:45Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-e6gmc4k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='223d77c4-f5da-4195-8b24-d8276adb1d0d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:43:45Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "address": "fa:16:3e:01:d0:b4", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap811d697c-53", "ovs_interfaceid": "811d697c-53f4-4f43-9a86-000b1ae7fdba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG os_vif [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
Apr 17 17:45:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap811d697c-53, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:45:32 user nova-compute[71628]: INFO os_vif [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:d0:b4,bridge_name='br-int',has_traffic_filtering=True,id=811d697c-53f4-4f43-9a86-000b1ae7fdba,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap811d697c-53') Apr 17 17:45:32 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Deleting instance files /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb_del Apr 17 17:45:32 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Deletion of /opt/stack/data/nova/instances/f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb_del complete Apr 17 17:45:32 user nova-compute[71628]: INFO nova.compute.manager [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 17 17:45:32 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:45:32 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Took 0.43 seconds to deallocate network for instance. Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-7ec6ef72-ac7a-469a-adcf-301b2ddb82e9 req-7c9d13c2-0d81-4cb3-b710-5a5190b0292a service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-deleted-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:45:32 user nova-compute[71628]: INFO nova.compute.manager [req-7ec6ef72-ac7a-469a-adcf-301b2ddb82e9 req-7c9d13c2-0d81-4cb3-b710-5a5190b0292a service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Neutron deleted interface 811d697c-53f4-4f43-9a86-000b1ae7fdba; detaching it from the instance and deleting it from the info cache Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.network.neutron [req-7ec6ef72-ac7a-469a-adcf-301b2ddb82e9 req-7c9d13c2-0d81-4cb3-b710-5a5190b0292a service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.compute.manager [req-7ec6ef72-ac7a-469a-adcf-301b2ddb82e9 req-7c9d13c2-0d81-4cb3-b710-5a5190b0292a service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Detach interface failed, port_id=811d697c-53f4-4f43-9a86-000b1ae7fdba, reason: Instance f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:45:32 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.212s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:32 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Deleted allocations for instance f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb Apr 17 17:45:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-87201653-8f3a-4167-afb4-2e626ed1f47f tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.501s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] Acquiring lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] Lock "f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:33 user nova-compute[71628]: DEBUG nova.compute.manager [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] No waiting events found dispatching network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:45:33 user nova-compute[71628]: WARNING nova.compute.manager [req-bfb5e101-9696-4453-8363-c0a1689b52df req-e4d830bd-499c-48f9-81cd-999488307d2f service nova] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Received unexpected event network-vif-plugged-811d697c-53f4-4f43-9a86-000b1ae7fdba for instance with vm_state deleted and task_state None. Apr 17 17:45:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:35 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:36 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:37 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:45:37 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8820MB free_disk=26.410785675048828GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": 
"0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": 
"label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 18c31df4-f883-4d7b-9ed1-1b99e77eb631 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing inventories for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating ProviderTree inventory for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing aggregate associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, aggregates: None {{(pid=71628) 
_refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing trait associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, traits: COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:45:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.507s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:45:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:38 user 
nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updating instance_info_cache with network_info: [{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock 
"refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:45:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:45:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:47 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:45:47 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] VM Stopped (Lifecycle Event) Apr 17 17:45:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b588fc1b-4aab-4633-85b3-b4259482456c None None] [instance: f0ab6b84-ef40-4706-a4f1-c8a9d3eb23eb] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:45:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:45:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:03 user nova-compute[71628]: INFO nova.compute.manager [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Rescuing Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquired lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG nova.network.neutron [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG nova.network.neutron [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Releasing lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-bdbdc80e-a603-482e-b2c0-03611d9069ac req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bdbdc80e-a603-482e-b2c0-03611d9069ac 
req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bdbdc80e-a603-482e-b2c0-03611d9069ac req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-bdbdc80e-a603-482e-b2c0-03611d9069ac req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG nova.compute.manager [req-bdbdc80e-a603-482e-b2c0-03611d9069ac req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:03 user nova-compute[71628]: WARNING nova.compute.manager [req-bdbdc80e-a603-482e-b2c0-03611d9069ac req-0f1732a7-188f-4447-b20f-ab3570d23138 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state active and task_state rescuing. Apr 17 17:46:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:04 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance destroyed successfully. 
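[Editor's note] The Acquiring/acquired/released triplets around pop_instance_event above come from oslo.concurrency's named locks: per-instance event bookkeeping is serialized on a lock named "<instance uuid>-events", and an event that arrives with no registered waiter is simply logged as "No waiting events found" plus the "Received unexpected event" warning (here, apparently, nothing in the rescue path was waiting for network-vif-unplugged). A minimal sketch of that pattern, using the real oslo_concurrency.lockutils API but hypothetical helper names (PendingEvents, wait_for_event, pop_event), not Nova's actual implementation:

```python
# Sketch only: per-instance event bookkeeping guarded by a named lock,
# mirroring the "<uuid>-events" Acquiring/acquired/released lines above.
import threading
from oslo_concurrency import lockutils


class PendingEvents:
    def __init__(self):
        # instance uuid -> {event name: threading.Event waiter}
        self._events = {}

    def wait_for_event(self, instance_uuid, name, timeout=300):
        """Register a waiter, then block until the event is popped."""
        with lockutils.lock(f"{instance_uuid}-events"):
            waiter = self._events.setdefault(instance_uuid, {}).setdefault(
                name, threading.Event())
        return waiter.wait(timeout)

    def pop_event(self, instance_uuid, name):
        """Called when Neutron reports e.g. network-vif-unplugged-<port>."""
        with lockutils.lock(f"{instance_uuid}-events"):
            waiter = self._events.get(instance_uuid, {}).pop(name, None)
        if waiter is None:
            # Corresponds to "No waiting events found dispatching ..." and
            # the "Received unexpected event ..." warning in the log.
            print(f"unexpected event {name} for {instance_uuid}")
            return False
        waiter.set()
        return True
```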
Apr 17 17:46:04 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Attempting rescue Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71628) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance directory exists: not creating {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 17 17:46:04 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Creating image(s) Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "/opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'trusted_certs' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 
17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.147s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue" returned: 0 in 0.064s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock 
"d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.219s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'migration_context' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start _get_guest_xml network_info=[{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "vif_mac": "fa:16:3e:1e:67:b8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue={'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] 
Lazy-loading 'resources' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'numa_topology' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:04 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:46:04 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 
tempest-ServerRescueNegativeTestJSON-848169867-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'vcpu_model' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2025827501',display_name='tempest-ServerRescueNegativeTestJSON-server-2025827501',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2025827501',id=21,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:44:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-l5cux0tk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:44:20Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=335e8c98-e4f3-4486-8f21-b24096d97d71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "vif_mac": "fa:16:3e:1e:67:b8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None 
req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "vif_mac": "fa:16:3e:1e:67:b8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.objects.instance [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'pci_devices' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] End _get_guest_xml xml= [libvirt guest domain XML omitted here: the markup was stripped in this capture, leaving only bare element values; recoverable fields: uuid 335e8c98-e4f3-4486-8f21-b24096d97d71, name instance-00000015, memory 131072 KiB, 1 vCPU, nova metadata name tempest-ServerRescueNegativeTestJSON-server-2025827501 created 2023-04-17 17:46:04, flavor 128 MB / 1 GB root / 0 swap / 0 ephemeral / 1 vCPU, owner tempest-ServerRescueNegativeTestJSON-848169867 (user tempest-ServerRescueNegativeTestJSON-848169867-project-member), sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:46:04 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance destroyed successfully.
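[Editor's note] The rescue disk created above is just a qcow2 overlay on the cached base image: nova first probes the base with qemu-img info wrapped in a prlimit sandbox, then runs qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw. A rough replay of those two logged commands through oslo.concurrency's processutils; the paths are the ones from the log, and the ProcessLimits keywords mirror the --as/--cpu flags shown above but should be treated as an assumed usage sketch rather than Nova's code:

```python
# Sketch: replay the two qemu-img commands logged above with
# oslo_concurrency.processutils. Paths are taken from the log itself.
import json
from oslo_concurrency import processutils

BASE = ('/opt/stack/data/nova/instances/_base/'
        'd707d9baa21ef9cbafe179e13cf40c2bff580062')
RESCUE = ('/opt/stack/data/nova/instances/'
          '335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue')

# qemu-img info, capped at 1 GiB address space / 30 s CPU like the
# "prlimit --as=1073741824 --cpu=30" wrapper in the log (assumed kwargs).
limits = processutils.ProcessLimits(address_space=1024 * 1024 * 1024,
                                    cpu_time=30)
out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'info', BASE,
    '--force-share', '--output=json', prlimit=limits)
base_info = json.loads(out)

# qemu-img create: thin qcow2 overlay whose backing file is the raw base.
processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'create', '-f', 'qcow2',
    '-o', f'backing_file={BASE},backing_fmt=raw', RESCUE)
```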
Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:46:04 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] No VIF found with MAC fa:16:3e:1e:67:b8, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG nova.compute.manager [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG nova.compute.manager [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:05 user nova-compute[71628]: WARNING nova.compute.manager [req-0ea71a19-5045-4d54-a890-829652b31f84 req-b24692fa-494a-4067-8db4-ef3e792bcfe1 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state active and task_state rescuing. 
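[Editor's note] The _get_desirable_cpu_topologies trace earlier in this rescue sequence (flavor and image limits and preferences all 0:0:0, 1 vCPU, so the only candidate is sockets=1, cores=1, threads=1) boils down to enumerating factorizations of the vCPU count under per-dimension caps. A toy version of that enumeration, under the simplifying assumption that any sockets*cores*threads product equal to the vCPU count is acceptable (Nova's real hardware.py applies additional constraints):

```python
# Toy enumeration of possible CPU topologies for a given vCPU count,
# bounded by maximum sockets/cores/threads (65536 each in the log).
from dataclasses import dataclass
from typing import List


@dataclass(frozen=True)
class Topology:
    sockets: int
    cores: int
    threads: int


def possible_topologies(vcpus: int, max_sockets: int = 65536,
                        max_cores: int = 65536,
                        max_threads: int = 65536) -> List[Topology]:
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                found.append(Topology(sockets, cores, threads))
    return found


# For the 1-vCPU m1.nano flavor above this yields a single candidate, matching
# "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
print(possible_topologies(1))  # [Topology(sockets=1, cores=1, threads=1)]
```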
Apr 17 17:46:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:05 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:06 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:07 user nova-compute[71628]: WARNING nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event 
network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state active and task_state rescuing. Apr 17 17:46:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:07 user nova-compute[71628]: DEBUG nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:07 user nova-compute[71628]: WARNING nova.compute.manager [req-227dc60d-bb85-4ccd-8c97-2434829d8ff6 req-b4c99e65-b503-48c6-bf07-d50fff454139 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state active and task_state rescuing. 
Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.virt.libvirt.host [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Removed pending event for 335e8c98-e4f3-4486-8f21-b24096d97d71 due to event {{(pid=71628) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:46:08 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] VM Resumed (Lifecycle Event) Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-adfa4859-1bc8-4eae-ac45-3df32a779aaa tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:46:08 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] During sync_power_state the instance has a pending task (rescuing). Skip. 
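[Editor's note] The "Synchronizing instance power state ... Skip." exchange above reflects a simple guard in the lifecycle handler: when a Resumed/Started/Stopped event arrives while the instance still has a task in flight (here task_state rescuing), the power-state sync is deferred rather than interfering with the in-progress operation. A condensed sketch of that decision; the constant names and the non-skip branches are illustrative stand-ins, not Nova's actual vm_states/task_states/power_state code:

```python
# Sketch of the "pending task -> skip power-state sync" guard seen above.
RUNNING = 1          # power_state (the log shows DB and VM power_state: 1)
ACTIVE = 'active'    # vm_state


def sync_power_state_after_lifecycle(instance, vm_power_state):
    """Return the action to take after a lifecycle event from the driver."""
    if instance['task_state'] is not None:
        # e.g. task_state 'rescuing' while the guest is rebuilt for rescue:
        # "During sync_power_state the instance has a pending task ... Skip."
        return 'skip'
    if instance['power_state'] != vm_power_state:
        return 'update-db-power-state'
    if instance['vm_state'] == ACTIVE and vm_power_state != RUNNING:
        return 'reconcile'
    return 'in-sync'


instance = {'vm_state': ACTIVE, 'task_state': 'rescuing',
            'power_state': RUNNING}
print(sync_power_state_after_lifecycle(instance, RUNNING))  # skip
```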
Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:46:08 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] VM Started (Lifecycle Event) Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:08 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:46:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:46:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:46:22 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 
tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Starting instance... {{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:46:25 user nova-compute[71628]: INFO nova.compute.claims [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Claim successful on node user Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Start building networks asynchronously for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Allocating IP information in the background. 
{{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-changed-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Refreshing instance network info cache due to event network-changed-112d73e4-daf7-4ba3-b282-34913fbb70b1. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] Acquiring lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] Acquired lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.network.neutron [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Refreshing network info cache for port 112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:46:25 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Start building block device mappings for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:46:25 user nova-compute[71628]: DEBUG nova.policy [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d29ba758b794e849b8cb94bc76c0247', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e5c56c675ef40b8b6eab0d00b46014b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Start spawning the instance on the hypervisor. {{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:46:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Creating image(s) Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "/opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "/opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "/opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.part --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.part --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.virt.images [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] cbc835eb-2ba5-4327-a372-311a75051edb was qcow2, converting to raw {{(pid=71628) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.privsep.utils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71628) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.part /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.converted {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Successfully created port: 
3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.network.neutron [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updated VIF entry in instance network info cache for port 112d73e4-daf7-4ba3-b282-34913fbb70b1. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG nova.network.neutron [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updating instance_info_cache with network_info: [{"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.157", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a96d8d84-c79e-4e64-be6f-a735081df9fb req-b6d36bf0-cb7b-436a-8cd1-1e08bbe7d6e1 service nova] Releasing lock "refresh_cache-18c31df4-f883-4d7b-9ed1-1b99e77eb631" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.part /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.converted" returned: 0 in 0.272s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.converted --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a 
tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f.converted --force-share --output=json" returned: 0 in 0.145s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.943s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:26 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json" returned: 0 in 0.171s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 
17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json" returned: 0 in 0.127s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f,backing_fmt=raw /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f,backing_fmt=raw /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk 1073741824" returned: 0 in 0.048s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} 
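The entries above show the image-cache workflow: qemu-img inspections wrapped in oslo_concurrency.prlimit (--as=1073741824 --cpu=30), the cached base converted from qcow2 to raw, and the instance disk created as a qcow2 overlay whose backing file is the raw base image. A minimal sketch of that pattern using oslo.concurrency's public API follows; the helper names are mine, and only the paths and sizes taken from the log are real.

```python
# Minimal sketch (not Nova's exact code) of the qemu-img calls logged above.
import json

from oslo_concurrency import processutils

# Mirrors the "--as=1073741824 --cpu=30" prlimit wrapper seen in the log.
QEMU_IMG_LIMITS = processutils.ProcessLimits(address_space=1073741824,
                                             cpu_time=30)

def qemu_img_info(path):
    # env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

def create_overlay(base, disk, size_bytes):
    # qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <disk> <size>
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % base,
        disk, str(size_bytes))

base = '/opt/stack/data/nova/instances/_base/b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f'
disk = '/opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk'
print(qemu_img_info(base)['format'])      # expected 'raw' after the conversion step
create_overlay(base, disk, 1073741824)    # 1 GiB root disk, as in the log
```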
Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Successfully updated port: 3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquired lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:46:27 user nova-compute[71628]: INFO nova.compute.manager [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Terminating instance Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-changed-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Refreshing instance network info cache due to event network-changed-3f1ac3c8-a527-4358-aa08-59734cc43f12. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] Acquiring lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Cannot resize image /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.objects.instance [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'migration_context' on Instance uuid 1d1e6636-11b2-4dc0-8809-232531a4581c {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Ensure instance console log exists: /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None 
req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Instance cache missing network info. {{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Updating instance_info_cache with network_info: [{"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Releasing lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] 
Instance network_info: |[{"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] Acquired lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Refreshing network info cache for port 3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Start _get_guest_xml network_info=[{"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:46:23Z,direct_url=,disk_format='qcow2',id=cbc835eb-2ba5-4327-a372-311a75051edb,min_disk=0,min_ram=0,name='tempest-scenario-img--332404867',owner='3e5c56c675ef40b8b6eab0d00b46014b',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:46:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': 'cbc835eb-2ba5-4327-a372-311a75051edb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:46:28 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
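The "Acquiring lock / Lock ... acquired / released" triplets that recur in the entries above (the disk.info and image-hash locks, the per-instance terminate and "-events" locks, vgpu_resources) all come from oslo.concurrency's lockutils. A minimal sketch of the two variants, assuming a lock_path chosen for illustration only:

```python
# Minimal sketch, not Nova's code: external (file) and internal (thread) locks.
from oslo_concurrency import lockutils

IMAGE_HASH = 'b2b800f9beeb808e7f8ca0e6d20600830bbf6f7f'

# Inter-process lock, like the image-cache fetch_func_sync lock above;
# the lock_path is an assumption for this sketch.
with lockutils.lock(IMAGE_HASH, external=True,
                    lock_path='/opt/stack/data/nova/instances/locks'):
    pass  # fetch/convert the shared base image exactly once across workers

# In-process lock, like the per-instance "<uuid>-events" lock above.
@lockutils.synchronized('1d1e6636-11b2-4dc0-8809-232531a4581c-events')
def clear_events():
    pass  # serialized with any other caller using the same lock name
```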
Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:46:23Z,direct_url=,disk_format='qcow2',id=cbc835eb-2ba5-4327-a372-311a75051edb,min_disk=0,min_ram=0,name='tempest-scenario-img--332404867',owner='3e5c56c675ef40b8b6eab0d00b46014b',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:46:24Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-474499183',display_name='tempest-TestMinimumBasicScenario-server-474499183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-474499183',id=23,image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEO4e3v1HHH2Mmov5RsedbNd3JRISetkDhgZlIA8LOetgCVIAjtKuR9jxTNt4caNTd+h7UG7m0XH4TULzuQ+QQDbzp2myVheOjH8E3fAzaqC4b6Xi/7lnRIo8Rc5HJCoQ==',key_name='tempest-TestMinimumBasicScenario-2100801210',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-omxqfwv4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:46:26Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=1d1e6636-11b2-4dc0-8809-232531a4581c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": 
"3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.objects.instance [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'pci_devices' on Instance uuid 1d1e6636-11b2-4dc0-8809-232531a4581c {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] End _get_guest_xml xml= Apr 17 17:46:28 user nova-compute[71628]: 1d1e6636-11b2-4dc0-8809-232531a4581c Apr 17 17:46:28 user nova-compute[71628]: instance-00000017 Apr 17 17:46:28 user nova-compute[71628]: 131072 Apr 17 17:46:28 user nova-compute[71628]: 1 Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: tempest-TestMinimumBasicScenario-server-474499183 Apr 17 17:46:28 user nova-compute[71628]: 2023-04-17 17:46:28 Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: 128 Apr 17 17:46:28 user nova-compute[71628]: 1 Apr 17 17:46:28 user nova-compute[71628]: 0 Apr 17 17:46:28 user nova-compute[71628]: 0 Apr 17 17:46:28 user nova-compute[71628]: 1 Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: tempest-TestMinimumBasicScenario-145353383-project-member Apr 17 17:46:28 user nova-compute[71628]: tempest-TestMinimumBasicScenario-145353383 Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: OpenStack Foundation Apr 17 17:46:28 user nova-compute[71628]: OpenStack Nova Apr 17 17:46:28 user nova-compute[71628]: 0.0.0 Apr 17 17:46:28 user nova-compute[71628]: 1d1e6636-11b2-4dc0-8809-232531a4581c Apr 17 17:46:28 user nova-compute[71628]: 1d1e6636-11b2-4dc0-8809-232531a4581c Apr 17 17:46:28 user nova-compute[71628]: Virtual Machine Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 
17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: hvm Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Nehalem Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: /dev/urandom Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: Apr 17 17:46:28 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-474499183',display_name='tempest-TestMinimumBasicScenario-server-474499183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-474499183',id=23,image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEO4e3v1HHH2Mmov5RsedbNd3JRISetkDhgZlIA8LOetgCVIAjtKuR9jxTNt4caNTd+h7UG7m0XH4TULzuQ+QQDbzp2myVheOjH8E3fAzaqC4b6Xi/7lnRIo8Rc5HJCoQ==',key_name='tempest-TestMinimumBasicScenario-2100801210',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-omxqfwv4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:46:26Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=1d1e6636-11b2-4dc0-8809-232531a4581c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": 
"3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG os_vif [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:46:28 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Instance destroyed successfully. 
Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.objects.instance [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'resources' on Instance uuid 18c31df4-f883-4d7b-9ed1-1b99e77eb631 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f1ac3c8-a5, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3f1ac3c8-a5, col_values=(('external_ids', {'iface-id': '3f1ac3c8-a527-4358-aa08-59734cc43f12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:3d:80', 'vm-uuid': '1d1e6636-11b2-4dc0-8809-232531a4581c'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-840531982',display_name='tempest-AttachVolumeTestJSON-server-840531982',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-840531982',id=22,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE9znVyh2XvVgB/lD9Kekstd6of4EWCi4jOx1ZRC3TdN6cXmpezDHHLnBhMDPrdXw4ZtdwembcGb/X4HTEO4jnGIfp0e3mZJuw1pVWzt7I7Vt1a1IH8RGm/LXG+ZjoqIJg==',key_name='tempest-keypair-1415434273',keypairs=,launch_index=0,launched_at=2023-04-17T17:44:41Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-xmyeswa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:44:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=18c31df4-f883-4d7b-9ed1-1b99e77eb631,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.157", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "address": "fa:16:3e:e8:9c:9e", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.157", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap112d73e4-da", "ovs_interfaceid": "112d73e4-daf7-4ba3-b282-34913fbb70b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG os_vif [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: INFO os_vif [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap112d73e4-da, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 
17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: INFO os_vif [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=112d73e4-daf7-4ba3-b282-34913fbb70b1,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap112d73e4-da') Apr 17 17:46:28 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Deleting instance files /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631_del Apr 17 17:46:28 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Deletion of /opt/stack/data/nova/instances/18c31df4-f883-4d7b-9ed1-1b99e77eb631_del complete Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] No VIF found with MAC fa:16:3e:12:3d:80, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:46:28 user nova-compute[71628]: INFO nova.compute.manager [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Took 0.71 seconds to destroy the instance on the hypervisor. Apr 17 17:46:28 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Updated VIF entry in instance network info cache for port 3f1ac3c8-a527-4358-aa08-59734cc43f12. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Updating instance_info_cache with network_info: [{"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0cb39ee4-273c-4d2d-843a-44c2bd23fd67 req-edfca35a-250b-446d-8254-53019605a7e0 service nova] Releasing lock "refresh_cache-1d1e6636-11b2-4dc0-8809-232531a4581c" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:28 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:29 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Took 0.70 seconds to deallocate network for instance. 
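Annotation: the AddBridgeCommand / AddPortCommand / DbSetCommand / DelPortCommand transactions logged above are ovsdbapp commands run against the local OVSDB. A rough standalone sketch of equivalent calls, assuming a local db.sock path (os-vif wires the connection up internally; the socket location below is an assumption, the names and external_ids are copied from the log):

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB = 'unix:/var/run/openvswitch/db.sock'  # assumed socket location
idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

external_ids = {
    'iface-id': '3f1ac3c8-a527-4358-aa08-59734cc43f12',
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:12:3d:80',
    'vm-uuid': '1d1e6636-11b2-4dc0-8809-232531a4581c',
}

# Plug side: matches the AddBridgeCommand/AddPortCommand/DbSetCommand entries.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tap3f1ac3c8-a5', may_exist=True))
    txn.add(api.db_set('Interface', 'tap3f1ac3c8-a5',
                       ('external_ids', external_ids)))

# Unplug side: matches the DelPortCommand entry for the deleted instance.
with api.transaction(check_error=True) as txn:
    txn.add(api.del_port('tap112d73e4-da', bridge='br-int', if_exists=True))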
Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-6f332d75-f044-4d22-9c32-84f0eaca99f4 req-38f50177-f5a9-4cd4-9676-dda06fd40982 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-vif-deleted-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:29 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Deleted allocations for instance 18c31df4-f883-4d7b-9ed1-1b99e77eb631 Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcfd7584-2e3d-46a4-a7a1-799b21c3d93e tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.779s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-vif-unplugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] No waiting events found dispatching network-vif-unplugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:29 user nova-compute[71628]: WARNING nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received unexpected event network-vif-unplugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 for instance with vm_state deleted and task_state None. 
Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Acquiring lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] Lock "18c31df4-f883-4d7b-9ed1-1b99e77eb631-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:29 user nova-compute[71628]: DEBUG nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] No waiting events found dispatching network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:29 user nova-compute[71628]: WARNING nova.compute.manager [req-51ef36f9-e40f-467f-b819-4d4bfd5dd50e req-3a2df60b-41d0-4802-aeff-b0971aa62e67 service nova] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Received unexpected event network-vif-plugged-112d73e4-daf7-4ba3-b282-34913fbb70b1 for instance with vm_state deleted and task_state None. 
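Annotation: the repeated 'Acquiring lock "<uuid>-events" ... acquired ... released' entries above come from oslo.concurrency's lock helpers. A minimal sketch of both logging patterns, where the lock name mirrors the log and the function body is illustrative only:

from oslo_concurrency import lockutils

INSTANCE_UUID = '18c31df4-f883-4d7b-9ed1-1b99e77eb631'

@lockutils.synchronized(f'{INSTANCE_UUID}-events')
def _pop_event():
    # Nova pops the waiter for the external event under this lock;
    # an empty body keeps the sketch self-contained.
    pass

_pop_event()  # logs: Lock "...-events" acquired by ... :: waited / held

# Equivalent context-manager form, which produces the
# Acquiring/Acquired/Releasing lock triple seen elsewhere in this log:
with lockutils.lock(f'{INSTANCE_UUID}-events'):
    pass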
Apr 17 17:46:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:30 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] No waiting events found dispatching network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:31 user nova-compute[71628]: WARNING nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received unexpected event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 for instance with vm_state building and task_state spawning. 
Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] No waiting events found dispatching network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:46:31 user nova-compute[71628]: WARNING nova.compute.manager [req-1ed13fc5-3fcc-45b2-a47b-755c175062f0 req-6dafa277-75b5-469a-866c-59a0a093b1aa service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received unexpected event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 for instance with vm_state building and task_state spawning. 
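Annotation: the "Received event network-vif-plugged-<port id>" / "network-vif-unplugged-<port id>" entries above arrive from Neutron via Nova's os-server-external-events API. A hedged sketch of what such a call looks like; the endpoint and token below are placeholders, not values from this deployment, and the payload shape follows the published API:

import requests

NOVA_ENDPOINT = 'http://controller/compute/v2.1'  # assumed endpoint
TOKEN = 'keystone-token-placeholder'              # assumed auth token

payload = {
    'events': [{
        'name': 'network-vif-plugged',
        'server_uuid': '1d1e6636-11b2-4dc0-8809-232531a4581c',
        'tag': '3f1ac3c8-a527-4358-aa08-59734cc43f12',  # the Neutron port id
        'status': 'completed',
    }]
}

resp = requests.post(
    f'{NOVA_ENDPOINT}/os-server-external-events',
    json=payload,
    headers={'X-Auth-Token': TOKEN})
resp.raise_for_status()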
Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] VM Resumed (Lifecycle Event) Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Instance spawned successfully. Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a 
tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Found default for hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] VM Started (Lifecycle Event) Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Took 5.78 seconds to spawn the instance on the hypervisor. 
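Annotation: the "Checking state" / "Synchronizing instance power state ... VM power_state: 1" entries above compare the stored power state with what libvirt reports for the guest; power_state 1 corresponds to a running domain. A small sketch of that query using the libvirt Python binding, with the connection URI and privileges assumed:

import libvirt

conn = libvirt.open('qemu:///system')  # assumed local system URI
dom = conn.lookupByUUIDString('1d1e6636-11b2-4dc0-8809-232531a4581c')

state, reason = dom.state()            # e.g. (libvirt.VIR_DOMAIN_RUNNING, ...)
if state == libvirt.VIR_DOMAIN_RUNNING:
    print('domain running, which Nova records as power_state 1 (RUNNING)')
conn.close()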
Apr 17 17:46:31 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:31 user nova-compute[71628]: INFO nova.compute.manager [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Took 6.37 seconds to build instance. Apr 17 17:46:31 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-f8f602ec-8a6a-41d1-939d-665eaa290b0a tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.527s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:35 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:37 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:46:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:46:38 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
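Annotation: the qemu-img info commands above are executed through oslo.concurrency's processutils with a prlimit wrapper (1 GiB address space, 30 s CPU, matching --as=1073741824 --cpu=30 in the logged command lines). A sketch of that invocation; the disk path is copied from the log for illustration only:

import json

from oslo_concurrency import processutils
from oslo_utils import units

QEMU_IMG_LIMITS = processutils.ProcessLimits(
    cpu_time=30,                 # --cpu=30 in the logged wrapper
    address_space=1 * units.Gi)  # --as=1073741824 in the logged wrapper

out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info',
    '/opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk',
    '--force-share', '--output=json',
    prlimit=QEMU_IMG_LIMITS)

info = json.loads(out)
print(info.get('format'), info.get('virtual-size'))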
Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8737MB free_disk=26.392860412597656GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 1d1e6636-11b2-4dc0-8809-232531a4581c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:46:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:46:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:46:41 
user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:46:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:42 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:46:43 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event 
/opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:46:43 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] VM Stopped (Lifecycle Event) Apr 17 17:46:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-3b480eb3-8261-4468-9f1e-995c368ad4a7 None None] [instance: 18c31df4-f883-4d7b-9ed1-1b99e77eb631] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:46:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:46:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:52 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:46:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:13 user 
nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:47:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:47:21 user nova-compute[71628]: INFO nova.compute.claims [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Claim successful on node user Apr 17 17:47:21 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Start building networks asynchronously for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:47:22 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.policy [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3d4aee23bae743f19bdf6f991e044587', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cd3bfc1c102a47ff9efab5cb9a78021e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:47:22 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Creating image(s) Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "/opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "/opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "/opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a 
tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.141s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk 1073741824" returned: 0 in 0.047s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.194s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Successfully created port: 482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Cannot resize image /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk to a smaller size. 
{{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.objects.instance [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'migration_context' on Instance uuid 6b3b32af-2f00-44f3-8287-9ff8924e6db7 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Ensure instance console log exists: /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:22 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Successfully updated port: 482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:47:23 user nova-compute[71628]: 
DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquired lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-changed-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Refreshing instance network info cache due to event network-changed-482a7a99-8edf-4f93-a747-ad53fa2779b6. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] Acquiring lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updating instance_info_cache with network_info: [{"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Releasing lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Instance network_info: |[{"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] Acquired lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.neutron [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Refreshing network info cache for port 482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Start _get_guest_xml network_info=[{"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:47:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:47:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
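The nova.virt.hardware entries that follow trace how the guest CPU topology is chosen for this 1-vCPU m1.nano flavor: with no flavor or image constraints (limits and preferences all 0:0:0, caps of 65536 sockets/cores/threads), the only candidate split is 1 socket x 1 core x 1 thread, which the driver then sorts and uses. The sketch below is only an illustration of that logged behaviour under the assumption of a simple divisor walk; it is not Nova's actual hardware.py code, and the helper names are invented.

# Illustration only: mirrors the topology walk logged below, not Nova's implementation.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield every sockets*cores*threads split that exactly covers `vcpus`."""
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                yield VirtCPUTopology(sockets, cores, threads)


# For the 1-vCPU guest in this log there is a single candidate, matching
# "Got 1 possible topologies" / "VirtCPUTopology(cores=1,sockets=1,threads=1)".
print(list(possible_topologies(1)))

A guest with more vCPUs would admit several splits (for example 2x2x1 for 4 vCPUs), which is why the "Sorted desired topologies" step in the following entries exists at all.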
Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:47:23 
user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:47:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-859460792',display_name='tempest-AttachVolumeTestJSON-server-859460792',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-859460792',id=24,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJXTTiCCqoI0ZpbbGtiXNKugGYNLQVWGPKnospMR4za+DDU14IWv2r42pnKWdekiYYIuhfuRknSTBJe5tCEpMgXVtipIRysFAAA/08IDGr3VRsHCftgYb1Igz/Cq6OUfw==',key_name='tempest-keypair-396176061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-np74jxa8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:47:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=6b3b32af-2f00-44f3-8287-9ff8924e6db7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.objects.instance [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'pci_devices' on Instance uuid 6b3b32af-2f00-44f3-8287-9ff8924e6db7 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] End _get_guest_xml xml= Apr 17 17:47:23 user nova-compute[71628]: 6b3b32af-2f00-44f3-8287-9ff8924e6db7 Apr 17 17:47:23 user nova-compute[71628]: instance-00000018 Apr 17 17:47:23 user nova-compute[71628]: 131072 Apr 17 17:47:23 user nova-compute[71628]: 1 Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: tempest-AttachVolumeTestJSON-server-859460792 Apr 17 17:47:23 user nova-compute[71628]: 2023-04-17 17:47:23 Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: 128 Apr 17 17:47:23 user nova-compute[71628]: 1 Apr 17 17:47:23 user nova-compute[71628]: 0 Apr 17 17:47:23 user nova-compute[71628]: 0 Apr 17 17:47:23 user nova-compute[71628]: 1 Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: tempest-AttachVolumeTestJSON-2102743292-project-member Apr 17 17:47:23 user nova-compute[71628]: tempest-AttachVolumeTestJSON-2102743292 Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: OpenStack Foundation Apr 17 17:47:23 user nova-compute[71628]: OpenStack Nova Apr 17 17:47:23 user nova-compute[71628]: 0.0.0 Apr 17 17:47:23 user nova-compute[71628]: 
6b3b32af-2f00-44f3-8287-9ff8924e6db7 Apr 17 17:47:23 user nova-compute[71628]: 6b3b32af-2f00-44f3-8287-9ff8924e6db7 Apr 17 17:47:23 user nova-compute[71628]: Virtual Machine Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: hvm Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Nehalem Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: /dev/urandom Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: Apr 17 17:47:23 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:47:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-859460792',display_name='tempest-AttachVolumeTestJSON-server-859460792',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-859460792',id=24,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJXTTiCCqoI0ZpbbGtiXNKugGYNLQVWGPKnospMR4za+DDU14IWv2r42pnKWdekiYYIuhfuRknSTBJe5tCEpMgXVtipIRysFAAA/08IDGr3VRsHCftgYb1Igz/Cq6OUfw==',key_name='tempest-keypair-396176061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-np74jxa8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:47:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=6b3b32af-2f00-44f3-8287-9ff8924e6db7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG os_vif [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap482a7a99-8e, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap482a7a99-8e, col_values=(('external_ids', {'iface-id': '482a7a99-8edf-4f93-a747-ad53fa2779b6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:5e:2f', 'vm-uuid': '6b3b32af-2f00-44f3-8287-9ff8924e6db7'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:47:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout 
{{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:24 user nova-compute[71628]: INFO os_vif [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') Apr 17 17:47:24 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:47:24 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] No VIF found with MAC fa:16:3e:8a:5e:2f, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:47:24 user nova-compute[71628]: DEBUG nova.network.neutron [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updated VIF entry in instance network info cache for port 482a7a99-8edf-4f93-a747-ad53fa2779b6. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:47:24 user nova-compute[71628]: DEBUG nova.network.neutron [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updating instance_info_cache with network_info: [{"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:47:24 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-df477676-cc8a-46c2-8846-ab750aa43d3b req-57171b18-42bb-433e-82c4-152da5d4db8f service nova] Releasing lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:25 user 
nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] No waiting events found dispatching network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:47:25 user nova-compute[71628]: WARNING nova.compute.manager [req-700bcce9-2308-4a00-89ad-776785c85007 req-24d9b8e6-a1e9-4e89-bed3-2ef187bb17d9 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received unexpected event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 for instance with vm_state building and task_state spawning. Apr 17 17:47:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:25 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] VM Resumed (Lifecycle Event) Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Instance spawned successfully. 
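[Annotation] The records above show nova-compute serializing the external network-vif-plugged event behind a per-instance lock named "<instance-uuid>-events" and then warning that no waiter was registered. A minimal sketch of that locking pattern with oslo.concurrency follows; only the lock-name format, instance UUID and event name are taken from the log, everything else is a hypothetical stand-in and not Nova's actual handler.

    # Illustrative only: per-instance event lock pattern seen in the records above.
    from oslo_concurrency import lockutils

    INSTANCE_UUID = '6b3b32af-2f00-44f3-8287-9ff8924e6db7'
    pending_events = {}  # hypothetical table of waiters keyed by event name

    def pop_event(event_name):
        # Serialize access to the per-instance event table, mirroring the
        # "Acquiring/acquired/released" lock records in the log.
        with lockutils.lock('%s-events' % INSTANCE_UUID):
            return pending_events.pop(event_name, None)

    waiter = pop_event('network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6')
    if waiter is None:
        # Corresponds to the WARNING above: the event arrived but nothing was
        # waiting for it, so it is logged as unexpected and dropped.
        print('unexpected event: no waiter registered')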
Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Found default for 
hw_vif_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] VM Started (Lifecycle Event) Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Took 5.35 seconds to spawn the instance on the hypervisor. Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:47:27 user nova-compute[71628]: INFO nova.compute.manager [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Took 6.40 seconds to build instance. 
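[Annotation] The VIF plug a few records above was performed as one OVSDB transaction against tcp:127.0.0.1:6640 (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row). A rough sketch of an equivalent transaction issued directly with ovsdbapp follows; the bridge, port name, external_ids and endpoint are copied from the log, while the connection setup and timeout are assumptions and this is not os-vif's actual code path.

    # Hedged sketch: replay the logged OVSDB transaction with ovsdbapp.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # Same three commands that appear in the do_commit records above.
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap482a7a99-8e', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap482a7a99-8e',
            ('external_ids', {
                'iface-id': '482a7a99-8edf-4f93-a747-ad53fa2779b6',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:8a:5e:2f',
                'vm-uuid': '6b3b32af-2f00-44f3-8287-9ff8924e6db7'})))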
Apr 17 17:47:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-e6bb721a-6084-45d5-a086-8ef025f0215a tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.497s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] No waiting events found dispatching network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:47:27 user nova-compute[71628]: WARNING nova.compute.manager [req-4aad2ff3-6cf7-4329-987d-b30f0292ad16 req-c6027c88-37ae-4b7a-8186-48849391e74f service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received unexpected event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 for instance with vm_state active and task_state None. 
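[Annotation] In the update_available_resource audit that follows, each disk probe runs qemu-img info under oslo.concurrency's prlimit wrapper (the --as=1073741824 --cpu=30 arguments visible below). A minimal sketch of issuing the same probe through processutils follows; the limits, flags and disk path are taken from the log, the rest is illustrative and not Nova's actual helper.

    # Hedged sketch of the resource-audit disk probe seen below.
    import json
    from oslo_concurrency import processutils

    # 1 GiB address space and 30 s CPU time, as in the logged prlimit call.
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, _err = processutils.execute(
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk',
        '--force-share', '--output=json',
        prlimit=limits,
        env_variables={'LC_ALL': 'C', 'LANG': 'C'})
    info = json.loads(out)
    print(info.get('format'), info.get('virtual-size'))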
Apr 17 17:47:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:32 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json" returned: 0 in 0.275s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.151s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:47:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:47:40 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:47:40 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8679MB free_disk=26.37106704711914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": 
"0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 1d1e6636-11b2-4dc0-8809-232531a4581c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 6b3b32af-2f00-44f3-8287-9ff8924e6db7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:47:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:47:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None 
None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid 34582c99-56bf-44e5-adca-a9883318afa0 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updating instance_info_cache with network_info: [{"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:47:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 
17 17:47:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:47:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:49 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:47:54 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:47:59 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:48:04 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:48:09 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG 
oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:17 user nova-compute[71628]: INFO nova.compute.manager [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Terminating instance Apr 17 17:48:17 user nova-compute[71628]: DEBUG nova.compute.manager [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-unplugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] No waiting events found dispatching 
network-vif-unplugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:48:17 user nova-compute[71628]: DEBUG nova.compute.manager [req-d10a7cb6-833d-4913-9615-1d03178af24c req-6cfabcff-60a8-4caf-9543-7d037c15bb43 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-unplugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Instance destroyed successfully. Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.objects.instance [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lazy-loading 'resources' on Instance uuid 1d1e6636-11b2-4dc0-8809-232531a4581c {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-474499183',display_name='tempest-TestMinimumBasicScenario-server-474499183',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-474499183',id=23,image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOEO4e3v1HHH2Mmov5RsedbNd3JRISetkDhgZlIA8LOetgCVIAjtKuR9jxTNt4caNTd+h7UG7m0XH4TULzuQ+QQDbzp2myVheOjH8E3fAzaqC4b6Xi/7lnRIo8Rc5HJCoQ==',key_name='tempest-TestMinimumBasicScenario-2100801210',keypairs=,launch_index=0,launched_at=2023-04-17T17:46:31Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3e5c56c675ef40b8b6eab0d00b46014b',ramdisk_id='',reservation_id='r-omxqfwv4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='cbc835eb-2ba5-4327-a372-311a75051edb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-145353383',owner_user_name='tempest-TestMinimumBasicScenario-145353383-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:46:32Z,user_data=None,user_id='7d29ba758b794e849b8cb94bc76c0247',uuid=1d1e6636-11b2-4dc0-8809-232531a4581c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converting VIF {"id": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "address": "fa:16:3e:12:3d:80", "network": {"id": "961563e7-f0ae-4972-8b45-18610039d6a4", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-686199779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3e5c56c675ef40b8b6eab0d00b46014b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap3f1ac3c8-a5", "ovs_interfaceid": "3f1ac3c8-a527-4358-aa08-59734cc43f12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG os_vif [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f1ac3c8-a5, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:18 user nova-compute[71628]: INFO os_vif [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:3d:80,bridge_name='br-int',has_traffic_filtering=True,id=3f1ac3c8-a527-4358-aa08-59734cc43f12,network=Network(961563e7-f0ae-4972-8b45-18610039d6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f1ac3c8-a5') Apr 17 17:48:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Deleting instance files /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c_del Apr 17 17:48:18 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] 
[instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Deletion of /opt/stack/data/nova/instances/1d1e6636-11b2-4dc0-8809-232531a4581c_del complete Apr 17 17:48:18 user nova-compute[71628]: INFO nova.compute.manager [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 17 17:48:18 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:48:18 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Took 0.44 seconds to deallocate network for instance. 
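
The VIF teardown logged just above goes through ovsdbapp: os-vif turns the unplug into a DelPortCommand(port=tap3f1ac3c8-a5, bridge=br-int, if_exists=True) against the local ovsdb-server at tcp:127.0.0.1:6640. The sketch below is a minimal, standalone way to issue the same delete with ovsdbapp; the endpoint, bridge and tap name are taken from the log, while the connection handling is illustrative and not the exact wiring os-vif uses internally.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the ovsdb-server endpoint seen in the log (tcp:127.0.0.1:6640).
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # Mirrors the logged DelPortCommand: remove the instance's tap device from
    # the br-int integration bridge, tolerating the port already being gone.
    ovs.del_port('tap3f1ac3c8-a5', bridge='br-int', if_exists=True).execute(check_error=True)
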
Apr 17 17:48:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:48:18 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.181s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:19 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Deleted allocations for instance 1d1e6636-11b2-4dc0-8809-232531a4581c Apr 17 17:48:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-988f213a-767b-423a-be4e-e2161d6c4231 tempest-TestMinimumBasicScenario-145353383 tempest-TestMinimumBasicScenario-145353383-project-member] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.480s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] Acquiring lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] Lock "1d1e6636-11b2-4dc0-8809-232531a4581c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] No waiting events found dispatching network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:48:19 user nova-compute[71628]: WARNING nova.compute.manager [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received unexpected event network-vif-plugged-3f1ac3c8-a527-4358-aa08-59734cc43f12 for instance with vm_state deleted and task_state None. 
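
The paired "Acquiring lock … acquired … released" messages throughout this trace come from oslo_concurrency.lockutils: nova serializes work per instance UUID (the do_terminate_instance lock) and per event list (the "-events" locks). A minimal sketch of the same pattern follows; the lock names are copied from the log, but the decorated function and its body are illustrative, not nova's code.

    from oslo_concurrency import lockutils

    # Decorator form: serialize every call for one instance, as the log shows
    # for "1d1e6636-11b2-4dc0-8809-232531a4581c". The function is hypothetical.
    @lockutils.synchronized('1d1e6636-11b2-4dc0-8809-232531a4581c')
    def do_terminate_instance():
        # Only one thread at a time runs this; the wrapper emits the
        # 'Lock "..." acquired by ... :: waited' / '"released" :: held' DEBUG lines.
        pass

    # Context-manager form, which produces the plain 'Acquiring lock /
    # Acquired lock / Releasing lock' DEBUG lines (lockutils.py:312/315/333).
    with lockutils.lock('refresh_cache-34582c99-56bf-44e5-adca-a9883318afa0'):
        pass
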
Apr 17 17:48:19 user nova-compute[71628]: DEBUG nova.compute.manager [req-0a9b0dee-0a45-4323-b989-fdf36f5450ee req-08f0ec0c-f2cf-4fdc-90ec-dbb8778d12e5 service nova] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Received event network-vif-deleted-3f1ac3c8-a527-4358-aa08-59734cc43f12 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:33 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: ... => Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:48:33 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] VM Stopped (Lifecycle Event) Apr 17 17:48:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b5f67204-2cb3-4fe0-bf4c-9c024ada143f None None] [instance: 1d1e6636-11b2-4dc0-8809-232531a4581c] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:48:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:36 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:36 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:48:37 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json 
{{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:38 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None 
req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk.rescue --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:48:39 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:48:39 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
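
The repeated qemu-img info commands above are the resource-audit pass (ComputeManager.update_available_resource) measuring each instance's disk. The "--as=1073741824 --cpu=30" prefix is added by oslo.concurrency when a prlimit is passed to processutils.execute(). Below is a minimal sketch of such a call under those assumptions, reusing one disk path from the log; it is not nova's exact helper.

    import json

    from oslo_concurrency import processutils

    # Cap address space at 1 GiB and CPU time at 30 s, matching the
    # "--as=1073741824 --cpu=30" wrapper visible in the logged command lines.
    limits = processutils.ProcessLimits(address_space=1 * 1024 ** 3, cpu_time=30)

    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk',
        '--force-share', '--output=json',
        prlimit=limits)
    disk_info = json.loads(out)   # e.g. disk_info['virtual-size'], disk_info['format']

During the audit the libvirt driver parses this JSON to derive per-instance disk figures (for example for the disk over-commit calculation) before the "Hypervisor/Node resource view" and "Final resource view" lines are reported.
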
Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8833MB free_disk=26.37035369873047GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 6b3b32af-2f00-44f3-8287-9ff8924e6db7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:48:39 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.266s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [{"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-335e8c98-e4f3-4486-8f21-b24096d97d71" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:48:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:48:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:56 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:56 user 
nova-compute[71628]: INFO nova.compute.manager [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Terminating instance Apr 17 17:48:56 user nova-compute[71628]: DEBUG nova.compute.manager [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.compute.manager [req-ef31d1de-3deb-4cee-bcfb-2a22f4d75008 req-fd571951-52a4-4cfb-9d77-1272c156b0f7 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-unplugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with task_state deleting. 
{{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Instance destroyed successfully. Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.objects.instance [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'resources' on Instance uuid 335e8c98-e4f3-4486-8f21-b24096d97d71 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2025827501',display_name='tempest-ServerRescueNegativeTestJSON-server-2025827501',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2025827501',id=21,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:46:08Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-l5cux0tk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueN
egativeTestJSON-848169867-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:46:08Z,user_data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=335e8c98-e4f3-4486-8f21-b24096d97d71,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "address": "fa:16:3e:1e:67:b8", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb064deb8-b9", "ovs_interfaceid": "b064deb8-b9d4-483a-9f33-beb3dbfd48af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG os_vif [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb064deb8-b9, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:48:57 user nova-compute[71628]: INFO os_vif [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:67:b8,bridge_name='br-int',has_traffic_filtering=True,id=b064deb8-b9d4-483a-9f33-beb3dbfd48af,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb064deb8-b9') Apr 17 17:48:57 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Deleting instance files /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71_del Apr 17 17:48:57 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Deletion of /opt/stack/data/nova/instances/335e8c98-e4f3-4486-8f21-b24096d97d71_del complete Apr 17 17:48:57 user nova-compute[71628]: INFO nova.compute.manager [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Took 0.70 seconds to destroy the instance on the hypervisor. Apr 17 17:48:57 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:48:57 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:48:58 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Took 0.50 seconds to deallocate network for instance. Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.compute.manager [req-25cc3679-7ef4-4174-aa10-a0d390444efd req-9fbb3eec-7bf6-4de8-ad93-0c54d93dbd90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-deleted-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:58 user nova-compute[71628]: INFO nova.compute.manager [req-25cc3679-7ef4-4174-aa10-a0d390444efd req-9fbb3eec-7bf6-4de8-ad93-0c54d93dbd90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Neutron deleted interface b064deb8-b9d4-483a-9f33-beb3dbfd48af; detaching it from the instance and deleting it from the info cache Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.network.neutron [req-25cc3679-7ef4-4174-aa10-a0d390444efd req-9fbb3eec-7bf6-4de8-ad93-0c54d93dbd90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.compute.manager [req-25cc3679-7ef4-4174-aa10-a0d390444efd req-9fbb3eec-7bf6-4de8-ad93-0c54d93dbd90 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Detach interface failed, port_id=b064deb8-b9d4-483a-9f33-beb3dbfd48af, reason: Instance 335e8c98-e4f3-4486-8f21-b24096d97d71 could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:48:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:58 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Deleted allocations for instance 335e8c98-e4f3-4486-8f21-b24096d97d71 Apr 17 17:48:58 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-aa97c523-0ebe-4164-9590-3e4a9c71281c tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.533s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:59 user nova-compute[71628]: DEBUG nova.compute.manager [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] 
[instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:48:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] Acquiring lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:48:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:48:59 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] Lock "335e8c98-e4f3-4486-8f21-b24096d97d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:48:59 user nova-compute[71628]: DEBUG nova.compute.manager [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] No waiting events found dispatching network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:48:59 user nova-compute[71628]: WARNING nova.compute.manager [req-5c6ef334-9331-419c-90d2-9468bb4b33d8 req-6f6327db-df0f-4c67-8272-ce295a661ff1 service nova] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Received unexpected event network-vif-plugged-b064deb8-b9d4-483a-9f33-beb3dbfd48af for instance with vm_state deleted and task_state None. Apr 17 17:49:02 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:10 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-changed-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.compute.manager [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Refreshing instance network info cache due to event network-changed-482a7a99-8edf-4f93-a747-ad53fa2779b6. 
{{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] Acquiring lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] Acquired lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.network.neutron [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Refreshing network info cache for port 482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.network.neutron [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updated VIF entry in instance network info cache for port 482a7a99-8edf-4f93-a747-ad53fa2779b6. {{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.network.neutron [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updating instance_info_cache with network_info: [{"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.78", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-3a123d66-2ca5-4048-97de-fc9012d97830 req-949e3418-cd87-4160-a9ee-41810ae41613 service nova] Releasing lock "refresh_cache-6b3b32af-2f00-44f3-8287-9ff8924e6db7" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:49:12 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 
335e8c98-e4f3-4486-8f21-b24096d97d71] VM Stopped (Lifecycle Event) Apr 17 17:49:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-52dce80a-9fcc-446d-a5a3-0f25c76ceae0 None None] [instance: 335e8c98-e4f3-4486-8f21-b24096d97d71] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:13 user nova-compute[71628]: INFO nova.compute.manager [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Terminating instance Apr 17 17:49:13 user nova-compute[71628]: DEBUG 
nova.compute.manager [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-unplugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] No waiting events found dispatching network-vif-unplugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:49:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-9777dfe4-98f0-49bb-a7ef-707ec6623bc8 req-4a23ae9b-92db-48a9-8985-0b5f7dd48318 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-unplugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:49:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Instance destroyed successfully. 
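Note: the Acquiring/acquired/released lock entries above come from oslo.concurrency. Nova wraps do_terminate_instance in a per-instance lock named after the instance UUID, so concurrent operations on the same instance are serialized. A minimal sketch of that pattern, using an illustrative helper rather than nova's actual code:

    from oslo_concurrency import lockutils

    def terminate_instance(instance_uuid):
        # Serialize work on one instance behind a named in-process lock,
        # mirroring the 'Lock "<uuid>" acquired by ... do_terminate_instance' entries.
        @lockutils.synchronized(instance_uuid)
        def do_terminate_instance():
            pass  # destroy guest, unplug VIFs, deallocate network, drop allocations

        return do_terminate_instance()

The separate "<uuid>-events" lock seen alongside it guards the instance's external-event queue in the same way.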
Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.objects.instance [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lazy-loading 'resources' on Instance uuid 6b3b32af-2f00-44f3-8287-9ff8924e6db7 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:47:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-859460792',display_name='tempest-AttachVolumeTestJSON-server-859460792',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-859460792',id=24,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJJXTTiCCqoI0ZpbbGtiXNKugGYNLQVWGPKnospMR4za+DDU14IWv2r42pnKWdekiYYIuhfuRknSTBJe5tCEpMgXVtipIRysFAAA/08IDGr3VRsHCftgYb1Igz/Cq6OUfw==',key_name='tempest-keypair-396176061',keypairs=,launch_index=0,launched_at=2023-04-17T17:47:27Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cd3bfc1c102a47ff9efab5cb9a78021e',ramdisk_id='',reservation_id='r-np74jxa8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-2102743292',owner_user_name='tempest-AttachVolumeTestJSON-2102743292-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:47:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3d4aee23bae743f19bdf6f991e044587',uuid=6b3b32af-2f00-44f3-8287-9ff8924e6db7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": [{"address": "172.24.4.78", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converting VIF {"id": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "address": "fa:16:3e:8a:5e:2f", "network": {"id": "fbf3ce34-ee1a-433a-89ae-35e198d262a3", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2124537068-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.78", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cd3bfc1c102a47ff9efab5cb9a78021e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap482a7a99-8e", "ovs_interfaceid": "482a7a99-8edf-4f93-a747-ad53fa2779b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG os_vif [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap482a7a99-8e, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:14 user nova-compute[71628]: INFO os_vif [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:5e:2f,bridge_name='br-int',has_traffic_filtering=True,id=482a7a99-8edf-4f93-a747-ad53fa2779b6,network=Network(fbf3ce34-ee1a-433a-89ae-35e198d262a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap482a7a99-8e') Apr 17 17:49:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Deleting instance files /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7_del Apr 17 17:49:14 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Deletion of /opt/stack/data/nova/instances/6b3b32af-2f00-44f3-8287-9ff8924e6db7_del complete Apr 17 17:49:14 user nova-compute[71628]: INFO nova.compute.manager [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 17 17:49:14 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:49:14 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:49:15 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Took 0.93 seconds to deallocate network for instance. 
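Note: the Converting VIF, Unplugging vif, and Successfully unplugged vif sequence above is nova handing the port to the os-vif library; its ovs plugin then issues the DelPortCommand transaction logged by ovsdbapp. A rough sketch of driving os-vif directly, with field values copied from the log and everything else (plugin configuration, error handling) omitted; in a real deployment nova builds these objects itself:

    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins logged at service start

    inst = instance_info.InstanceInfo(
        uuid='6b3b32af-2f00-44f3-8287-9ff8924e6db7',
        name='tempest-AttachVolumeTestJSON-server-859460792')

    port = vif.VIFOpenVSwitch(
        id='482a7a99-8edf-4f93-a747-ad53fa2779b6',
        address='fa:16:3e:8a:5e:2f',
        bridge_name='br-int',
        vif_name='tap482a7a99-8e',
        plugin='ovs',
        network=network.Network(id='fbf3ce34-ee1a-433a-89ae-35e198d262a3',
                                bridge='br-int'))

    os_vif.unplug(port, inst)  # removes tap482a7a99-8e from br-int via the OVSDB connection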
Apr 17 17:49:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-d829cde4-679c-411e-a086-d40796619e7d req-d13e1d8b-659b-4197-8b72-9ec11db74fe1 service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-deleted-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:49:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:15 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Deleted allocations for instance 6b3b32af-2f00-44f3-8287-9ff8924e6db7 Apr 17 17:49:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-11f70476-1a20-4585-a657-79743c99a000 tempest-AttachVolumeTestJSON-2102743292 tempest-AttachVolumeTestJSON-2102743292-project-member] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.945s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] Acquiring lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:16 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] Lock "6b3b32af-2f00-44f3-8287-9ff8924e6db7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:16 user nova-compute[71628]: DEBUG nova.compute.manager [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] No waiting events found dispatching network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:49:16 user nova-compute[71628]: WARNING nova.compute.manager [req-b7740442-89f0-4de2-b731-670259a3262d req-6e59426f-83d4-46e8-bcdd-430f1b6dcf6d service nova] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Received unexpected event network-vif-plugged-482a7a99-8edf-4f93-a747-ad53fa2779b6 for instance with vm_state deleted and task_state None. 
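Note: the network-vif-deleted and late network-vif-plugged entries above are external instance events that Neutron posts to nova's os-server-external-events API; nova-compute pops any waiter registered for the event and, as here, warns when it arrives for an instance that is already deleted. A hedged sketch of the REST call a notifier makes, with the endpoint URL and token as placeholders and the payload fields taken from the events in the log:

    import requests

    NOVA_API = 'http://controller:8774/v2.1'   # placeholder endpoint
    TOKEN = '<keystone-token>'                 # placeholder credential

    payload = {'events': [{
        'name': 'network-vif-plugged',
        'server_uuid': '6b3b32af-2f00-44f3-8287-9ff8924e6db7',
        'tag': '482a7a99-8edf-4f93-a747-ad53fa2779b6',   # the Neutron port id
        'status': 'completed',
    }]}

    resp = requests.post(NOVA_API + '/os-server-external-events',
                         json=payload, headers={'X-Auth-Token': TOKEN})
    print(resp.status_code, resp.json())

In practice Neutron's nova notifier sends these through keystoneauth with service credentials; the sketch only illustrates the event body that produces the "Received event ..." lines.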
Apr 17 17:49:17 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:19 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:49:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:29 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:49:29 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] VM Stopped (Lifecycle Event) Apr 17 17:49:29 user nova-compute[71628]: DEBUG nova.compute.manager [None req-192dcc00-fbe0-4b8c-9d8a-9a8a4ae31cf1 None None] [instance: 6b3b32af-2f00-44f3-8287-9ff8924e6db7] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:49:29 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:33 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 17:49:33 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] There are 0 instances to clean {{(pid=71628) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 17:49:34 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:37 user nova-compute[71628]: DEBUG 
oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:38 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:38 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:49:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:39 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:49:40 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:49:40 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
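Note: the qemu-img info calls above are wrapped in oslo.concurrency's prlimit helper so a runaway qemu-img cannot exhaust the host during the periodic resource audit; --as=1073741824 caps address space at 1 GiB and --cpu=30 caps CPU seconds. A minimal sketch of the same invocation through processutils, reusing the disk path from the log:

    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1024 ** 3,  # --as=1073741824
                                        cpu_time=30)              # --cpu=30

    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0/disk',
        '--force-share', '--output=json',
        prlimit=limits)
    print(out)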
Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=9033MB free_disk=26.456920623779297GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 34582c99-56bf-44e5-adca-a9883318afa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:49:40 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:49:41 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:49:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:43 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:44 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:49:44 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:49:44 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:45 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:45 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:45 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Cleaning up deleted instances with incomplete migration {{(pid=71628) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71628) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:47 user nova-compute[71628]: INFO nova.compute.manager [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Terminating instance Apr 17 17:49:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-ab53dd94-e0b7-4a97-96cc-667af66363ee req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-unplugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ab53dd94-e0b7-4a97-96cc-667af66363ee req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ab53dd94-e0b7-4a97-96cc-667af66363ee req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-ab53dd94-e0b7-4a97-96cc-667af66363ee req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-ab53dd94-e0b7-4a97-96cc-667af66363ee 
req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] No waiting events found dispatching network-vif-unplugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.compute.manager [req-ab53dd94-e0b7-4a97-96cc-667af66363ee req-f20f446e-bfbf-47b5-b5c3-e34a594b50c5 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-unplugged-558be61b-7179-45ab-9796-160aa6bb3e86 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:49:48 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Instance destroyed successfully. Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.objects.instance [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lazy-loading 'resources' on Instance uuid 34582c99-56bf-44e5-adca-a9883318afa0 {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:44:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2116812969',display_name='tempest-ServerRescueNegativeTestJSON-server-2116812969',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2116812969',id=20,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T17:44:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c9cdf67684764421af28a1cd43efcf0b',ramdisk_id='',reservation_id='r-gp7vg9vz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-848169867',owner_user_name='tempest-ServerRescueNegativeTestJSON-848169867-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T17:44:19Z,user_
data=None,user_id='8d22aee4776b4ae89ca19af5ce976d18',uuid=34582c99-56bf-44e5-adca-a9883318afa0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converting VIF {"id": "558be61b-7179-45ab-9796-160aa6bb3e86", "address": "fa:16:3e:08:fa:96", "network": {"id": "fd8c8bf4-7a16-4afe-b04d-99b82336f56d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1857593591-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c9cdf67684764421af28a1cd43efcf0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap558be61b-71", "ovs_interfaceid": "558be61b-7179-45ab-9796-160aa6bb3e86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG os_vif [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap558be61b-71, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:48 user nova-compute[71628]: INFO os_vif [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:fa:96,bridge_name='br-int',has_traffic_filtering=True,id=558be61b-7179-45ab-9796-160aa6bb3e86,network=Network(fd8c8bf4-7a16-4afe-b04d-99b82336f56d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap558be61b-71') Apr 17 17:49:48 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Deleting instance files /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0_del Apr 17 17:49:48 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Deletion of /opt/stack/data/nova/instances/34582c99-56bf-44e5-adca-a9883318afa0_del complete Apr 17 17:49:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:48 user nova-compute[71628]: INFO nova.compute.manager [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:49:48 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Took 0.46 seconds to deallocate network for instance. Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:49:48 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:49:49 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:49 user nova-compute[71628]: INFO 
nova.scheduler.client.report [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Deleted allocations for instance 34582c99-56bf-44e5-adca-a9883318afa0 Apr 17 17:49:49 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-ecb845b0-6e24-4ff4-9dd8-166384d272d4 tempest-ServerRescueNegativeTestJSON-848169867 tempest-ServerRescueNegativeTestJSON-848169867-project-member] Lock "34582c99-56bf-44e5-adca-a9883318afa0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.446s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] Acquiring lock "34582c99-56bf-44e5-adca-a9883318afa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:49:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:49:50 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] Lock "34582c99-56bf-44e5-adca-a9883318afa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:49:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] No waiting events found dispatching network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:49:50 user nova-compute[71628]: WARNING nova.compute.manager [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received unexpected event network-vif-plugged-558be61b-7179-45ab-9796-160aa6bb3e86 for instance with vm_state deleted and task_state None. 
Apr 17 17:49:50 user nova-compute[71628]: DEBUG nova.compute.manager [req-517e19e3-0fa5-4fea-9ba0-83a9c4c2f8d1 req-f16c0b82-1c04-4f10-b19d-b74de2f5a025 service nova] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Received event network-vif-deleted-558be61b-7179-45ab-9796-160aa6bb3e86 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:49:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:56 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:57 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:49:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:50:03 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] VM Stopped (Lifecycle Event) Apr 17 17:50:03 user nova-compute[71628]: DEBUG nova.compute.manager [None req-00b73312-1b58-4f56-b3c4-24b23ce3b982 None None] [instance: 34582c99-56bf-44e5-adca-a9883318afa0] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 
17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:50:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:07 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Starting instance... 
{{(pid=71628) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:20 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71628) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 17:50:20 user nova-compute[71628]: INFO nova.compute.claims [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Claim successful on node user Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Start building networks asynchronously for instance. 
{{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Allocating IP information in the background. {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] allocate_for_instance() {{(pid=71628) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 17:50:21 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Start building block device mappings for instance. {{(pid=71628) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Start spawning the instance on the hypervisor. 
{{(pid=71628) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Creating instance directory {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 17:50:21 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Creating image(s) Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "/opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "/opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "/opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.128s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 
tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.125s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk 1073741824 {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062,backing_fmt=raw /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk 1073741824" returned: 0 in 0.046s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "d707d9baa21ef9cbafe179e13cf40c2bff580062" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.179s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG 
oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/d707d9baa21ef9cbafe179e13cf40c2bff580062 --force-share --output=json" returned: 0 in 0.129s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Checking if we can resize image /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk. size=1073741824 {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.policy [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3999f6aff774419daca6c25a18ee9af7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d9fa2be78c04fdeafdacb7af1c7ef80', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71628) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.virt.disk.api [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Cannot 
resize image /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk to a smaller size. {{(pid=71628) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.objects.instance [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lazy-loading 'migration_context' on Instance uuid 46e92122-765c-4ff1-9c13-66510c1221dc {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Created local disks {{(pid=71628) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Ensure instance console log exists: /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/console.log {{(pid=71628) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:21 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:22 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Successfully created port: fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Successfully updated port: fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 
tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquired lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Building network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-changed-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.compute.manager [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Refreshing instance network info cache due to event network-changed-fba079fc-43ec-4d3b-b81a-e581a865a4f6. {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] Acquiring lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Instance cache missing network info. 
{{(pid=71628) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Releasing lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Instance network_info: |[{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71628) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] Acquired lock 
"refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Refreshing network info cache for port fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Start _get_guest_xml network_info=[{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'encrypted': False, 'image_id': '82e42adf-a9f9-4d9b-9bd0-106a738b1690'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 17:50:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:50:23 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71628) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T17:30:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T17:29:49Z,direct_url=,disk_format='qcow2',id=82e42adf-a9f9-4d9b-9bd0-106a738b1690,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='e7c2c9ffef414a0fa710b03d426da849',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T17:29:51Z,virtual_size=,visibility=), allow threads: True {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Flavor limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Image limits 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Flavor pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Image pref 0:0:0 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71628) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71628) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Got 1 possible topologies {{(pid=71628) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.hardware [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71628) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-519140730',display_name='tempest-VolumesActionsTest-instance-519140730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-519140730',id=25,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d9fa2be78c04fdeafdacb7af1c7ef80',ramdisk_id='',reservation_id='r-16u2qn97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1894276307',owner_user_name='tempest-VolumesActionsTest-1894276307-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_a
t=2023-04-17T17:50:21Z,user_data=None,user_id='3999f6aff774419daca6c25a18ee9af7',uuid=46e92122-765c-4ff1-9c13-66510c1221dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71628) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converting VIF {"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.objects.instance [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lazy-loading 'pci_devices' on Instance uuid 46e92122-765c-4ff1-9c13-66510c1221dc {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG 
nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] End _get_guest_xml xml= [libvirt guest domain XML was logged here one syslog-prefixed line at a time, but its markup was stripped in this capture; surviving element text includes: 46e92122-765c-4ff1-9c13-66510c1221dc, instance-00000019, 131072, 1, tempest-VolumesActionsTest-instance-519140730, 2023-04-17 17:50:23, 128, 1, 0, 0, 1, tempest-VolumesActionsTest-1894276307-project-member, tempest-VolumesActionsTest-1894276307, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem]
Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: /dev/urandom Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: Apr 17 17:50:23 user nova-compute[71628]: {{(pid=71628) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-519140730',display_name='tempest-VolumesActionsTest-instance-519140730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-519140730',id=25,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d9fa2be78c04fdeafdacb7af1c7ef80',ramdisk_id='',reservation_id='r-16u2qn97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1894276307',owner_user_name='tempest-VolumesActionsTest-1894276307-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T17:50:21Z,user_data=None,user_id='3999f6aff774419daca6c25a18ee9af7',uuid=46e92122-765c-4ff1-9c13-66510c1221dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": 
"1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converting VIF {"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG os_vif [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') {{(pid=71628) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba079fc-43, may_exist=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfba079fc-43, col_values=(('external_ids', {'iface-id': 'fba079fc-43ec-4d3b-b81a-e581a865a4f6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:b9:e8', 'vm-uuid': '46e92122-765c-4ff1-9c13-66510c1221dc'}),)) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:23 user nova-compute[71628]: INFO os_vif [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] No BDM found with device name vda, not building metadata. {{(pid=71628) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] No VIF found with MAC fa:16:3e:f2:b9:e8, not building metadata {{(pid=71628) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updated VIF entry in instance network info cache for port fba079fc-43ec-4d3b-b81a-e581a865a4f6. 
{{(pid=71628) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG nova.network.neutron [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:50:23 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-a89dfc68-7060-4d23-b629-026748b52e56 req-a4c52922-34fa-40a6-ae7b-476105eab1ae service nova] Releasing lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:50:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:24 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:50:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:25 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:25 user nova-compute[71628]: DEBUG nova.compute.manager [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] No waiting events found dispatching network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:50:25 user nova-compute[71628]: WARNING nova.compute.manager [req-d2e2bddf-b631-47b2-8663-b83e3f8a22d1 req-78b06aeb-1175-4142-88d3-32372926f272 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received unexpected event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 for instance with vm_state building and task_state spawning. Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Resumed> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:50:26 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] VM Resumed (Lifecycle Event) Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Instance event wait completed in 0 seconds for {{(pid=71628) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Guest created on hypervisor {{(pid=71628) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 17:50:26 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Instance spawned successfully. 
Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:50:26 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_cdrom_bus of ide {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_disk_bus of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_input_bus of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_pointer_model of None {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_video_model of virtio {{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.libvirt.driver [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Found default for hw_vif_model of virtio 
{{(pid=71628) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 17:50:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.virt.driver [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] Emitting event Started> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:50:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] VM Started (Lifecycle Event) Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71628) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 17:50:27 user nova-compute[71628]: INFO nova.compute.manager [None req-f40108fe-88d6-4c8a-8bef-2ac91a1cd638 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 17:50:27 user nova-compute[71628]: INFO nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Took 5.74 seconds to spawn the instance on the hypervisor. Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:50:27 user nova-compute[71628]: INFO nova.compute.manager [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Took 6.24 seconds to build instance. 
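
Note on the AddBridgeCommand/AddPortCommand/DbSetCommand transactions logged during VIF plugging above: they are issued by the os-vif ovs plugin through ovsdbapp. A hedged approximation of the same work, issued directly against a local OVSDB, might look like the sketch below; the socket path is an assumption, the port and interface values are copied from the log, and the log actually shows the bridge and port commands in separate transactions.

    # Approximation of the ovsdbapp transaction seen above (AddPortCommand + DbSetCommand).
    # The OVSDB endpoint is an assumption; os-vif manages this connection internally.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed DevStack socket path

    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=5))

    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tapfba079fc-43', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tapfba079fc-43',
            ('external_ids', {'iface-id': 'fba079fc-43ec-4d3b-b81a-e581a865a4f6',
                              'iface-status': 'active',
                              'attached-mac': 'fa:16:3e:f2:b9:e8',
                              'vm-uuid': '46e92122-765c-4ff1-9c13-66510c1221dc'})))
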
Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG nova.compute.manager [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] No waiting events found dispatching network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:50:27 user nova-compute[71628]: WARNING nova.compute.manager [req-d478d065-92a8-46e4-969a-e8d54b7a2f17 req-90f5b890-ad20-453d-94d6-b8ecd194d346 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received unexpected event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 for instance with vm_state active and task_state None. 
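
Note on the recurring 'Acquiring lock ...', 'Lock "..." acquired by "..." :: waited' and ':: held' DEBUG lines (vgpu_resources, compute_resources, the per-instance "-events" lock above): they come from oslo.concurrency's lockutils helpers. A minimal illustration of the two usual forms follows; the function names and bodies are made up for illustration, not Nova code.

    # Minimal illustration of the oslo.concurrency locking behind the
    # 'Lock "..." acquired by "..." :: waited/held' lines in this log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs():
        # Runs with the named lock held; the decorator's wrapper emits the
        # 'acquired by ... :: waited' / '"released" by ... :: held' lines.
        return []

    def pop_instance_event(instance_uuid):
        # Context-manager form, comparable to the per-instance "<uuid>-events" locking above.
        with lockutils.lock(f'{instance_uuid}-events'):
            return None

    allocate_mdevs()
    pop_instance_event('46e92122-765c-4ff1-9c13-66510c1221dc')
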
Apr 17 17:50:27 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-28da9c80-2cd5-48d5-84db-c44bc0ba3928 tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.339s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:27 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:50:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:50:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:40 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG 
oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:50:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:50:41 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:50:41 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=8986MB free_disk=26.447296142578125GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": 
"1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:50:41 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 46e92122-765c-4ff1-9c13-66510c1221dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing inventories for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating ProviderTree inventory for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Updating inventory in ProviderTree for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing aggregate associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, aggregates: None {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Refreshing trait associations for resource provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058, traits: 
COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_MMX,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VMVGA,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE {{(pid=71628) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:50:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid 46e92122-765c-4ff1-9c13-66510c1221dc {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:50:46 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:50:47 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:50:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:57 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:50:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, 
sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:51:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:51:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:39 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:39 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:51:40 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.processutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71628) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:51:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=9089MB free_disk=26.445446014404297GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", 
"numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Instance 46e92122-765c-4ff1-9c13-66510c1221dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71628) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:51:42 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:51:43 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:51:43 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 17:51:43 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:51:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:44 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:45 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG oslo_service.periodic_task 
[None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquired lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Forcefully refreshing network info cache for instance {{(pid=71628) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG nova.objects.instance [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lazy-loading 'info_cache' on Instance uuid 46e92122-765c-4ff1-9c13-66510c1221dc {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG nova.network.neutron [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [{"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Releasing lock "refresh_cache-46e92122-765c-4ff1-9c13-66510c1221dc" {{(pid=71628) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG 
nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updated the network info_cache for instance {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 17:51:47 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:48 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:51:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:51:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:51:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:12 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:12 user nova-compute[71628]: 
DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:12 user nova-compute[71628]: INFO nova.compute.manager [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Terminating instance Apr 17 17:52:12 user nova-compute[71628]: DEBUG nova.compute.manager [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Start destroying the instance on the hypervisor. {{(pid=71628) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 17:52:12 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-vif-unplugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] No waiting events found dispatching network-vif-unplugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.compute.manager [req-5319220a-bfdf-4cab-bb2d-ccadada0d44e req-842100ed-1476-42f1-9ea9-feba80f128b6 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event 
network-vif-unplugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 for instance with task_state deleting. {{(pid=71628) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Instance destroyed successfully. Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.objects.instance [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lazy-loading 'resources' on Instance uuid 46e92122-765c-4ff1-9c13-66510c1221dc {{(pid=71628) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.virt.libvirt.vif [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T17:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-519140730',display_name='tempest-VolumesActionsTest-instance-519140730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-519140730',id=25,image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-17T17:50:27Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1d9fa2be78c04fdeafdacb7af1c7ef80',ramdisk_id='',reservation_id='r-16u2qn97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='82e42adf-a9f9-4d9b-9bd0-106a738b1690',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-1894276307',owner_user_name='tempest-VolumesActionsTest-1894276307-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-17T17:50:27Z,user_data=None,user_id='3999f6aff774419daca6c25a18ee9af7',uuid=46e92122-765c-4ff1-9c13-66510c1221dc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": 
"fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converting VIF {"id": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "address": "fa:16:3e:f2:b9:e8", "network": {"id": "5a6aa082-6619-4f08-be06-828e931c35d5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-731505046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "1d9fa2be78c04fdeafdacb7af1c7ef80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfba079fc-43", "ovs_interfaceid": "fba079fc-43ec-4d3b-b81a-e581a865a4f6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.network.os_vif_util [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') {{(pid=71628) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG os_vif [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') {{(pid=71628) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 21 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba079fc-43, bridge=br-int, if_exists=True) {{(pid=71628) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:13 user nova-compute[71628]: INFO os_vif [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b9:e8,bridge_name='br-int',has_traffic_filtering=True,id=fba079fc-43ec-4d3b-b81a-e581a865a4f6,network=Network(5a6aa082-6619-4f08-be06-828e931c35d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfba079fc-43') Apr 17 17:52:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Deleting instance files /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc_del Apr 17 17:52:13 user nova-compute[71628]: INFO nova.virt.libvirt.driver [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Deletion of /opt/stack/data/nova/instances/46e92122-765c-4ff1-9c13-66510c1221dc_del complete Apr 17 17:52:13 user nova-compute[71628]: INFO nova.compute.manager [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Took 0.91 seconds to destroy the instance on the hypervisor. Apr 17 17:52:13 user nova-compute[71628]: DEBUG oslo.service.loopingcall [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=71628) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.compute.manager [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Deallocating network for instance {{(pid=71628) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 17:52:13 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] deallocate_for_instance() {{(pid=71628) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.network.neutron [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:52:14 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Took 0.46 seconds to deallocate network for instance. Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-d892f02c-c768-4e19-a8c6-980c2f945a01 req-7eae49d1-6fc1-4c44-99c9-b3d6bcade54f service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-vif-deleted-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:52:14 user nova-compute[71628]: INFO nova.compute.manager [req-d892f02c-c768-4e19-a8c6-980c2f945a01 req-7eae49d1-6fc1-4c44-99c9-b3d6bcade54f service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Neutron deleted interface fba079fc-43ec-4d3b-b81a-e581a865a4f6; detaching it from the instance and deleting it from the info cache Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.network.neutron [req-d892f02c-c768-4e19-a8c6-980c2f945a01 req-7eae49d1-6fc1-4c44-99c9-b3d6bcade54f service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Updating instance_info_cache with network_info: [] {{(pid=71628) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.compute.manager [req-d892f02c-c768-4e19-a8c6-980c2f945a01 req-7eae49d1-6fc1-4c44-99c9-b3d6bcade54f service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Detach interface failed, port_id=fba079fc-43ec-4d3b-b81a-e581a865a4f6, reason: Instance 46e92122-765c-4ff1-9c13-66510c1221dc could not be found. 
{{(pid=71628) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:52:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:14 user nova-compute[71628]: INFO nova.scheduler.client.report [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Deleted allocations for instance 46e92122-765c-4ff1-9c13-66510c1221dc Apr 17 17:52:14 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-fcad061a-2e38-4d1b-b0d2-6d988319151a tempest-VolumesActionsTest-1894276307 tempest-VolumesActionsTest-1894276307-project-member] Lock "46e92122-765c-4ff1-9c13-66510c1221dc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.663s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628)
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 17:52:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] Acquiring lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:15 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] Lock "46e92122-765c-4ff1-9c13-66510c1221dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:15 user nova-compute[71628]: DEBUG nova.compute.manager [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] No waiting events found dispatching network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 {{(pid=71628) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 17:52:15 user nova-compute[71628]: WARNING nova.compute.manager [req-2389fe19-77c8-4243-8bc8-f83bf9b40e4b req-e3b28076-f917-4c6f-99ef-132fbf1532d4 service nova] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Received unexpected event network-vif-plugged-fba079fc-43ec-4d3b-b81a-e581a865a4f6 for instance with vm_state deleted and task_state None.
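
The DelPortCommand transaction recorded above is the os-vif ovs plugin removing the instance's tap device from br-int through its ovsdbapp connection, which is why the [POLLIN] wakeups and the 0-ms timeout follow immediately. Below is a minimal sketch, not Nova's code, of issuing the same removal by hand with ovsdbapp, assuming an ovsdb-server reachable at tcp:127.0.0.1:6640 (the endpoint the vlog lines poll); the port and bridge names are taken from the log, and the 10-second timeout is an arbitrary illustrative value.

    # Sketch only: delete an OVS port the way DelPortCommand in the log does.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Pull the Open_vSwitch schema from the server and build a connection.
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # if_exists=True makes the delete a no-op if the tap device is already
    # gone, matching DelPortCommand(..., if_exists=True) in the transaction log.
    api.del_port('tapfba079fc-43', bridge='br-int', if_exists=True).execute(
        check_error=True)
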
Apr 17 17:52:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:52:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:28 user nova-compute[71628]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71628) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 17:52:28 user nova-compute[71628]: INFO nova.compute.manager [-] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] VM Stopped (Lifecycle Event) Apr 17 17:52:28 user nova-compute[71628]: DEBUG nova.compute.manager [None req-deba6ea9-61b6-4afb-884e-3fc03c5f8cb6 None None] [instance: 46e92122-765c-4ff1-9c13-66510c1221dc] Checking state {{(pid=71628) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 17:52:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:52:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:38 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:40 user 
nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:41 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:41 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71628) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Auditing locally available compute resources for user (node: user) {{(pid=71628) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 17:52:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 17:52:42 user nova-compute[71628]: WARNING nova.virt.libvirt.driver [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Hypervisor/Node resource view: name=user free_ram=9174MB free_disk=26.46320343017578GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": 
"0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71628) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71628) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.provider_tree [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed in ProviderTree for provider: d1bd72d4-058c-4e3c-95bb-8ce522bd5058 {{(pid=71628) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.scheduler.client.report [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Inventory has not changed for provider d1bd72d4-058c-4e3c-95bb-8ce522bd5058 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71628) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 17:52:42 user nova-compute[71628]: DEBUG nova.compute.resource_tracker [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Compute_service record updated for user:user {{(pid=71628) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 
17:52:42 user nova-compute[71628]: DEBUG oslo_concurrency.lockutils [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s {{(pid=71628) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 17:52:43 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:45 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:45 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:48 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Starting heal instance info cache {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 17:52:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Rebuilding the list of instances to heal {{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 17:52:48 user nova-compute[71628]: DEBUG nova.compute.manager [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Didn't find any instances for network info cache update. 
{{(pid=71628) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 17:52:48 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:52:50 user nova-compute[71628]: DEBUG oslo_service.periodic_task [None req-b4314203-3e1f-4646-b4a4-5e83146bf025 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71628) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 17:52:53 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:52:58 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:53:03 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:53:08 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:11 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:13 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:53:18 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:23 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 
{{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71628) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71628) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 17:53:28 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 17:53:33 user nova-compute[71628]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71628) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
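
The "Running periodic task ComputeManager._..." entries above are emitted by oslo_service's periodic-task machinery each time a registered task comes due. A minimal sketch of that pattern follows; DemoManager, _poll_something and the 10-second spacing are made-up stand-ins for illustration, not Nova's actual ComputeManager tasks.

    # Sketch only: the oslo_service periodic-task pattern behind those DEBUG lines.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF([], project='demo')  # parse no CLI args so option defaults are readable


    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Invoked by run_periodic_tasks() once its spacing has elapsed;
            # oslo_service logs "Running periodic task ..." before each call.
            pass


    manager = DemoManager()
    manager.run_periodic_tasks(context=None)  # normally driven by a looping timer
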