Merged
12 changes: 9 additions & 3 deletions ansible/config_sonic_basedon_testbed.yml
@@ -69,7 +69,7 @@
    set_fact:
      dual_tor_facts: {}
    when: "'dualtor' not in topo"

  - name: gather dual ToR information
    dual_tor_facts: hostname="{{ inventory_hostname }}" testbed_facts="{{ testbed_facts }}" hostvars="{{ hostvars }}" vm_config="{{ vm_topo_config }}"
    delegate_to: localhost
@@ -78,7 +78,7 @@
  - name: generate y_cable simulator driver
    include_tasks: dualtor/config_y_cable_simulator.yml
    when: "'dualtor' in topo"

  - name: set default vm file path
    set_fact:
      vm_file: veos
@@ -121,7 +121,7 @@
    when: "('host_interfaces_by_dut' in vm_topo_config) and ('tor' in vm_topo_config['dut_type'] | lower)"

  - name: find any tunnel configurations
    tunnel_config:
      vm_topo_config: "{{ vm_topo_config }}"
      tunnel_config: "{{ tunnel_config | default(None) }}"
    delegate_to: localhost
@@ -334,4 +334,10 @@
      become: true
      shell: config save -y
      when: save is defined and save|bool == true

    - name: cleanup cached facts
      shell: python ../tests/common/cache/facts_cache.py {{ inventory_hostname }}
      delegate_to: localhost
      ignore_errors: true

  when: deploy is defined and deploy|bool == true
4 changes: 4 additions & 0 deletions tests/common/cache/__init__.py
@@ -0,0 +1,4 @@
from .facts_cache import FactsCache
from .facts_cache import cached

__all__ = ["FactsCache", "cached"]
98 changes: 98 additions & 0 deletions tests/common/cache/facts_cache.md
@@ -0,0 +1,98 @@
# Facts Cache

To run test scripts, we frequently need to gather facts from various devices, again and again. Most facts gathering runs commands on remote devices over an SSH connection, parses the command output and returns the results. Much of the time, the facts to be gathered are unchanged, like the DUT HWSKU, platform, etc. By caching such rarely changing facts for quicker access, we save the overhead of gathering them every time and improve the overall time required for running all the tests.

# Cache Design

To simplify the design, we use local (sonic-mgmt container) json files to cache information. Although reading from a local file is slower than reading from memory, it is still much faster than running commands on a remote host over an SSH connection and parsing the output. A dedicated folder (by default `tests/_cache`) stores the cached json files, grouped into sub-folders by hostname. For example, file `tests/_cache/vlab-01/basic_facts.json` caches some basic facts of host `vlab-01`.
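Concretely, the path of a cached facts file is derived from the cache root, the hostname and the facts name. A minimal sketch (the helper name `cache_file_path` is illustrative, not part of the real code, which builds the path inline):

```python
import os

def cache_file_path(cache_root, hostname, key):
    """Build the path of a cached facts file: <cache_root>/<hostname>/<key>.json."""
    return os.path.join(cache_root, hostname, "{}.json".format(key))

print(cache_file_path("tests/_cache", "vlab-01", "basic_facts"))
```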

The cache function is mainly implemented in below file:
```
sonic-mgmt/tests/common/cache/facts_cache.py
```

A singleton class `FactsCache` is implemented. This class supports these interfaces:
* `read(self, hostname, key)`
* `write(self, hostname, key, value)`
* `cleanup(self, hostname=None)`

The `FactsCache` class holds the cached facts in an in-memory dictionary. When the `read` method is called, it first reads `self._cache[hostname][key]` from memory. If the facts are not found there, it tries to load the json file. If anything is wrong with the json file, it returns an empty dictionary.

When the `write` method is called, it stores the facts in memory as `self._cache[hostname][key] = value`, then also tries to dump the facts to the json file `tests/_cache/<hostname>/<key>.json`.
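The two-tier behavior can be sketched in a few lines. This is an illustrative model, not the real `FactsCache`: the singleton metaclass, usage limits and logging are omitted, and `MiniFactsCache` is a made-up name.

```python
import json
import os
import tempfile
from collections import defaultdict


class MiniFactsCache(object):
    """Runnable sketch of the two-tier read/write described above."""

    def __init__(self, cache_location):
        self._cache_location = cache_location
        self._cache = defaultdict(dict)

    def write(self, hostname, key, value):
        # Store in memory first, then persist to <cache_location>/<hostname>/<key>.json.
        self._cache[hostname][key] = value
        host_folder = os.path.join(self._cache_location, hostname)
        if not os.path.exists(host_folder):
            os.makedirs(host_folder)
        with open(os.path.join(host_folder, key + '.json'), 'w') as f:
            json.dump(value, f, indent=2)

    def read(self, hostname, key):
        # Memory first, json file second, an empty dict as the fallback.
        if key in self._cache[hostname]:
            return self._cache[hostname][key]
        try:
            with open(os.path.join(self._cache_location, hostname, key + '.json')) as f:
                self._cache[hostname][key] = json.load(f)
            return self._cache[hostname][key]
        except (IOError, ValueError):
            return {}


root = tempfile.mkdtemp()
cache = MiniFactsCache(root)
cache.write('vlab-01', 'basic_facts', {'hwsku': 'Force10-S6000'})
print(cache.read('vlab-01', 'basic_facts'))  # {'hwsku': 'Force10-S6000'}
```

A second instance pointed at the same folder (standing in for a later test session) serves the facts from the json file without touching the remote host.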

# Clean up facts

The `cleanup` function removes the stored json files.

When the `facts_cache.py` script is executed directly with an argument, it calls the `cleanup` function to remove the stored json files for the host specified by the first argument. When it is executed without an argument, all the stored json files are removed.

When `testbed-cli.sh deploy-mg` is executed for a testbed, the ansible playbook also runs `facts_cache.py` to remove the stored json files for the current testbed.
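The cleanup semantics can be sketched as follows; `cleanup` here is an illustrative stand-alone function, not the actual method:

```python
import os
import shutil
import tempfile

def cleanup(cache_location, hostname=None):
    """Sketch of the cleanup semantics: drop one host's sub-folder, or everything."""
    if hostname:
        host_folder = os.path.join(cache_location, hostname)
        if os.path.isdir(host_folder):
            shutil.rmtree(host_folder)
    else:
        shutil.rmtree(cache_location)

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'vlab-01'))
os.makedirs(os.path.join(root, 'vlab-02'))
cleanup(root, 'vlab-01')          # only vlab-01's cached files are removed
print(sorted(os.listdir(root)))   # ['vlab-02']
```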

# Use cache

There are two ways to use the cache function.

## Use decorator `facts_cache.py::cached`

```
from tests.common.cache import cached

class SonicHost(AnsibleHostBase):

    ...

    @cached(name='basic_facts')
    def _gather_facts(self):
        ...
```

The `cached` decorator takes a `name` argument which corresponds to the `key` argument of `read(self, hostname, key)` and `write(self, hostname, key, value)`.
The `cached` decorator can only be used on a bound method of a class that is a subclass of AnsibleHostBase.
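The restriction exists because the decorator needs a string `hostname` attribute on the bound object (`args[0]`) to file the facts under the right cache sub-folder. A minimal, runnable sketch of just that check (the cache reads and writes are omitted; `cached_sketch` and `FakeHost` are made-up names):

```python
def cached_sketch(name):
    """Sketch of the `cached` decorator's hostname check; cache I/O omitted."""
    def decorator(target):
        def wrapper(*args, **kwargs):
            # args[0] is `self`; it must carry a string hostname so the
            # facts can be filed under tests/_cache/<hostname>/<name>.json.
            hostname = getattr(args[0], 'hostname', None)
            if not hostname or not isinstance(hostname, str):
                raise Exception('Decorator is only applicable to bound methods '
                                'of AnsibleHostBase sub-classes')
            return target(*args, **kwargs)
        return wrapper
    return decorator


class FakeHost(object):
    # Made-up stand-in for an AnsibleHostBase subclass.
    hostname = 'vlab-01'

    @cached_sketch(name='basic_facts')
    def gather(self):
        return {'platform': 'x86_64-kvm'}


print(FakeHost().gather())  # {'platform': 'x86_64-kvm'}
```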

## Explicitly use FactsCache

* Import FactsCache and grab the cache instance

```
from tests.common.cache import FactsCache

cache = FactsCache()
```

* Use code like the following

```
def get_some_facts(self, *args):
    cached_facts = cache.read(self.hostname, 'some_facts')
    if cached_facts:
        return cached_facts

    # Code to gather the facts from the host.
    facts = self._do_stuff_to_gather_facts()
    cache.write(self.hostname, 'some_facts', facts)
    return facts
```

# Cached facts lifecycle in nightly test

* During the `testbed-cli.sh deploy-mg` step of testbed deployment, all cached json files of the current DUT are removed.
* Use `pytest test_script1.py test_script2.py` to run one set of test scripts.
  * First encounter of cache-enabled facts:
    * No cache in memory.
    * No cache in json file.
    * Gather from the remote host.
    * Store in memory.
    * Store in json file.
    * Return the facts.
  * Subsequent encounters of cache-enabled facts:
    * Cache hit in memory, read from memory. Return the facts.
* Use `pytest test_script3.py test_script4.py` to run another set of test scripts.
  * First encounter of cache-enabled facts:
    * No cache in memory.
    * Cache hit in json file. Load from the json file.
    * Store in memory.
    * Return the facts.
  * Subsequent encounters of cache-enabled facts:
    * Cache hit in memory, read from memory. Return the facts.
172 changes: 172 additions & 0 deletions tests/common/cache/facts_cache.py
@@ -0,0 +1,172 @@
from __future__ import print_function, division, absolute_import

import logging
import json
import os
import shutil
import sys

from collections import defaultdict
from threading import Lock

from six import with_metaclass

logger = logging.getLogger(__name__)

CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
CACHE_LOCATION = os.path.join(CURRENT_DIR, '../../_cache')

SIZE_LIMIT = 1000000000  # 1G bytes, max disk usage allowed by cache
ENTRY_LIMIT = 1000000    # Max number of json files allowed in cache.


class Singleton(type):

    _instances = {}
    _lock = Lock()

    def __call__(cls, *args, **kwargs):
        with cls._lock:
            if cls not in cls._instances:
                instance = super(Singleton, cls).__call__(*args, **kwargs)
                cls._instances[cls] = instance
        return cls._instances[cls]


class FactsCache(with_metaclass(Singleton, object)):
    """Singleton class for reading facts from and writing facts to a local cache.

    Uses the singleton design pattern; only a single instance of this class can be initialized.
    """
    def __init__(self, cache_location=CACHE_LOCATION):
        self._cache_location = os.path.abspath(cache_location)
        self._cache = defaultdict(dict)

    def _check_usage(self):
        """Check cache usage, raise an exception if usage exceeds the limitations."""
        total_size = 0
        total_entries = 0
        for root, _, files in os.walk(self._cache_location):
            for f in files:
                fp = os.path.join(root, f)
                total_size += os.path.getsize(fp)
                total_entries += 1

        if total_size > SIZE_LIMIT or total_entries > ENTRY_LIMIT:
            msg = 'Cache usage exceeds limitations. total_size={}, SIZE_LIMIT={}, total_entries={}, ENTRY_LIMIT={}' \
                .format(total_size, SIZE_LIMIT, total_entries, ENTRY_LIMIT)
            raise Exception(msg)

    def read(self, hostname, key):
        """Read cached facts.

        Args:
            hostname (str): Hostname.
            key (str): Name of the cached facts.

        Returns:
            obj: Cached object, usually a dictionary.
        """
        # Lazy load
        if hostname in self._cache and key in self._cache[hostname]:
            logger.info('Read cached facts "{}.{}"'.format(hostname, key))
            return self._cache[hostname][key]
        else:
            facts_file = os.path.join(self._cache_location, '{}/{}.json'.format(hostname, key))
            try:
                with open(facts_file) as f:
                    self._cache[hostname][key] = json.load(f)
                logger.info('Loaded cached facts "{}.{}" from {}'.format(hostname, key, facts_file))
                return self._cache[hostname][key]
            except (IOError, ValueError) as e:
                logger.error('Load json file "{}" failed with exception: {}'
                             .format(os.path.abspath(facts_file), repr(e)))
                return {}

    def write(self, hostname, key, value):
        """Store facts to the cache.

        Args:
            hostname (str): Hostname.
            key (str): Name of the cached facts.
            value (obj): Value of the cached facts, usually a dictionary.

        Returns:
            boolean: Whether caching the facts succeeded.
        """
        self._check_usage()
        facts_file = os.path.join(self._cache_location, '{}/{}.json'.format(hostname, key))
        try:
            host_folder = os.path.join(self._cache_location, hostname)
            if not os.path.exists(host_folder):
                logger.info('Create cache dir {}'.format(host_folder))
                os.makedirs(host_folder)

            with open(facts_file, 'w') as f:
                json.dump(value, f, indent=2)
            self._cache[hostname][key] = value
            logger.info('Cached facts "{}.{}" under {}'.format(hostname, key, host_folder))
            return True
        except (IOError, ValueError) as e:
            logger.error('Dump json file "{}" failed with exception: {}'.format(facts_file, repr(e)))
            return False

    def cleanup(self, hostname=None):
        """Cleanup cached json files.

        Args:
            hostname (str, optional): Hostname. Defaults to None. When None, all cached files are removed.
        """
        if hostname:
            host_folder = os.path.join(self._cache_location, hostname)
            if os.path.isdir(host_folder):
                logger.info('Clean up cached facts under "{}"'.format(host_folder))
                shutil.rmtree(host_folder)
            else:
                logger.error('Sub-folder for host "{}" is not found'.format(hostname))
        else:
            logger.info('Clean up all cached facts under "{}"'.format(self._cache_location))
            shutil.rmtree(self._cache_location)


def cached(name):
    """Decorator for enabling caching of facts.

    The cached facts are stored as <name>.json. Because the cached json files must be stored under a
    sub-folder for each host, this decorator can only be used on a bound method of a class that is a
    subclass of AnsibleHostBase.

    Args:
        name (str): Name of the cached facts.

    Returns:
        function: Decorator function.
    """
    cache = FactsCache()

    def decorator(target):
        def wrapper(*args, **kwargs):
            hostname = getattr(args[0], 'hostname', None)
            if not hostname or not isinstance(hostname, str):
                raise Exception('Decorator is only applicable to bound methods of class '
                                'AnsibleHostBase and its sub-classes')
            cached_facts = cache.read(hostname, name)
            if cached_facts:
                return cached_facts
            else:
                facts = target(*args, **kwargs)
                cache.write(hostname, name, facts)
                return facts
        return wrapper
    return decorator


if __name__ == '__main__':
    cache = FactsCache()
    if len(sys.argv) == 2:
        hostname = sys.argv[1]
    else:
        hostname = None
    cache.cleanup(hostname)