[debug dump util] Base Skeleton and Click Class added #1668
Merged
Changes from all commits (29 commits)
e2a2011
Base Skeleton and CLI toplevl added
vivekrnv b5caed2
Multi Asic Support and UT's Added
vivekrnv 02868ac
Minor Changes
vivekrnv 8bcb06a
Added entrypoint in setup.py
vivekrnv 7bf4b86
Techsupport helper option added
vivekrnv 96aace4
mock DB updated
vivekrnv 75eb09c
Merge Conflict Resolved
vivekrnv 3b0c88b
Excluded paths in DeepDiff to make the test robust
vivekrnv 595396c
Merge branch 'master' of https://github.com/vivekreddynv/sonic-utilit…
vivekrnv d229f3b
setup.py update
vivekrnv 60a78da
state_db mock fixed and test updated
vivekrnv 8374b94
bash autocompletion code added
vivekrnv 73bd445
redis_match name change updated
vivekrnv f1ce8b2
Final Changes before review made
vivekrnv c62e769
Comments addressed and no-split option added
vivekrnv 2e19586
Removed the --no-split option
vivekrnv 5a70a5b
Removed split option and minor changes to formattig
vivekrnv 5778b14
Comments Addressed
vivekrnv efb1d59
Minor Test Issue fixed
vivekrnv 5820972
Moved mock files to a diff dir
vivekrnv 624e722
Merge branch 'master' of https://github.com/vivekreddynv/sonic-utilit…
vivekrnv 4c3b775
Command Ref updated
vivekrnv 0eb278a
Merge branch 'master' of https://github.com/Azure/sonic-utilities int…
vivekrnv 164a2ee
Unified the Redis Connectivity to RedisSource Class
vivekrnv 93ff843
HELP string and sonic_py_common updated
vivekrnv 9cdf76e
PEP-8 issues handled
vivekrnv bf4ea75
Final pep-8 changes
vivekrnv 82be3c4
Cached vidtorid extract and made connpool optim related changes
vivekrnv 4266484
Minor Change
vivekrnv
The new Python module implementing the `dump state` CLI:

```python
import os
import sys
import json
import re
import click
from tabulate import tabulate
from sonic_py_common import multi_asic
from utilities_common.constants import DEFAULT_NAMESPACE
from dump.match_infra import RedisSource, JsonSource, ConnectionPool
from dump import plugins
```
```python
# Autocompletion Helper
def get_available_modules(ctx, args, incomplete):
    return [k for k in plugins.dump_modules.keys() if incomplete in k]
```
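Note that `incomplete in k` is a substring match, so partially typed text anywhere in a module name completes it, not just prefixes. A standalone sketch of the same filter (the module names here are made up for illustration):

```python
def complete_modules(registry, incomplete):
    # Keep every registered name containing the partially typed word;
    # the shell passes incomplete="" when nothing has been typed yet.
    return [name for name in registry if incomplete in name]

modules = ["port", "portchannel", "vlan"]
print(complete_modules(modules, "chan"))  # substring, not prefix, match
print(complete_modules(modules, ""))      # everything completes
```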
```python
# Display Modules Callback
def show_modules(ctx, param, value):
    if not value or ctx.resilient_parsing:
        return
    header = ["Module", "Identifier"]
    display = []
    for mod in plugins.dump_modules:
        display.append((mod, plugins.dump_modules[mod].ARG_NAME))
    click.echo(tabulate(display, header))
    ctx.exit()
```
```python
@click.group()
def dump():
    pass
```
```python
@dump.command()
@click.pass_context
@click.argument('module', required=True, type=str, autocompletion=get_available_modules)
@click.argument('identifier', required=True, type=str)
@click.option('--show', '-s', is_flag=True, default=False, expose_value=False,
              callback=show_modules, help='Display Modules Available', is_eager=True)
@click.option('--db', '-d', multiple=True,
              help='Only dump from these Databases or the CONFIG_FILE')
@click.option('--table', '-t', is_flag=True, default=False,
              help='Print in tabular format', show_default=True)
@click.option('--key-map', '-k', is_flag=True, default=False, show_default=True,
              help="Only fetch the keys matched, don't extract field-value dumps")
@click.option('--verbose', '-v', is_flag=True, default=False, show_default=True,
              help="Prints any intermediate output to stdout, useful for dev & troubleshooting")
@click.option('--namespace', '-n', default=DEFAULT_NAMESPACE, type=str,
              show_default=True, help='Dump the redis-state for this namespace.')
def state(ctx, module, identifier, db, table, key_map, verbose, namespace):
    """
    Dump the current state of the identifier for the specified module from Redis DB or CONFIG_FILE
    """
    if not multi_asic.is_multi_asic() and namespace != DEFAULT_NAMESPACE:
        click.echo("Namespace option is not valid for a single-ASIC device")
        ctx.exit()

    if multi_asic.is_multi_asic() and (namespace != DEFAULT_NAMESPACE and
                                       namespace not in multi_asic.get_namespace_list()):
        click.echo("Namespace option is not valid. Choose one of {}".format(multi_asic.get_namespace_list()))
        ctx.exit()

    if module not in plugins.dump_modules:
        click.echo("No Matching Plugin has been Implemented")
        ctx.exit()

    if verbose:
        os.environ["VERBOSE"] = "1"
    else:
        os.environ["VERBOSE"] = "0"

    ctx.module = module
    obj = plugins.dump_modules[module]()

    if identifier == "all":
        ids = obj.get_all_args(namespace)
    else:
        ids = identifier.split(",")

    params = {}
    collected_info = {}
    params['namespace'] = namespace
    for arg in ids:
        params[plugins.dump_modules[module].ARG_NAME] = arg
        collected_info[arg] = obj.execute(params)

    if len(db) > 0:
        collected_info = filter_out_dbs(db, collected_info)

    vidtorid = extract_rid(collected_info, namespace)

    if not key_map:
        collected_info = populate_fv(collected_info, module, namespace)

    for arg in vidtorid.keys():
        collected_info[arg]["ASIC_DB"]["vidtorid"] = vidtorid[arg]

    print_dump(collected_info, table, module, identifier, key_map)

    return
```
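The command above fans a comma-separated identifier list (or the expansion of `all`) out into one `execute()` call per identifier, keyed by the plugin's `ARG_NAME`. A minimal sketch of that loop, using a hypothetical stub in place of a real dump plugin (nothing here touches Redis):

```python
class StubModule:
    """Hypothetical stand-in for a dump plugin; real plugins query Redis."""
    ARG_NAME = "name"  # illustrative key; each real plugin defines its own

    def execute(self, params):
        # Echo back the request instead of collecting real state
        return {"requested": params[self.ARG_NAME], "ns": params["namespace"]}

def collect(identifier, namespace=""):
    obj = StubModule()
    ids = identifier.split(",")  # "all" would call obj.get_all_args() instead
    collected_info = {}
    for arg in ids:
        params = {"namespace": namespace, StubModule.ARG_NAME: arg}
        collected_info[arg] = obj.execute(params)
    return collected_info

print(collect("Ethernet0,Ethernet4"))
```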
```python
def extract_rid(info, ns):
    r = RedisSource(ConnectionPool())
    r.connect("ASIC_DB", ns)
    vidtorid = {}
    vid_cache = {}  # Cache entries to reduce the number of Redis calls
    for arg in info.keys():
        mp = get_v_r_map(r, info[arg], vid_cache)
        if mp:
            vidtorid[arg] = mp
    return vidtorid
```
```python
def get_v_r_map(r, single_dict, vid_cache):
    v_r_map = {}
    # Raw string, so "\w" reaches the regex engine as a character class
    asic_obj_ptrn = r"ASIC_STATE:.*:oid:0x\w{1,14}"

    if "ASIC_DB" in single_dict and "keys" in single_dict["ASIC_DB"]:
        for redis_key in single_dict["ASIC_DB"]["keys"]:
            if re.match(asic_obj_ptrn, redis_key):
                matches = re.findall(r"oid:0x\w{1,14}", redis_key)
                if matches:
                    vid = matches[0]
                    if vid in vid_cache:
                        rid = vid_cache[vid]
                    else:
                        rid = r.hget("ASIC_DB", "VIDTORID", vid)
                        vid_cache[vid] = rid
                    v_r_map[vid] = rid if rid else "Real ID Not Found"
    return v_r_map
```
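Two details above are worth isolating: the regex that recognizes ASIC_DB object keys, and the cache that keeps repeated VIDs from triggering repeated Redis lookups. A self-contained sketch, with a fake fetch function standing in for the `r.hget("ASIC_DB", "VIDTORID", vid)` call:

```python
import re

ASIC_OBJ_PTRN = r"ASIC_STATE:.*:oid:0x\w{1,14}"

def lookup_rid(vid, vid_cache, fetch):
    # Memoize: fetch each VID at most once per cache lifetime
    if vid in vid_cache:
        return vid_cache[vid]
    rid = fetch(vid)
    vid_cache[vid] = rid
    return rid

calls = []
def fake_fetch(vid):
    # Stand-in for the real Redis hget; records how often it is hit
    calls.append(vid)
    return "oid:0x2aa"

redis_key = "ASIC_STATE:SAI_OBJECT_TYPE_PORT:oid:0x1000000000012"
assert re.match(ASIC_OBJ_PTRN, redis_key)
vid = re.findall(r"oid:0x\w{1,14}", redis_key)[0]

cache = {}
lookup_rid(vid, cache, fake_fetch)
lookup_rid(vid, cache, fake_fetch)
print(len(calls))  # 1: the second lookup was served from the cache
```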
```python
# Filter dbs which are not required
def filter_out_dbs(db_list, collected_info):
    args_ = list(collected_info.keys())
    for arg in args_:
        dbs = list(collected_info[arg].keys())
        for db in dbs:
            if db not in db_list:
                del collected_info[arg][db]
    return collected_info
```
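To illustrate the pruning above, here is the same logic applied to a small hand-built result (the identifier and DB names are examples only):

```python
def filter_dbs(db_list, collected_info):
    # Remove every DB section the user did not ask for via --db;
    # iterate over list() copies because we delete while iterating.
    for arg in list(collected_info):
        for db in list(collected_info[arg]):
            if db not in db_list:
                del collected_info[arg][db]
    return collected_info

info = {"Ethernet0": {"CONFIG_DB": {"keys": []},
                      "APPL_DB": {"keys": []},
                      "ASIC_DB": {"keys": []}}}
print(filter_dbs(["ASIC_DB"], info))  # only the ASIC_DB section survives
```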
```python
def populate_fv(info, module, namespace):
    all_dbs = set()
    for arg in info.keys():
        for db_name in info[arg].keys():
            all_dbs.add(db_name)

    db_cfg_file = JsonSource()
    db_conn = ConnectionPool().initialize_connector(namespace)
    for db_name in all_dbs:
        if db_name == "CONFIG_FILE":  # "==", not "is": compare values, not identity
            db_cfg_file.connect(plugins.dump_modules[module].CONFIG_FILE, namespace)
        else:
            db_conn.connect(db_name)

    final_info = {}
    for arg in info.keys():
        final_info[arg] = {}
        for db_name in info[arg].keys():
            final_info[arg][db_name] = {}
            final_info[arg][db_name]["keys"] = []
            final_info[arg][db_name]["tables_not_found"] = info[arg][db_name]["tables_not_found"]
            for key in info[arg][db_name]["keys"]:
                if db_name == "CONFIG_FILE":
                    # The draft referenced an undefined db_dict here; the
                    # JsonSource connected above appears to be the intended source.
                    fv = db_cfg_file.get(db_name, key)
                else:
                    fv = db_conn.get_all(db_name, key)
                final_info[arg][db_name]["keys"].append({key: fv})

    return final_info
```
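One Python pitfall that bites in functions like this: `is` tests object identity, not value equality, and CPython's interning of compile-time literals can make identity comparisons look correct in tests until a string is built at runtime. String comparisons should always use `==`:

```python
a = "".join(["CONFIG", "_FILE"])  # built at runtime, so not interned
b = "CONFIG_FILE"

print(a == b)   # True: same characters
print(a is b)   # False on CPython: two distinct objects
```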
```python
def get_dict_str(key_obj):
    table = []
    for pair in key_obj.items():
        table.append(list(pair))
    return tabulate(table, headers=["field", "value"], tablefmt="psql")
```
```python
# Render the collected info, either as indented JSON or as nested tables
def print_dump(collected_info, table, module, identifier, key_map):
    if not table:
        click.echo(json.dumps(collected_info, indent=4))
        return

    top_header = [plugins.dump_modules[module].ARG_NAME, "DB_NAME", "DUMP"]
    final_collection = []
    for ids in collected_info.keys():
        for db in collected_info[ids].keys():
            total_info = ""

            if collected_info[ids][db]["tables_not_found"]:
                tabulate_fmt = []
                for tab in collected_info[ids][db]["tables_not_found"]:
                    tabulate_fmt.append([tab])
                total_info += tabulate(tabulate_fmt, ["Tables Not Found"], tablefmt="grid")
                total_info += "\n"

            if not key_map:
                values = []
                hdrs = ["Keys", "field-value pairs"]
                for key_obj in collected_info[ids][db]["keys"]:
                    if isinstance(key_obj, dict) and key_obj:
                        key = list(key_obj.keys())[0]
                        values.append([key, get_dict_str(key_obj[key])])
                total_info += str(tabulate(values, hdrs, tablefmt="grid"))
            else:
                temp = []
                for key_ in collected_info[ids][db]["keys"]:
                    temp.append([key_])
                total_info += str(tabulate(temp, headers=["Keys Collected"], tablefmt="grid"))

            total_info += "\n"
            if "vidtorid" in collected_info[ids][db]:
                temp = []
                for pair in collected_info[ids][db]["vidtorid"].items():
                    temp.append(list(pair))
                total_info += str(tabulate(temp, headers=["vid", "rid"], tablefmt="grid"))
            final_collection.append([ids, db, total_info])

    click.echo(tabulate(final_collection, top_header, tablefmt="grid"))
    return


if __name__ == '__main__':
    dump()
```
The PR also adds a bash completion script for the new `dump` command:
```bash
_dump_completion() {
    COMPREPLY=( $( env COMP_WORDS="${COMP_WORDS[*]}" \
                   COMP_CWORD=$COMP_CWORD \
                   _DUMP_COMPLETE=complete $1 ) )
    return 0
}

complete -F _dump_completion -o default dump
```
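Click-generated completion works by re-invoking the program with `_DUMP_COMPLETE=complete` plus the shell's `COMP_WORDS`/`COMP_CWORD`, so the `dump` program itself prints the candidates. To activate it, the function above just needs to be sourced into an interactive shell, for example from `~/.bashrc` (the path below is illustrative, not the one this PR installs to):

```shell
# Load the completion function and its "complete" registration at startup;
# adjust the path to wherever this script is actually shipped.
source /usr/share/bash-completion/dump_completion.sh
```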