Refactor: Separate Pipeline Management into Serverless Application #424

Merged
Commits (29)

- 305ca55 Refactor: Pipeline Management State Machine (StewartW)
- 6d1451b Updating documentation (StewartW)
- 2a7709c Merge branch 'master' of github.com:awslabs/aws-deployment-framework … (StewartW)
- 9e52c9a Merge branch 'master' of github.com:awslabs/aws-deployment-framework … (StewartW)
- 6050abe Fixing yamllint errors (StewartW)
- 7f9d1c9 final tox issues: (StewartW)
- b3b7a10 Update docs/technical-guide.md (StewartW)
- e3dae9b Fixing some issues with testing (StewartW)
- 1eb4088 Merge branch 'refactor/pipeline-management' of github.com:StewartW/aw… (StewartW)
- 9fd1626 Enabling create repo based on config value (StewartW)
- 88e6b0f Code Review Suggestions (StewartW)
- 434b2e3 Merge branch 'master' of github.com:awslabs/aws-deployment-framework … (StewartW)
- aef4b72 Copy Paste Error (StewartW)
- 1a8d7ea removing packaged.yml and fixing trailing spaces (StewartW)
- 29a4fee Missing default scm config value (StewartW)
- 9394231 Merge branch 'master' of github.com:awslabs/aws-deployment-framework … (StewartW)
- 56d45ec Apply suggestions from code review (StewartW)
- 0115d29 Apply suggestions from code review (StewartW)
- b9bbc9e Changing permissions of src/lambda_codebase/initial_commit/bootstrap_… (StewartW)
- 0fc5b50 Merge branch 'refactor/pipeline-management' of github.com:StewartW/aw… (StewartW)
- 7950409 Passing thru log level param (StewartW)
- 9e11f20 Restricting IAM permissions to prefix (StewartW)
- 612855e Updating Function names (StewartW)
- d62332d Apply suggestions from code review (sbkok)
- a0e442d Code Review Suggestions (StewartW)
- e7c7254 Merge branch 'refactor/pipeline-management' of github.com:StewartW/aw… (StewartW)
- f3dafaa removing condition (StewartW)
- 8f7323c Add wave generation test from #484 (sbkok)
- b46750f Make use of Temp.Directory to fetch the deployment map files (sbkok)
docs/technical-guide.md

```diff
@@ -1,8 +1,24 @@
-## Technical Guide
-### Introduction
+# Technical Guide
+## Introduction
 This document is intended to give insight into how the AWS Deployment Framework works under the hood.

-### High Level Overview - AWS Deployment Framework Bootstrap Repository
+## High Level Overview - AWS Deployment Framework Bootstrap Repository
 The AWS Deployment Framework Bootstrap Repository aka "Bootstrap Repo" is where the source code used by ADF lives. The bootstrap repo is also where your accounts, OU layout and base templates are defined.
 The flow below is a high level overview of what happens when a change is committed to this repository.
 [diagram image]
+
+### Account Management State Machine
+The Account Management State Machine is triggered by S3 PUT events to the ADF Accounts bucket.
+Below is a diagram detailing the components of the standard state machine. This state machine is defined in `src/account_processing.yml` and the Lambda function code is located in `src/lambda_codebase/account_processing`.
+[diagram image]
+
+## High Level Overview - AWS Deployment Framework Pipeline Repository
+The AWS Deployment Framework Pipeline Repository aka "Pipeline Repo" is where the deployment map definitions live. It typically exists in CodeCommit within your Deployment Account(s).
+The diagram below details what happens when a commit is pushed to this repository.
+[diagram image]
+
+### Pipeline Management State Machine
+The Pipeline Management State Machine is triggered by S3 PUT events to the ADF Pipelines bucket. This state machine is responsible for expanding the deployment map, resolving the targets, creating pipeline definitions (JSON objects that detail the source(s), the stages involved, and the targets), and then generating CDK stacks from those definitions.
+
+It additionally covers the deletion of stale pipelines. A stale pipeline is any pipeline that has a definition but does not exist in a deployment map.
```
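To make the stale-pipeline rule concrete, here is a minimal sketch (illustrative only, not ADF's actual implementation; the function and variable names are invented) of identifying pipelines that have a stored definition but no longer appear in any deployment map:

```python
# Hypothetical sketch: a pipeline is "stale" when it has a definition
# but is absent from every deployment map.

def find_stale_pipelines(defined_pipelines, deployment_map_pipelines):
    """Return pipeline names with a definition but no deployment map entry."""
    return sorted(set(defined_pipelines) - set(deployment_map_pipelines))


if __name__ == "__main__":
    existing = ["app-one", "app-two", "legacy-app"]
    in_map = ["app-one", "app-two"]
    print(find_stale_pipelines(existing, in_map))  # ['legacy-app']
```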
...ory/adf-bootstrap/deployment/lambda_codebase/pipeline_management/create_or_update_rule.py (new file, 60 additions)
```python
"""
Pipeline Management Lambda Function
Creates or Updates an Event Rule for forwarding events
if the source account != the Deployment account
"""

import os
import boto3

from cache import Cache
from rule import Rule
from logger import configure_logger
from cloudwatch import ADFMetrics


LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
PIPELINE_MANAGEMENT_STATEMACHINE = os.getenv("PIPELINE_MANAGEMENT_STATEMACHINE_ARN")
CLOUDWATCH = boto3.client("cloudwatch")
METRICS = ADFMetrics(CLOUDWATCH, "PIPELINE_MANAGEMENT/RULE")

_cache = None


def lambda_handler(pipeline, _):
    """Main Lambda Entry point"""

    # pylint: disable=W0603
    # Global variable here to cache across lambda execution runtimes.
    global _cache
    if not _cache:
        _cache = Cache()
        METRICS.put_metric_data(
            {"MetricName": "CacheInitalised", "Value": 1, "Unit": "Count"}
        )

    LOGGER.info(pipeline)

    _source_account_id = (
        pipeline.get("default_providers", {})
        .get("source", {})
        .get("properties", {})
        .get("account_id", {})
    )
    if (
        _source_account_id
        and int(_source_account_id) != int(DEPLOYMENT_ACCOUNT_ID)
        and not _cache.check(_source_account_id)
    ):
        rule = Rule(pipeline["default_providers"]["source"]["properties"]["account_id"])
        rule.create_update()
        _cache.add(
            pipeline["default_providers"]["source"]["properties"]["account_id"], True
        )
        METRICS.put_metric_data(
            {"MetricName": "CreateOrUpdate", "Value": 1, "Unit": "Count"}
        )

    return pipeline
```
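The module-level `_cache` survives warm Lambda invocations, so the event rule is created or updated at most once per source account per container. The `Cache` helper is imported from ADF's shared code; as a rough illustration of the interface the handler relies on (`check` and `add`), a minimal in-memory version might look like this (hypothetical sketch, not the actual shared module):

```python
# Hypothetical sketch of the Cache interface used above (check/add).
# The real helper ships with ADF; this only illustrates the contract.

class Cache:
    def __init__(self):
        self._store = {}

    def check(self, key):
        """Return the cached value for key, or None when absent."""
        return self._store.get(key)

    def add(self, key, value):
        """Store value under key for the lifetime of this container."""
        self._store[key] = value
```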
...ository/adf-bootstrap/deployment/lambda_codebase/pipeline_management/create_repository.py (new file, 56 additions)
```python
"""
Pipeline Management Lambda Function
Creates or Updates a CodeCommit Repository
"""

import os
import boto3
from repo import Repo

from logger import configure_logger
from cloudwatch import ADFMetrics
from parameter_store import ParameterStore


CLOUDWATCH = boto3.client("cloudwatch")
METRICS = ADFMetrics(CLOUDWATCH, "PIPELINE_MANAGEMENT/REPO")
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]


def lambda_handler(pipeline, _):
    """Main Lambda Entry point"""
    parameter_store = ParameterStore(DEPLOYMENT_ACCOUNT_REGION, boto3)
    auto_create_repositories = parameter_store.fetch_parameter(
        "auto_create_repositories"
    )
    LOGGER.info(auto_create_repositories)
    if auto_create_repositories == "enabled":
        code_account_id = (
            pipeline.get("default_providers", {})
            .get("source", {})
            .get("properties", {})
            .get("account_id", {})
        )
        has_custom_repo = (
            pipeline.get("default_providers", {})
            .get("source", {})
            .get("properties", {})
            .get("repository", {})
        )
        if (
            auto_create_repositories
            and code_account_id
            and str(code_account_id).isdigit()
            and not has_custom_repo
        ):
            repo = Repo(
                code_account_id, pipeline.get("name"), pipeline.get("description")
            )
            repo.create_update()
            METRICS.put_metric_data(
                {"MetricName": "CreateOrUpdate", "Value": 1, "Unit": "Count"}
            )

    return pipeline
```
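For a feel of the input these handlers receive, here is a hedged example of a pipeline definition event; the shape mirrors the keys the handlers above read (`default_providers.source.properties`), but every field value is invented for illustration:

```python
# Illustrative event only: values are made up, keys match what the
# handlers above read. A repository would be auto-created here because
# account_id is numeric and no custom "repository" property is set.
sample_pipeline = {
    "name": "sample-app",
    "description": "Sample application pipeline",
    "default_providers": {
        "source": {
            "provider": "codecommit",
            "properties": {
                "account_id": "111111111111",
            },
        },
    },
    "targets": ["/business-unit/production"],
}

# Each Lambda in the state machine receives the pipeline object and
# returns it (possibly enriched), so the steps compose:
# create_repository.lambda_handler(sample_pipeline, None)
```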
.../adf-bootstrap/deployment/lambda_codebase/pipeline_management/generate_pipeline_inputs.py (new file, 117 additions)
```python
"""
Pipeline Management Lambda Function
Generates Pipeline Inputs
"""

import os
import boto3

from pipeline import Pipeline
from target import Target, TargetStructure
from organizations import Organizations
from parameter_store import ParameterStore
from sts import STS
from logger import configure_logger
from partition import get_partition


LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
ROOT_ACCOUNT_ID = os.environ["ROOT_ACCOUNT_ID"]


def store_regional_parameter_config(pipeline, parameter_store):
    """
    Responsible for storing the region information for specific
    pipelines. These regions are defined in the deployment_map
    either as top level regions for a pipeline or stage specific regions.
    """
    if pipeline.top_level_regions:
        parameter_store.put_parameter(
            f"/deployment/{pipeline.name}/regions",
            str(list(set(pipeline.top_level_regions))),
        )
        return

    parameter_store.put_parameter(
        f"/deployment/{pipeline.name}/regions",
        str(list(set(Pipeline.flatten_list(pipeline.stage_regions)))),
    )


def fetch_required_ssm_params(regions):
    output = {}
    for region in regions:
        parameter_store = ParameterStore(region, boto3)
        output[region] = {
            "s3": parameter_store.fetch_parameter(
                f"/cross_region/s3_regional_bucket/{region}"
            ),
            "kms": parameter_store.fetch_parameter(f"/cross_region/kms_arn/{region}"),
        }
        if region == DEPLOYMENT_ACCOUNT_REGION:
            output[region]["modules"] = parameter_store.fetch_parameter(
                "deployment_account_bucket"
            )
    output['default_scm_branch'] = parameter_store.fetch_parameter('default_scm_branch')
    return output


def generate_pipeline_inputs(pipeline, organizations, parameter_store):
    data = {}
    pipeline_object = Pipeline(pipeline)
    regions = []
    for target in pipeline.get("targets", []):
        target_structure = TargetStructure(target)
        for step in target_structure.target:
            regions = step.get(
                "regions", pipeline.get("regions", DEPLOYMENT_ACCOUNT_REGION)
            )
            paths_tags = []
            for path in step.get("path", []):
                paths_tags.append(path)
            if step.get("tags") is not None:
                paths_tags.append(step.get("tags", {}))
            for path_or_tag in paths_tags:
                pipeline_object.stage_regions.append(regions)
                pipeline_target = Target(
                    path_or_tag, target_structure, organizations, step, regions
                )
                pipeline_target.fetch_accounts_for_target()
            # Targets should be a list of lists.

            # Note: This is a big shift away from how ADF handles targets
            # natively. Previously this would be a list of [account_id(s)];
            # it now returns a list of [[account_ids], [account_ids]].
            # For the sake of consistency, think of a target as consisting
            # of multiple "waves": any reference to a wave going forward
            # means an individual batch of account ids.
            pipeline_object.template_dictionary["targets"].append(
                list(target_structure.generate_waves()),
            )

        if DEPLOYMENT_ACCOUNT_REGION not in regions:
            pipeline_object.stage_regions.append(DEPLOYMENT_ACCOUNT_REGION)

    pipeline_object.generate_input()
    data["ssm_params"] = fetch_required_ssm_params(
        pipeline_object.input["regions"] or [DEPLOYMENT_ACCOUNT_REGION]
    )
    data["input"] = pipeline_object.input
    data['input']['default_scm_branch'] = data["ssm_params"].get('default_scm_branch')
    store_regional_parameter_config(pipeline_object, parameter_store)
    return data


def lambda_handler(pipeline, _):
    """Main Lambda Entry point"""
    parameter_store = ParameterStore(DEPLOYMENT_ACCOUNT_REGION, boto3)
    sts = STS()
    role = sts.assume_cross_account_role(
        f'arn:{get_partition(DEPLOYMENT_ACCOUNT_REGION)}:iam::{ROOT_ACCOUNT_ID}:role/'
        f'{parameter_store.fetch_parameter("cross_account_access_role")}-readonly',
        "pipeline",
    )
    organizations = Organizations(role)

    output = generate_pipeline_inputs(pipeline, organizations, parameter_store)

    return output
```
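The comment in `generate_pipeline_inputs` introduces "waves": targets are now a list of lists, where each inner list is one batch of account ids deployed together. A hedged sketch of how a flat target list could be chunked into waves (the real `TargetStructure.generate_waves` carries per-target metadata as well; this only shows the shape):

```python
# Hedged sketch of the "waves" idea: chunk a flat list of account ids
# into fixed-size batches. Only illustrates the nested-list shape.

def chunk_into_waves(account_ids, wave_size=50):
    """Yield consecutive batches ("waves") of at most wave_size accounts."""
    for index in range(0, len(account_ids), wave_size):
        yield account_ids[index:index + wave_size]


accounts = [f"{n:012d}" for n in range(1, 6)]
print(list(chunk_into_waves(accounts, wave_size=2)))
# [['000000000001', '000000000002'],
#  ['000000000003', '000000000004'],
#  ['000000000005']]
```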