Create the on-demand AWS Lambda function that retrieves the AWS Organizations information, extracts the required parts, and writes them to our bucket in Amazon S3.
Enter the following details:
Click Create function
Copy and paste the following code into the Function code section, replacing (account id) with your Management Account ID and (Region) with the Region you are deploying in:
#!/usr/bin/env python3
import os

import boto3
from botocore.client import Config


def list_accounts():
    bucket = os.environ["BUCKET_NAME"]  # The function reads your S3 bucket name from this environment variable
    sts_connection = boto3.client('sts')
    acct_b = sts_connection.assume_role(
        RoleArn="arn:aws:iam::(account id):role/OrganizationLambdaAccessRole",
        RoleSessionName="cross_acct_lambda"
    )
    ACCESS_KEY = acct_b['Credentials']['AccessKeyId']
    SECRET_KEY = acct_b['Credentials']['SecretAccessKey']
    SESSION_TOKEN = acct_b['Credentials']['SessionToken']

    # Create a service client using the assumed role credentials
    client = boto3.client(
        "organizations",
        region_name="us-east-1",  # The Organizations client MUST use us-east-1, regardless of the Region your Lambda runs in
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
        aws_session_token=SESSION_TOKEN,
    )
    paginator = client.get_paginator("list_accounts")  # Paginate in case the list of accounts is large
    response_iterator = paginator.paginate()

    with open('/tmp/org.csv', 'w') as f:  # Save to the Lambda's temporary folder
        for response in response_iterator:  # Extract the needed fields
            for account in response["Accounts"]:
                aid = account["Id"]
                name = account["Name"]
                time = account["JoinedTimestamp"]
                status = account["Status"]
                line = "%s, %s, %s, %s\n" % (aid, name, time, status)
                f.write(line)
    print("response gathered")

    try:
        s3 = boto3.client('s3', '(Region)',
                          config=Config(s3={'addressing_style': 'path'}))
        s3.upload_file(
            '/tmp/org.csv', bucket, "organisation-data/org.csv")  # Upload the data file to S3
        print("org data in s3")
    except Exception as e:
        print(e)


def lambda_handler(event, context):
    list_accounts()
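If you want to confirm the upload outside the Lambda console later, the short sketch below downloads the CSV the function wrote and prints the first few rows. It is a minimal check, not part of the lab steps: it assumes you run it with credentials that can read the bucket, and (your bucket name) is a placeholder for the bucket you created earlier.

import boto3

# Download the file the Lambda wrote and print the first rows.
# "(your bucket name)" is a placeholder for the bucket created earlier in this lab.
s3 = boto3.client("s3")
s3.download_file("(your bucket name)", "organisation-data/org.csv", "org.csv")

with open("org.csv") as f:
    for _ in range(5):
        row = f.readline()
        if not row:
            break
        print(row.rstrip())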
Edit Basic settings below:
Add an environment variable named BUCKET_NAME, with the name of the S3 bucket you created earlier as its value:
Click Save
Click Test (the default test event is fine; the function ignores the event contents)
The function will run; it may take a minute or two given the size of the Organizations data and the processing required, then return success. Click Details and verify there is headroom in the configured resources and duration to allow for any increase in the Organizations data over time:
We will set up an Amazon CloudWatch Events rule to periodically run the Lambda function; this will refresh the Organizations data and pick up any newly created accounts (a scripted alternative is sketched after the steps below).
For the Event Source
Click Configure details
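If you prefer to script this schedule instead of clicking through the console, here is a minimal sketch using boto3. The rule name org-data-weekly and the function name OrgDataFunction are hypothetical placeholders; the rate(7 days) expression matches the schedule this lab configures.

import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Look up the ARN of the function created above.
# "OrgDataFunction" is a placeholder; use the name you gave your function.
function_arn = lambda_client.get_function(
    FunctionName="OrgDataFunction")["Configuration"]["FunctionArn"]

# Rule that fires every 7 days
rule = events.put_rule(
    Name="org-data-weekly",
    ScheduleExpression="rate(7 days)",
    State="ENABLED",
)

# Allow CloudWatch Events to invoke the function
lambda_client.add_permission(
    FunctionName="OrgDataFunction",
    StatementId="org-data-weekly-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)

# Point the rule at the Lambda function
events.put_targets(
    Rule="org-data-weekly",
    Targets=[{"Id": "1", "Arn": function_arn}],
)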
You have now created your Lambda function to gather your organization data and place it into the S3 bucket we made earlier. Using CloudWatch Events, it will now run every 7 days and update the data.
Lab complete!

Now that you have completed this lab, make sure to update your Well-Architected review if you have implemented these changes in your workload.
Click here to access the Well-Architected Tool