cloudwatch-metrics-s3-bucket

Get CloudWatch metrics and store them in an S3 bucket

Fetch CloudWatch metrics through the AWS CLI and upload them to S3 automatically

Use case scenario :

Customers often want CloudWatch metrics/logs to analyze AWS resource usage and optimize accordingly. There are three ways one can use CloudWatch metrics/logs :


1. Get CloudWatch ‘metrics’ (CPUUtilization, DiskReadWrites, NetworkIn, NetworkOut, etc., and custom metrics).

2. Use or monitor system ‘logs’/’logstreams’ to analyze Instance Usage.

3. Perform Cost/Resource Optimization, based on metrics.

There are two ways you can do this: first via the AWS CLI, and second via the Query API.

Using the AWS CLI to get resource-based statistics per metric for a time frame.

Step 1 : Setup AWS CLI on your EC2 instance

If you have already configured the AWS CLI, you can skip this step, or refer to the link below to configure it.

https://docs.aws.amazon.com/cli/latest/userguide/installing.html
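Once installed, the CLI needs credentials before any command will work. As a quick sketch (the key values and region here are placeholders, not real credentials), running `aws configure` populates two files:

```
# ~/.aws/credentials -- placeholder values, use your IAM user's keys
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
region = us-east-1
output = json
```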

Step 2 : Create IAM user

I would suggest creating a new user with a custom policy attached to it; this will ensure the IAM user can only access specific AWS resources. In this scenario, fetching metrics needs only two permissions :

"cloudwatch:GetMetricStatistics"
"cloudwatch:ListMetrics"

(The upload in Step 4 will additionally need S3 write access on your bucket, e.g. "s3:PutObject".)

You can refer this link to create custom managed policy :

https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_managed-policies.html
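As a sketch of what that custom policy could look like (the policy name and file path are just examples), you can write the document locally and check that it is valid JSON before attaching it via the IAM console or `aws iam create-policy`:

```shell
# Write a minimal custom policy document (name and path are illustrative)
cat > /tmp/cloudwatch-readonly-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cloudwatch:GetMetricStatistics",
        "cloudwatch:ListMetrics"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Sanity-check the JSON before uploading it
python3 -m json.tool /tmp/cloudwatch-readonly-policy.json > /dev/null && echo "policy JSON OK"

# Then (assuming the CLI is configured with sufficient IAM rights):
# aws iam create-policy --policy-name cloudwatch-readonly \
#   --policy-document file:///tmp/cloudwatch-readonly-policy.json
```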

Step 3 : Fetch and store CloudWatch metrics

Type the following command with your instance ID :

aws cloudwatch get-metric-statistics --metric-name CPUUtilization \
--start-time 2014-02-18T23:18:00 --end-time 2014-02-19T23:18:00 --period 360 \
--namespace AWS/EC2 --statistics Maximum \
--dimensions Name=InstanceId,Value=<your-instance-id> >> cloudwatchmetrics.txt

Examples for the variables:

<metricname> : CPUUtilization
<starttime>  : 2016-07-18T00:00:00
<endtime>    : 2016-07-25T00:00:00

Here you can change parameters as per requirement :

<metricname>   : Enter your desired metric name; I have used CPUUtilization by default.

--period : Interval in seconds at which metrics are aggregated (e.g. 5 minutes = 300 seconds; the example above uses 360 seconds).

<starttime> <endtime>  : Enter your preferred date and time (the difference should not be more than 14 days).
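Rather than hard-coding timestamps, the time window can be computed at run time. A minimal sketch (the instance ID is a placeholder, and GNU `date -d` is assumed, as on Amazon Linux):

```shell
#!/bin/bash
# Illustrative parameters -- replace with your own values
METRIC="CPUUtilization"
INSTANCE_ID="<your-instance-id>"
PERIOD=300                       # 5-minute granularity

# Compute a rolling 24-hour window in the ISO format CloudWatch expects
END=$(date -u +%Y-%m-%dT%H:%M:%S)
START=$(date -u -d '24 hours ago' +%Y-%m-%dT%H:%M:%S)
echo "Fetching $METRIC from $START to $END"

# The actual call (needs configured credentials, so shown commented out):
# aws cloudwatch get-metric-statistics --namespace AWS/EC2 \
#   --metric-name "$METRIC" --dimensions Name=InstanceId,Value="$INSTANCE_ID" \
#   --statistics Maximum --start-time "$START" --end-time "$END" \
#   --period "$PERIOD" >> cloudwatchmetrics.txt
```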

Step 4 : Create script s3upload.sh to upload file cloudwatchmetrics.txt to S3 bucket

Copy-paste this content into s3upload.sh :

#!/bin/bash

# Fetch and store metrics
aws cloudwatch get-metric-statistics --namespace AWS/EC2 --metric-name CPUUtilization --dimensions Name=InstanceId,Value=i-0XXXXX5 --statistics Maximum --start-time 2017-08-14T23:18:00 --end-time 2017-08-16T23:18:00 --period 360 >> cloudwatchmetrics.txt

# Upload file to S3
aws s3 cp cloudwatchmetrics.txt s3://<bucketname>/cloudwatchmetrics.txt

Go to /etc/crontab, paste the following line, and save it :

55 22 * * * root  bash /<path-to-s3upload>/s3upload.sh

This will run the script every night at 10:55 pm UTC; you can change it to your preferred time.

You can also fetch metrics using the Query API :

To get the CPU utilization per hour for an EC2 instance for a 3-day range

  • Call GetMetricStatistics with the following parameters:
    • MetricName = CPUUtilization
    • Period = 3600
    • Statistics list includes Maximum
    • Dimensions (Name=InstanceId, Value="<your-instance-id>")
    • Namespace = AWS/EC2
    • StartTime = 2011-01-09T23:18:00
    • EndTime = 2011-01-12T23:18:00
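Those parameters map directly onto the Query API's HTTP interface. As a rough sketch (the regional endpoint and instance ID are illustrative, and a real request must also be signed with SigV4 credentials), the unsigned query string would look like:

```shell
#!/bin/bash
# Build the GetMetricStatistics query string from the parameters above
# (unsigned -- a real request also needs SigV4 auth headers)
ENDPOINT="https://monitoring.us-east-1.amazonaws.com/"
PARAMS="Action=GetMetricStatistics&Version=2010-08-01"
PARAMS="$PARAMS&Namespace=AWS%2FEC2&MetricName=CPUUtilization&Period=3600"
PARAMS="$PARAMS&Statistics.member.1=Maximum"
PARAMS="$PARAMS&Dimensions.member.1.Name=InstanceId&Dimensions.member.1.Value=i-0XXXXX5"
PARAMS="$PARAMS&StartTime=2011-01-09T23:18:00&EndTime=2011-01-12T23:18:00"
echo "${ENDPOINT}?${PARAMS}"
```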

 

That’s it, you have successfully configured an automated task to upload CloudWatch metrics logs to an S3 bucket.

Assuming that you’ve gone through this blog post, I would like to take away the pain of repeating the same steps. Here is a ready-made script that will help you automate the whole process; please refer to the link below :

https://github.com/bhargavamin/automatic-cloudwatch-metrics.git

PS : I do keep writing scripts, so do keep an eye on my GitHub account for updates as well 🙂

Hope this helps!

Feel free to mail your queries to mail@bhargavamin.com; you can also comment your views below.

Thanks

-Bhargav