This is a bash script made to work with Nagios that automatically checks Amazon AWS status
based on the RSS feeds from http://status.aws.amazon.com/
I did find something similar written in Ruby,
but I tried to keep this as simple as possible…
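As a minimal sketch of the idea (the feed path and the "operating normally" wording are assumptions here; adjust them for the service you monitor), the check boils down to pulling the latest item title from the RSS and mapping it to a Nagios exit code:

```shell
#!/bin/bash
# Hypothetical sketch: map the newest item title of an AWS status RSS feed
# to a Nagios status line and exit code. The feed path and message wording
# are assumptions, not taken from the original script.
FEED="http://status.aws.amazon.com/rss/s3-us-standard.rss"

rss_latest_title() {  # pull the first <item> <title> out of an RSS payload
  grep -o '<title>[^<]*</title>' | sed -n '2p' | sed -e 's/<[^>]*>//g'
}

nagios_state() {      # title text -> Nagios output and return code
  case "$1" in
    *"operating normally"*) echo "OK - $1";                 return 0 ;;
    "")                     echo "UNKNOWN - no data";       return 3 ;;
    *)                      echo "CRITICAL - $1";           return 2 ;;
  esac
}

# usage (e.g. from the Nagios command definition):
#   latest=$(curl -s --max-time 10 "$FEED" | rss_latest_title)
#   nagios_state "$latest"; exit $?
```

The second `<title>` is taken because the first one belongs to the channel itself, not to the newest item.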
S3 is a great place to keep log data or even backups.
Monthly base cost:
S3: US$ 0.02 per GB (with Reduced Redundancy).
Glacier: only US$ 0.01 per GB.
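As a quick worked example of those prices (500 GB is just an arbitrary figure for illustration):

```shell
#!/bin/bash
# Worked example of the prices above: monthly cost for 500 GB of logs on
# S3 Reduced Redundancy ($0.02/GB) versus Glacier ($0.01/GB).
GB=500
awk -v gb="$GB" \
  'BEGIN { printf "S3 RRS: $%.2f  Glacier: $%.2f\n", gb * 0.02, gb * 0.01 }'
```

For 500 GB that comes out to $10.00 on S3 RRS versus $5.00 on Glacier.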
This is a robust script that backs up all instances that have the tag Backup=TRUE.
It generates an AMI image and a snapshot of each volume, and sends a nice e-mail with a nicely formatted HTML table.
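A hypothetical sketch of the selection and imaging step, written against the modern AWS CLI rather than the older EC2 API tools the original script used (the tag name matches the description above; the AMI naming helper is my own invention):

```shell
#!/bin/bash
# Hypothetical sketch: find instances tagged Backup=TRUE and create a dated
# AMI for each. Assumes the modern "aws" CLI, not the original EC2 API tools.

ami_name() {  # pure helper: dated, unique AMI name for an instance ID
  echo "backup-$1-$(date +%Y%m%d)"
}

if command -v aws >/dev/null 2>&1; then
  for id in $(aws ec2 describe-instances \
      --filters "Name=tag:Backup,Values=TRUE" \
      --query "Reservations[].Instances[].InstanceId" --output text); do
    aws ec2 create-image --instance-id "$id" \
      --name "$(ami_name "$id")" --no-reboot
  done
fi
```

The `--no-reboot` flag keeps the instance running while the image is taken, at the cost of filesystem consistency.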
Today I tested the amazing Lambda and was able to automatically process images, generating thumbnails just by copying an image into an S3 bucket.
I wrote this script a while back (it still uses an old version of the CLI).
I found a minor bug: non-integer (decimal) values cause a problem,
because bash does not handle them very well.
Anyone who feels like updating and fixing it, please do so.
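To illustrate the bug and one possible fix (a sketch, not the original script's code): bash's built-in arithmetic is integer-only, so comparing decimal values has to be delegated to an external tool such as awk or bc:

```shell
#!/bin/bash
# bash integer arithmetic chokes on decimals: (( 1.5 > 1 )) is a syntax
# error. Delegating the comparison to awk handles floats correctly.

is_above() {  # usage: is_above VALUE THRESHOLD -> exit 0 if VALUE > THRESHOLD
  awk -v v="$1" -v t="$2" 'BEGIN { exit !(v > t) }'
}

value="1.5"
threshold="1"
if is_above "$value" "$threshold"; then
  echo "above threshold"
else
  echo "at or below threshold"
fi
```

The same trick works with `bc`, but awk is guaranteed by POSIX and avoids an extra dependency.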
This script counts how many active instances of an AWS Elastic Load Balancer (ELB) are InService.
The code is based on a simple API call from bash.
To execute the various API command-line tools you will need to set up and configure authentication correctly.
This will help you do just that.
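A minimal sketch of that call with the modern AWS CLI (the original used the standalone ELB API tools; `my-elb` is a placeholder load balancer name):

```shell
#!/bin/bash
# Hypothetical sketch: count the InService instances behind a classic ELB.
# "my-elb" is a placeholder; credentials must already be configured.

count_in_service() {  # pure helper: count "InService" occurrences on stdin
  grep -o 'InService' | wc -l | tr -d ' '
}

if command -v aws >/dev/null 2>&1; then
  n=$(aws elb describe-instance-health --load-balancer-name "my-elb" \
        --query "InstanceStates[].State" --output text | count_in_service)
  echo "Instances InService: $n"
fi
```

That count is the number an auto-scaling check would compare against its desired minimum.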
This can help if you are building any kind of auto scaling on Amazon AWS.
It is not 100% automatic; prior to running it you will need to have:
AWS Free Usage Tier (Per Month):
- 750 hours of Amazon EC2 Linux Micro Instance usage (613 MB of memory and 32-bit and 64-bit platform support) – enough hours to run continuously each month*
- 750 hours of Amazon EC2 Microsoft Windows Server Micro Instance usage (613 MB of memory and 32-bit and 64-bit platform support) – enough hours to run continuously each month*
- 750 hours of an Elastic Load Balancer plus 15 GB data processing*
- 30 GB of Amazon Elastic Block Storage, plus 2 million I/Os and 1 GB of snapshot storage*
- 5 GB of Amazon S3 standard storage, 20,000 Get Requests, and 2,000 Put Requests*
- 100 MB of storage, 5 units of write capacity, and 10 units of read capacity for Amazon DynamoDB.**
- 25 Amazon SimpleDB Machine Hours and 1 GB of Storage**
- 1,000 Amazon SWF workflow executions can be initiated for free. A total of 10,000 activity tasks, signals, timers and markers, and 30,000 workflow-days can also be used for free**
- 100,000 Requests of Amazon Simple Queue Service**
- 100,000 Requests, 100,000 HTTP notifications and 1,000 email notifications for Amazon Simple Notification Service**
- 10 Amazon CloudWatch metrics, 10 alarms, and 1,000,000 API requests**
- 15 GB of bandwidth out aggregated across all AWS services*
- 750 hours of Amazon RDS for SQL Server Micro DB Instance usage (running SQL Server Express Edition in a single Availability Zone) – enough hours to run a DB Instance continuously each month
- 20 GB of database storage
- 10 million I/Os
- 20 GB of backup storage for your automated database backups and any user-initiated DB Snapshots