hseera / aws-python-utilities

License: Apache-2.0
Python utilities for AWS related tasks.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to aws-python-utilities

Watchdoginspector
Shows your current framerate (fps) in the status bar of your iOS app
Stars: ✭ 497 (+1361.76%)
Mutual labels:  performance-analysis, performance-monitoring
Pcm
Processor Counter Monitor
Stars: ✭ 1,240 (+3547.06%)
Mutual labels:  performance-analysis, performance-monitoring
Websu
Website Speed and Performance Optimization and monitoring
Stars: ✭ 37 (+8.82%)
Mutual labels:  performance-analysis, performance-monitoring
compile-time-perf
Measures high-level timing and memory usage metrics during compilation
Stars: ✭ 64 (+88.24%)
Mutual labels:  performance-analysis, performance-monitoring
Myperf4j
High performance Java APM. Powered by ASM. Try it. Test it. If you feel its better, use it.
Stars: ✭ 2,281 (+6608.82%)
Mutual labels:  performance-analysis, performance-monitoring
Apm Agent Dotnet
Elastic APM .NET Agent
Stars: ✭ 418 (+1129.41%)
Mutual labels:  performance-analysis, performance-monitoring
Mthawkeye
Profiling / Debugging assist tools for iOS. (Memory Leak, OOM, ANR, Hard Stalling, Network, OpenGL, Time Profile ...)
Stars: ✭ 1,119 (+3191.18%)
Mutual labels:  performance-analysis, performance-monitoring
Frontendwingman
Frontend Wingman, Learn frontend faster!
Stars: ✭ 315 (+826.47%)
Mutual labels:  performance-analysis, performance-monitoring
Caliper
Caliper is an instrumentation and performance profiling library
Stars: ✭ 162 (+376.47%)
Mutual labels:  performance-analysis, performance-monitoring
Nemetric
Monitors, collects, and reports front-end performance metrics. Measures the time to first DOM render (FP/FCP/LCP), the earliest time the user can interact (FID/TTI), component lifecycle performance, network conditions, resource sizes, and more, reporting real user measurements to a monitoring backend.
Stars: ✭ 145 (+326.47%)
Mutual labels:  performance-analysis, performance-monitoring
performance-budget-plugin
Performance budget plugin for Webpack (https://webpack.js.org/)
Stars: ✭ 65 (+91.18%)
Mutual labels:  performance-analysis, performance-monitoring
ember-appmetrics
Ember library used to measure various metrics in your Ember app with ultra simple APIs.
Stars: ✭ 16 (-52.94%)
Mutual labels:  performance-analysis, performance-monitoring
Nodereactionagent
NodeReactionAgent is a Node.js asynchronous performance monitoring tool to be used in conjunction with Nodereaction.com or nodereactionclient
Stars: ✭ 49 (+44.12%)
Mutual labels:  performance-analysis, performance-monitoring
Apm Agent Php
Elastic APM PHP Agent
Stars: ✭ 129 (+279.41%)
Mutual labels:  performance-analysis, performance-monitoring
Droidtelescope
DroidTelescope (DT), an Android app performance monitoring framework
Stars: ✭ 231 (+579.41%)
Mutual labels:  performance-analysis, performance-monitoring
PerfAvore
Rule based performance analysis and monitoring tool for dotnet written in F#.
Stars: ✭ 12 (-64.71%)
Mutual labels:  performance-analysis, performance-monitoring
aiohttp-client-cache
An async persistent cache for aiohttp requests
Stars: ✭ 63 (+85.29%)
Mutual labels:  dynamodb
redimo.go
Use the power of DynamoDB with the ease of the Redis API
Stars: ✭ 29 (-14.71%)
Mutual labels:  dynamodb
snippet-timekeeper
An android library to measure code execution time. No need to remove the measurement code, automatically becomes no-op in the release variants. Does not compromise with the code readability and comes with features that enhance the developer experience.
Stars: ✭ 70 (+105.88%)
Mutual labels:  performance-analysis
dynamo-node
DynamoDB mapper
Stars: ✭ 12 (-64.71%)
Mutual labels:  dynamodb

aws-python-utilities

Python utilities for AWS. These utilities help save time with different facets of performance testing (RCA, reporting, cost saving), whether as part of DevOps or standalone. This readme will continue to be updated as and when I add new utilities to the repo.


1: Cloudwatch Metrics To Image
2: Update DynamoDB Insights
3: Update Dynamodb Capacity
4: Cloudwatch Dashboards
5: Compare Query And Scan
6: Compare Get And Batch Get Item
7: Bucket Size
8: Copy DynamoDB Table
9: Sample PartiQL DynamoDB Script
10: Spot Instance info
11: Stop Start EC2
12: Synthetic Monitoring
13: SQS Workbench
14: Lambda Region Price

1: Cloudwatch Metrics To Image

This simple utility allows you to generate images for CloudWatch metrics. There are times when you want an image for reporting purposes (for example, a performance TSR). This utility removes the effort of generating each image manually and saves a lot of time when you have many images to generate.

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

What things you need to execute the script

1: awscli
2: boto3
3: python 3.5
4: Set up your AWS access key, secret key, and AWS region

Execution

1: Make sure the above prerequisites are met first.
2: Update the FileName in the script to the path where you want to save the image.
3: Update the timezone setting in the script.
4: If your AWS setup uses a different AWS region, override it in the RegionConfig parameter in the script; otherwise comment it out.
5: Replace the default JSON payload with the correct Image API JSON payload. The correct payload can be copied from the CloudWatch console.
6: Now execute the Python script, passing in the start and end time in epoch milliseconds for which you want to generate the image.
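CloudWatch exposes this capability through the GetMetricWidgetImage API. A minimal boto3 sketch (not the repo script itself), assuming a widget JSON copied from the console and a hypothetical output file name:

import json
import boto3

FILE_NAME = "cpu_utilization.png"   # hypothetical output path
REGION = "us-east-1"                # override if your AWS setup uses a different region

# Example widget definition; replace with the JSON payload copied from the CloudWatch console.
widget = {
    "metrics": [["AWS/EC2", "CPUUtilization", "InstanceId", "i-0123456789abcdef0"]],
    "start": "-PT3H",
    "end": "P0D",
    "width": 800,
    "height": 400,
}

cloudwatch = boto3.client("cloudwatch", region_name=REGION)
response = cloudwatch.get_metric_widget_image(
    MetricWidget=json.dumps(widget), OutputFormat="png"
)

# The API returns the rendered graph as raw PNG bytes.
with open(FILE_NAME, "wb") as f:
    f.write(response["MetricWidgetImage"])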

2: Update DynamoDB Insights

This simple utility allows you to enable or disable DynamoDB Contributor Insights. Pass the name of the table for which you want Contributor Insights enabled or disabled into the script. Contributor Insights helps you identify which DynamoDB partitions are most heavily accessed, which is useful for DynamoDB table RCA.

Prerequisites

What things you need to execute the script

1: awscli
2: boto3
3: python 3.5
4: Set up your AWS access key, secret key, and AWS region

Execution

1: Make sure the above prerequisites are met first.
2: Replace the default value of the variable "TABLE_TO_UPDATE" with your table name.
3: If your AWS setup uses a different AWS region, override it in the RegionConfig parameter in the script; otherwise comment it out.
4: Now execute the Python script.
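DynamoDB exposes this through the UpdateContributorInsights API. A minimal sketch, assuming hypothetical table and region values (the repo script's variable names may differ):

import boto3

TABLE_TO_UPDATE = "my-table"   # replace with your table name
REGION = "us-east-1"           # override if your setup uses a different region

dynamodb = boto3.client("dynamodb", region_name=REGION)

# Enable Contributor Insights for the table; pass "DISABLE" to turn it off.
response = dynamodb.update_contributor_insights(
    TableName=TABLE_TO_UPDATE,
    ContributorInsightsAction="ENABLE",
)
print(response["ContributorInsightsStatus"])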

3: Update Dynamodb Capacity

There might be cases when you end up with a lot of DynamoDB tables in your non-prod environment, set to either Provisioned or On-Demand capacity. If they are not properly managed, the cost of keeping these tables on Provisioned capacity can escalate quickly. This simple Python script goes through all the tables and, if they are on Provisioned capacity, changes them to On-Demand. If they are already on On-Demand capacity, it does nothing.

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

What things you need to execute the script

1: awscli
2: boto3
3: python 3.5
4: Set up your AWS access key, secret key, and AWS region

Execution

Once the above prerequisites are set up, execute the Python script.
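A minimal sketch of the approach described above (not the exact repo script): walk every table and switch any Provisioned table to On-Demand.

import boto3

dynamodb = boto3.client("dynamodb")

paginator = dynamodb.get_paginator("list_tables")
for page in paginator.paginate():
    for table_name in page["TableNames"]:
        desc = dynamodb.describe_table(TableName=table_name)["Table"]
        # Tables created in Provisioned mode may not return BillingModeSummary at all,
        # hence the defensive get() mentioned in the Improvements section below.
        billing = desc.get("BillingModeSummary", {}).get("BillingMode", "PROVISIONED")
        if billing == "PROVISIONED":
            dynamodb.update_table(TableName=table_name, BillingMode="PAY_PER_REQUEST")
            print(f"{table_name}: switched to On-Demand")
        else:
            print(f"{table_name}: already On-Demand, nothing to do")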

Improvements

  • Check when the table was last changed to On-Demand capacity. If it was less than 24 hours ago, reduce the Provisioned capacity; otherwise change it to On-Demand.
  • Improve the get() function call. If AWS changes the JSON structure in the future, this call will fail. A better approach is needed.

4: Cloudwatch Dashboards

This simple utility allows you to create or delete CloudWatch dashboards. It is useful when you need to create multiple dashboards.

Prerequisites

What things you need to execute the script

1: awscli
2: boto3
3: python 3.5
4: Set up your AWS access key, secret key, and AWS region

Execution

1: Make sure the above prerequisites are met first.
2: Replace the default payload of the variable "DASHBOARD_JSON" with your dashboard JSON payload. The easiest way is to take an existing dashboard payload from the console, modify it, and pass it into the script (if you are creating a dashboard).
3: Replace the default value of the variable "DASHBOARD_NAME" with your dashboard name.
4: If your AWS setup uses a different AWS region, override it in the RegionConfig parameter in the script; otherwise comment it out.
5: Now execute the Python script.
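Dashboards are created or updated with the PutDashboard API and removed with DeleteDashboards. A minimal sketch with an illustrative dashboard body (the variable names mirror those above but the values are only examples):

import json
import boto3

DASHBOARD_NAME = "my-dashboard"
DASHBOARD_JSON = {
    "widgets": [
        {
            "type": "metric",
            "x": 0, "y": 0, "width": 12, "height": 6,
            "properties": {
                "metrics": [["AWS/EC2", "CPUUtilization", "InstanceId", "i-0123456789abcdef0"]],
                "region": "us-east-1",
                "title": "EC2 CPU",
            },
        }
    ]
}

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Create (or update) the dashboard.
cloudwatch.put_dashboard(
    DashboardName=DASHBOARD_NAME,
    DashboardBody=json.dumps(DASHBOARD_JSON),
)

# To delete a dashboard instead:
# cloudwatch.delete_dashboards(DashboardNames=[DASHBOARD_NAME])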

5: Compare Query And Scan

Compare DynamoDB Query vs Scan time.
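A minimal sketch of one way to time Query against Scan; the table, key, and attribute names are hypothetical:

import time
import boto3
from boto3.dynamodb.conditions import Attr, Key

table = boto3.resource("dynamodb").Table("my-table")

# Query: uses the partition key, so DynamoDB reads only the matching partition.
start = time.perf_counter()
query_result = table.query(KeyConditionExpression=Key("pk").eq("customer#123"))
query_time = time.perf_counter() - start

# Scan: reads the whole table and filters afterwards.
start = time.perf_counter()
scan_result = table.scan(FilterExpression=Attr("pk").eq("customer#123"))
scan_time = time.perf_counter() - start

print(f"Query: {len(query_result['Items'])} items in {query_time:.4f}s")
print(f"Scan : {len(scan_result['Items'])} items in {scan_time:.4f}s")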

6: Compare Get And Batch Get Item

Compare DynamoDB GetItem and BatchGetItem API calls.
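A minimal sketch comparing individual GetItem calls with a single BatchGetItem call; the table and key names are hypothetical:

import time
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-table")
keys = [{"pk": f"customer#{i}"} for i in range(1, 26)]  # BatchGetItem accepts up to 100 keys

# One GetItem request per key.
start = time.perf_counter()
for key in keys:
    table.get_item(Key=key)
get_time = time.perf_counter() - start

# One BatchGetItem request for all keys.
start = time.perf_counter()
batch = dynamodb.batch_get_item(RequestItems={"my-table": {"Keys": keys}})
batch_time = time.perf_counter() - start

print(f"{len(keys)} GetItem calls: {get_time:.4f}s")
print(f"One BatchGetItem call: {batch_time:.4f}s "
      f"({len(batch['Responses']['my-table'])} items returned)")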

7: Bucket Size

Get S3 Bucket Size.
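A minimal sketch of one way to do this: page through the bucket and sum object sizes (the bucket name is hypothetical).

import boto3

BUCKET = "my-bucket"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total_bytes = 0
object_count = 0
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        total_bytes += obj["Size"]
        object_count += 1

print(f"{BUCKET}: {object_count} objects, {total_bytes / (1024 ** 3):.2f} GiB")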

8: Copy DynamoDB Table

This script takes a backup of a DynamoDB table and copies the data to a different DynamoDB table. During the process, the data is first saved to an S3 bucket.
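The repo script stages the data in an S3 bucket; the sketch below is a simplified direct copy for illustration only (scan the source table and batch-write into the target), with hypothetical table names:

import boto3

SOURCE_TABLE = "source-table"
TARGET_TABLE = "target-table"

dynamodb = boto3.resource("dynamodb")
source = dynamodb.Table(SOURCE_TABLE)
target = dynamodb.Table(TARGET_TABLE)

scan_kwargs = {}
with target.batch_writer() as writer:
    while True:
        page = source.scan(**scan_kwargs)
        for item in page["Items"]:
            writer.put_item(Item=item)
        # Keep scanning until DynamoDB stops returning a pagination key.
        if "LastEvaluatedKey" not in page:
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]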

9: Sample PartiQL DynamoDB Script

A sample Python script showing how to execute PartiQL (SQL-compatible) statements against DynamoDB.
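A minimal sketch of a PartiQL call with the boto3 DynamoDB client; the table and key values are hypothetical:

import boto3

dynamodb = boto3.client("dynamodb")

response = dynamodb.execute_statement(
    Statement='SELECT * FROM "my-table" WHERE pk = ?',
    Parameters=[{"S": "customer#123"}],
)
for item in response["Items"]:
    print(item)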

10: Spot Instance info

Script to help extract spot instance information to answer questions such as:

  1. What is the current spot instance price in each region?
  2. What type of spot instances are available in a region & availability zone?
  3. What is the interruption rate for a spot instance?
  4. What OS is available for a spot instance in a region & availability zone?
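A minimal sketch covering the pricing side of these questions via the EC2 API; note that interruption rates come from the Spot Instance Advisor data rather than this call, and the instance types shown are only examples:

import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client("ec2", region_name="us-east-1")

# Latest spot prices for a couple of instance types over the last hour.
response = ec2.describe_spot_price_history(
    InstanceTypes=["t3.micro", "c5.large"],
    ProductDescriptions=["Linux/UNIX", "Windows"],
    StartTime=datetime.now(timezone.utc) - timedelta(hours=1),
)

for price in response["SpotPriceHistory"]:
    print(f"{price['AvailabilityZone']:<12} {price['InstanceType']:<10} "
          f"{price['ProductDescription']:<12} ${price['SpotPrice']}")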

11: Stop Start EC2

The script gives you the capability to stop and start EC2 instances based on instance ID, instance type, or platform.
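A minimal sketch of how instances could be selected by type and stopped (the actual script's argument handling is shown in the usage examples below); the instance type is only an example:

import boto3

ec2 = boto3.client("ec2")

def stop_by_type(instance_type: str) -> None:
    """Stop all running instances of the given type."""
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "instance-type", "Values": [instance_type]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
        print(f"Stopping: {instance_ids}")

stop_by_type("t2.micro")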

Stop & Start All Instances

./stop_start_ec2.py stop id
./stop_start_ec2.py start id

Stop & Start By Instance type

./stop_start_ec2.py stop type <<instance type>>
./stop_start_ec2.py start type <<instance type>>

example

./stop_start_ec2.py stop type t2.micro
./stop_start_ec2.py start type t2.micro

Stop & Start By Platform

Pass "windows" if you want to stop or start Windows platform. Otherwise pass other.

./stop_start_ec2.py stop platform windows
./stop_start_ec2.py start platform windows

or

./stop_start_ec2.py stop platform other
./stop_start_ec2.py start platform other

Stop & Start By Platform And InstanceType

Pass "windows" if you want to stop or start Windows platform. Otherwise pass other. Also pass in what Instance type you want to stop or start.

./stop_start_ec2.py stop windows {InstanceType}
./stop_start_ec2.py start windows {InstanceType}

or

./stop_start_ec2.py stop other {InstanceType}
./stop_start_ec2.py start other {InstanceType}

example

./stop_start_ec2.py stop windows t2.micro
./stop_start_ec2.py start other t2.micro
./stop_start_ec2.py start windows t2.small
./stop_start_ec2.py stop other c4.2xlarge

Requirements

Python Modules

If you have never used Amazon Web Services with Python before, you need to install two additional modules:

pip install boto3 botocore

or

pip3 install boto3 botocore

AWS Credentials

Save your AWS credentials in your home (Linux) or user (Windows) folder:

Linux:

/home/[username]/.aws

Windows:

C:\Users\[username]\.aws

For more information about the content of the .aws folder check the AWS documentation: Configuration and Credential Files.

Instead of creating the .aws folder manually you can use the AWS Command Line Interface:

After you've installed the AWS CLI, open PowerShell (or the Command Prompt) on Windows, or a shell on UNIX-like systems, then run the following command:

aws configure

Enter

  • your AWS Access Key ID,
  • your AWS Secret Access Key,
  • your default region name (an AWS Region such as us-east-1, not an Availability Zone), and
  • "json" as the default output format.

12: Synthetic Monitoring

The script gives you the capability to start, stop, delete, and create canaries. You can also query the canary runtime versions, which are required to create a canary.
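Behind these commands are the CloudWatch Synthetics APIs. A minimal boto3 sketch of the stop/start and runtime-version calls; the canary name is hypothetical:

import boto3

synthetics = boto3.client("synthetics")

# Stop, then start, a single canary.
synthetics.stop_canary(Name="workload")
synthetics.start_canary(Name="workload")

# List the available canary runtime versions (needed when creating a canary).
for runtime in synthetics.describe_runtime_versions()["RuntimeVersions"]:
    print(runtime["VersionName"])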

Start & Stop All Canaries

./synthetic_canary.py stop
./synthetic_canary.py start

Start & Stop a Canary

./synthetic_canary.py stop/start <<canary name>>

example

./synthetic_canary.py stop workload
./synthetic_canary.py start workload

Delete a canary

./synthetic_canary.py delete {canary name}

example

./synthetic_canary.py delete workload

Script Runtime Version

Returns a list of Synthetics canary runtime versions and their dependencies.

./synthetic_canary.py runtime

Create a canary

./synthetic_canary.py create {canary name}

example

./synthetic_canary.py create workload

Requirements

Python Modules

If you have never used Amazon Web Services with Python before, you need to install two additional modules:

pip install boto3 botocore

or

pip3 install boto3 botocore

AWS Credentials

Save your AWS credentials in your home (Linux) or user (Windows) folder:

Linux:

/home/[username]/.aws

Windows:

C:\Users\[username]\.aws

For more information about the content of the .aws folder check the AWS documentation: Configuration and Credential Files.

Instead of creating the .aws folder manually you can use the AWS Command Line Interface:

After you've installed the AWS CLI, open PowerShell (or the Command Prompt) on Windows, or a shell on UNIX-like systems, then run the following command:

aws configure

Enter

  • your AWS Access Key ID,
  • your AWS Secret Access Key,
  • your default region name (an AWS Region such as us-east-1, not an Availability Zone), and
  • "json" as the default output format.

13: SQS Workbench

A GUI utility for Windows to send messages to AWS SQS. It is currently at MVP status; more features and improvements will be added.

The executable can be downloaded from here. Unzip the folder and run the sqs_workbench.exe file.
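The workbench is a GUI, but the underlying operation is an SQS SendMessage call. A minimal boto3 sketch with a hypothetical queue URL and message body:

import boto3

sqs = boto3.client("sqs")

response = sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
    MessageBody='{"event": "test", "source": "sqs-workbench"}',
)
print(response["MessageId"])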

14: Lambda Region Price

This Python utility writes the price for each Lambda type in each region to a CSV file.
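A minimal sketch of one way to pull Lambda pricing, using the AWS Price List (Pricing) API; the repo script may take a different approach, and the output file name is hypothetical. Note that the Pricing endpoint is only available in a few regions (for example us-east-1):

import csv
import json
import boto3

pricing = boto3.client("pricing", region_name="us-east-1")

with open("lambda_prices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["location", "group", "unit", "price_usd"])

    paginator = pricing.get_paginator("get_products")
    for page in paginator.paginate(ServiceCode="AWSLambda"):
        for entry in page["PriceList"]:
            product = json.loads(entry)            # each entry is a JSON string
            attrs = product["product"]["attributes"]
            for term in product["terms"].get("OnDemand", {}).values():
                for dim in term["priceDimensions"].values():
                    writer.writerow([
                        attrs.get("location", ""),
                        attrs.get("group", ""),
                        dim["unit"],
                        dim["pricePerUnit"]["USD"],
                    ])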

Requirements

Python Modules

If you have never used Amazon Web Services with Python before, you need to install the following modules:

boto3 
botocore
PySimpleGUI

AWS Credentials

Save your AWS credentials in your home (Linux) or user (Windows) folder:

Linux:

/home/[username]/.aws

Windows:

C:\Users\[username]\.aws

For more information about the content of the .aws folder check the AWS documentation: Configuration and Credential Files.

Instead of creating the .aws folder manually you can use the AWS Command Line Interface:

After you've installed the AWS CLI, open PowerShell (or the Command Prompt) on Windows, or a shell on UNIX-like systems, then run the following command:

aws configure

Enter

  • your AWS Access Key ID,
  • your AWS Secret Access Key,
  • your default region name (an AWS Region such as us-east-1, not an Availability Zone), and
  • "json" as the default output format.

Contribute

If you would like to contribute to this project, please reach out to me. Issues and pull requests are welcome too.

Author

Follow @harinderseera

License

This project is licensed under the Apache License; see the LICENSE file for details.
