FinOps : Cost optimization techniques in the AWS Cloud.

somrajroy/AWSCostOptimization

Cloud Cost Optimization - AWS Cloud


  • Cloud cost optimization is a combination of strategies, techniques, best practices, and tools that not only help reduce cloud costs but also maximize the business value of using the cloud.
  • Optimizing cloud costs isn't just about reducing costs; it's also about aligning costs with business goals. An increase in costs is not necessarily a problem if it's accompanied by an increase in revenue. One of the most important goals is to ensure that costs correlate with productive and profitable activities.
  • There are three fundamental drivers of cost with AWS that should be kept in mind while architecting solutions : compute, storage, and outbound data transfer.
  • As per the AWS Well-Architected Framework, the Cost Optimization pillar includes the ability to run systems to deliver business value at the lowest price point.
  • Generic AWS Guidance on cost optimization
  • 5 Design Principles for cost optimization in the cloud.
  • AWS cost calculation examples are a good place to start. It is essential that the business measures the benefit/efficiency of a workload to justify the costs associated with it.
  • Increase Linux compute (Amazon Linux preferred in AWS, or any other Linux distro) : The Amazon Linux AMI (and Amazon Linux 2) is essentially RHEL optimized to run on AWS. With Amazon Linux, customers get images that are performant, secure, and supported by AWS, which makes it an attractive operating system. Most importantly, it is around 2.5 times cheaper than its Windows counterpart and should therefore be given priority unless there is a good reason to choose something else. There is no such thing as the perfect operating system, but Linux comes close: it runs on almost any machine, can be configured however clients want it, and has a very low TCO, to name just a few of its strengths. A Linux-based operating system should be the go-to choice, and this vital and popular OS will only continue to gain ground as cloud-native apps become more prevalent and companies look for new ways to build the future.
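To make the Linux-vs-Windows price gap concrete, here is a minimal sketch comparing monthly On-Demand costs for one instance running 24x7. The hourly rates below are illustrative assumptions (roughly shaped like m5.large pricing), not current AWS prices; always check the EC2 pricing page for the actual rates in your region.

```python
# Rough comparison of Linux vs. Windows On-Demand pricing for a single
# instance running around the clock. Rates are illustrative assumptions.

HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cost(hourly_rate: float, hours: int = HOURS_PER_MONTH) -> float:
    """Return the monthly On-Demand cost for one instance."""
    return round(hourly_rate * hours, 2)

# Hypothetical hourly rates (USD) for the same instance size.
linux_rate = 0.096
windows_rate = 0.188  # Windows license cost is bundled into the hourly rate

print(f"Linux:   ${monthly_cost(linux_rate)}/month")
print(f"Windows: ${monthly_cost(windows_rate)}/month")
print(f"Savings: {100 * (1 - linux_rate / windows_rate):.0f}% by choosing Linux")
```

With these sample rates the Linux instance costs roughly half as much per month, which is why Linux should be the default unless a workload genuinely requires Windows.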
  • Leverage automation in the cloud : Automation enhances system productivity, improves performance, and reduces costs; examples include snapshot lifecycle management and automated AMI creation. AWS Systems Manager is a very useful service here.
  • Upgrade instances to latest generation : They typically provide higher efficiency or better performance at a lower price. When AWS releases a new generation of instances, they tend to have improved performance and functionality compared to their predecessors. This means customers can either upgrade existing instances to the latest generation, or downsize existing instances with borderline utilization metrics in order to benefit from the same level of performance at lower cost.
  • (Optimize storage) Delete unattached EBS volumes : a good practice is to check the “delete on termination” box when an instance is launched. This process can also be automated with AWS Lambda.
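A sketch of the selection logic such a Lambda function might apply: an unattached EBS volume reports the state "available". In a real function the volume list would come from boto3's `ec2.describe_volumes()`; here the records are illustrative sample data.

```python
# Identify unattached ("available") EBS volumes as deletion candidates.
# Sample records mimic the shape of an EC2 describe-volumes response.

def find_unattached_volumes(volumes):
    """Return IDs of volumes that are not attached to any instance."""
    return [v["VolumeId"] for v in volumes if v["State"] == "available"]

sample = [
    {"VolumeId": "vol-0aaa", "State": "in-use"},
    {"VolumeId": "vol-0bbb", "State": "available"},  # candidate for deletion
    {"VolumeId": "vol-0ccc", "State": "available"},  # candidate for deletion
]

print(find_unattached_volumes(sample))  # ['vol-0bbb', 'vol-0ccc']
```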
  • (Optimize storage) Delete obsolete snapshots : snapshots can be deleted when they are no longer needed.
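Obsolete-snapshot cleanup usually means an age-based retention rule. The sketch below selects snapshots older than a retention window; the 30-day window and the snapshot records are illustrative assumptions (real metadata would come from boto3's `ec2.describe_snapshots()`).

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # illustrative retention window

def obsolete_snapshots(snapshots, now, retention_days=RETENTION_DAYS):
    """Return IDs of snapshots whose StartTime is older than the window."""
    cutoff = now - timedelta(days=retention_days)
    return [s["SnapshotId"] for s in snapshots if s["StartTime"] < cutoff]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
sample = [
    {"SnapshotId": "snap-old", "StartTime": datetime(2024, 1, 15, tzinfo=timezone.utc)},
    {"SnapshotId": "snap-new", "StartTime": datetime(2024, 5, 25, tzinfo=timezone.utc)},
]
print(obsolete_snapshots(sample, now))  # ['snap-old']
```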
  • (Optimize storage) Leverage the Amazon S3 Glacier Instant Retrieval storage class : With S3 Glacier Instant Retrieval, customers can save up to 68% on storage costs compared to the S3 Standard-Infrequent Access storage class. The company Snap is saving tens of millions of dollars leveraging S3 Glacier Instant Retrieval while delivering the same performance and powering new business opportunities, such as innovative app features and new hardware products.
  • Release unattached Elastic IP addresses : EIPs are free of charge when attached to a running service. Exceptions occur if customers remap an IP address more than 100 times a month, or retain an unattached Elastic IP address after terminating the instance to which it was attached. AWS charges for such EIPs.
  • (Optimize storage) Move infrequently accessed data to lower-cost tiers : Store production files in S3 and move them between storage tiers based on activity, or dynamically using S3 Intelligent-Tiering. Archive infrequently used data in S3 Glacier and back up long-term archived data in Glacier Deep Archive.
  • Schedule instances to run only during business hours or when needed. This can be automated with AWS Instance Scheduler or other tools.
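The core of any scheduling tool is a simple calendar check. Below is a minimal sketch of a business-hours rule; the schedule (Monday to Friday, 08:00 to 18:00) is an illustrative assumption, not a setting from AWS Instance Scheduler itself.

```python
from datetime import datetime

BUSINESS_DAYS = range(0, 5)    # Monday=0 .. Friday=4
BUSINESS_HOURS = range(8, 18)  # 08:00 inclusive to 18:00 exclusive

def should_be_running(when: datetime) -> bool:
    """Return True if an instance on this schedule should be up."""
    return when.weekday() in BUSINESS_DAYS and when.hour in BUSINESS_HOURS

print(should_be_running(datetime(2024, 6, 3, 10, 0)))  # Monday 10:00 -> True
print(should_be_running(datetime(2024, 6, 8, 10, 0)))  # Saturday    -> False
```

A scheduler runs this kind of check on a timer and calls stop/start APIs when an instance's actual state disagrees with the schedule.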
  • Terminate zombie assets : The term “zombie assets” is most often used to describe any unused asset contributing to the cost of operating in the AWS Cloud. Sometimes it may be difficult to find all such assets, and third-party tools like VMware CloudHealth can help.
  • Tag AWS resources : Tagging helps build a cost-conscious culture, establish guardrails to meet financial targets, and gain greater business efficiencies. A good resource tagging policy can save costs; for example, resources with non-production tags can be de-provisioned or shut down during holidays or when not in use. Tags help analyze and attribute expenditure: they make it easier to accurately identify the usage and cost of systems, which allows transparent attribution of IT costs to individual workload owners. This helps measure return on investment (ROI) and gives workload owners an opportunity to optimize their resources and reduce costs. Tagging resources also increases the amount of data monitoring tools can obtain.
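A tagging policy is only useful if it is enforced. A minimal sketch of an enforcement check, flagging resources that lack mandatory cost allocation tags; the required tag keys here are illustrative assumptions.

```python
# Flag resources missing mandatory cost-allocation tags.
# The required keys are an example policy, not an AWS default.

REQUIRED_TAGS = {"Environment", "Owner", "CostCenter"}

def missing_tags(resource_tags: dict) -> set:
    """Return the required tag keys absent from a resource's tag map."""
    return REQUIRED_TAGS - resource_tags.keys()

prod_db = {"Environment": "production", "Owner": "data-team", "CostCenter": "1234"}
stray_vm = {"Environment": "dev"}

print(missing_tags(prod_db))           # set() -> compliant
print(sorted(missing_tags(stray_vm)))  # ['CostCenter', 'Owner'] -> non-compliant
```

The same check can run in a pipeline or a scheduled job, with non-compliant resources reported to their owners or blocked from deployment.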
  • Right-size instances : The purpose of rightsizing is to match instance sizes to their workloads, which can yield substantial cost savings. For any workload, all resources should be scrutinized (i.e. right-sized) before deployment. This is not a straightforward process and needs the involvement of workload owners and the business to arrive at a consensus. AWS Compute Optimizer is a helpful option provided and recommended by AWS.
  • Leverage the right pricing model : After selecting the correct instance size and creating elasticity through Auto Scaling or scheduling functions, choose the correct pricing model. Reserved Instances are an excellent way to reduce the cost of the right workload. AWS provides Cost Explorer and Trusted Advisor to help everyone use the correct pricing model. For EC2 there are several pricing models to choose from: On-Demand, Reserved, and Spot Instances.
  • Increase elasticity : The cloud makes it possible to optimize costs to meet dynamic needs, turning resources off when they are no longer needed; this should be leveraged to the fullest. Customers can analyze EC2 instances with AWS Compute Optimizer and receive right-sizing recommendations. AWS research shows that right-sizing workloads can save up to 36% for the average customer.
  • Use the right Amazon EBS volume type : Where performance requirements are lower, Amazon EBS Throughput Optimized HDD (st1) storage typically costs half as much as the default General Purpose SSD (gp2) option. Details about every EBS volume type are available here.
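To illustrate the gp2-vs-st1 gap, a minimal sketch comparing monthly storage cost for the same capacity. The per-GB-month prices are illustrative assumptions, not current AWS prices; check the EBS pricing page for your region.

```python
# Compare monthly EBS cost for the same capacity across volume types.
# Prices (USD per GB-month) are illustrative assumptions.

PRICE_PER_GB_MONTH = {"gp2": 0.10, "st1": 0.045}

def ebs_monthly_cost(volume_type: str, size_gb: int) -> float:
    """Return the monthly storage cost for a volume of the given type/size."""
    return round(PRICE_PER_GB_MONTH[volume_type] * size_gb, 2)

size = 1000  # a 1 TB volume
print(ebs_monthly_cost("gp2", size))  # 100.0
print(ebs_monthly_cost("st1", size))  # 45.0
```

For throughput-oriented workloads that do not need SSD latency, the st1 volume delivers the same capacity at less than half the monthly cost in this sample.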
  • Reduce data transfer costs : Data transfer out of AWS can create significant expenditure. Consider using the Amazon CloudFront CDN: dynamic or static web content can usually be cached at CloudFront edge locations worldwide, reducing the cost of data transfer out (DTO) to the public internet. Another good option is to reduce the number of public IPs in the architecture, which is better from both a cost and a security perspective. A rule or metric can also be put in place, for example: no more than 15% of cloud expenditure should be spent on data transfer.
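The guardrail metric mentioned above is easy to automate. A minimal sketch, assuming the 15% threshold and the sample cost figures (both illustrative); real figures would come from Cost Explorer or CUR data.

```python
# Flag when data transfer exceeds a set share of total cloud spend.
# The 15% threshold and cost figures are illustrative assumptions.

DTO_SHARE_LIMIT = 0.15

def transfer_share_exceeded(transfer_cost: float, total_cost: float,
                            limit: float = DTO_SHARE_LIMIT) -> bool:
    """Return True if data transfer cost exceeds the allowed share of spend."""
    return transfer_cost / total_cost > limit

print(transfer_share_exceeded(1800.0, 10000.0))  # 18% of spend -> True
print(transfer_share_exceeded(900.0, 10000.0))   # 9% of spend  -> False
```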
  • Measure, monitor, and improve : Because cloud environments are dynamic, measurement and monitoring for accurate visibility and continuous cost optimization are essential. One simple approach is defining and enforcing cost allocation tagging and setting metrics and specific targets.
  • Delete idle load balancers and optimize bandwidth use : Use AWS Trusted Advisor to identify load balancers with a low number of requests (a good rule of thumb is fewer than 100 requests in the last 7 days). Reduce costs by removing idle load balancers, and track overall data transfer costs with Cost Explorer.
  • Charge back Amazon costs to internal users with the Enterprise Billing Console : The AWS enterprise billing console is a newer capability that lets customers manage chargebacks, billing Amazon services to units in their organization or to third parties. Within an organization, this creates accountability by billing each department or business unit according to the cost of the services they actually use. It lets customers allocate costs across accounts using the concept of billing groups, which allow customized pricing plans to be applied to each department or business unit. Set up Cost and Usage Reports (CUR) for each billing group, and perform margin analysis to calculate the savings each group achieves as a result of cost optimizations.
  • Dedicated network connections : For large enterprises, dedicated network connection services such as AWS Direct Connect and Microsoft Azure ExpressRoute can also play a role in keeping data transfer charges, and overall cloud costs, down. In large cloud environments requiring large-scale interchange of data, these services can offer significant cost savings compared with transfers over the public Internet, as well as providing a faster and more secure connection.
  • Amazon CloudFront / Content Delivery Networks (CDN) : Network load on cloud servers can be reduced by offloading web traffic to a CDN. A CDN not only lowers data transfer costs but also speeds up delivery of data. It can extend the geographical reach of web applications without distributing them across different cloud regions and/or different cloud platforms.
  • Identify and minimize software license costs : Software licensing is a major component of cloud operating costs. Manual license management is challenging and increases the risk of paying for unused software licenses. Migrating to an equivalent open-source product can save millions.
  • Common mistakes that drive up AWS costs : orphaned resources, misconfigured storage tiers, over-provisioned compute resources, not using Spot Instances where applicable, and incorrect use of pricing plans.
  • Some native AWS Cost Optimization Tools and Services :
    • CloudWatch : One of the keys to reducing cloud bills is having visibility into services. CloudWatch is an AWS tool for collecting and tracking metrics, monitoring log files, creating resource alarms, and setting automatic reactions to changes in AWS resources. Metrics can be set up as and when required.
    • AWS Cost Explorer.
    • AWS Trusted Advisor : Get real-time identification of potential areas for optimization. One of the five areas checked by Trusted Advisor is cost optimization. It provides automated checks for EC2 Reserved Instance optimization, low utilization of EC2 instances, idle Elastic Load Balancers, underutilized EBS volumes, unassociated Elastic IP addresses, and idle DB instances on Amazon RDS.
    • AWS Budgets : Set custom budgets that trigger alerts when cost or usage exceeds, or is forecasted to exceed, a budgeted amount. Budgets can be set based on tags and accounts as well as resource types.
    • Amazon S3 analytics and Amazon S3 Storage Lens : Use Amazon S3 analytics – Storage Class Analysis for automated analysis and visualization of Amazon S3 storage patterns to help decide when to shift data to a different storage class. Amazon S3 Storage Lens delivers organization-wide visibility into object storage usage and activity trends, and makes recommendations to improve cost-efficiency and apply best practices.
    • Amazon S3 Intelligent-Tiering : Delivers automatic cost savings on Amazon S3 by moving data between two access tiers: frequent access and infrequent access.
    • AWS Auto Scaling : Monitors applications and automatically adjusts resource capacity to maintain steady and predictable performance at the lowest possible cost.
    • AWS Cost and Usage Report : After set-up, everyone can receive hourly, daily, or monthly reports that break out costs by product or resource and by the tags that were defined. These report files are delivered to an Amazon S3 bucket.
    • AWS Compute Optimizer : Recommends optimal AWS resources for your workloads to reduce costs and improve performance by using machine learning. AWS Compute Optimizer analyzes resource utilization to identify AWS resources, such as Amazon EC2 instances, Amazon EBS volumes, and AWS Lambda functions, that might be under-provisioned or over-provisioned.
    • AWS Instance Scheduler : This is a simple service that enables customers to easily configure custom start and stop schedules for their Amazon EC2 and Amazon RDS instances.

Further References
