Hot Topics in Cloud Computing Part 1
Cloud computing continues to evolve in 2020 with even bigger market demand. It has changed the way we develop and deploy software. The public cloud providers will continue to offer larger-scale virtualization of hardware, simplified containerization of software, and managed service-oriented architectures. We will keep seeing the following hot topics in cloud computing in 2020:
- DevOps
- Containerization
- Big Data Analytics
- Serverless Computing
- Artificial Intelligence and Machine Learning
- Cloud Security
Let’s use AWS re:Invent 2019 to discuss these hot topics in cloud computing. We will do a similar recap of GCP Next 2020 in May.
DevOps
When enterprise applications move to a public cloud, a key requirement is to maximize the rate of successful deployments through CI/CD pipelines that span the on-premises environment and the public cloud. We can see AWS implementing guardrails and new products across the end-to-end release process to achieve highly available deployments and zero deployment failures. For example:
- AWS CodeGuru (preview): Amazon CodeGuru is a new machine learning service that helps you catch code issues faster and improve application performance. It includes CodeGuru Reviewer, which analyzes pull requests for Java-based code, and CodeGuru Profiler, which analyzes running applications. CodeGuru Profiler works with applications hosted on EC2, containerized applications running on ECS and EKS, and serverless applications running on AWS Fargate.
- Kubernetes is open-source software that allows you to deploy and manage containerized applications at scale, but Kubernetes deployments can add significant complexity to achieving DevOps goals. Please see the Containerization section below for the new features AWS released.
- AWS CloudFormation is one of the most widely used AWS tools, enabling infrastructure as code, deployment automation, repeatability, compliance, and standardization. It allows you to model your entire infrastructure and application resources with either a text file or a programming language. CloudFormation added coverage for new resource types such as EventBridge, Quantum Ledger Database, Managed Blockchain, Lake Formation, Amplify, and Security Hub (see the sketch after this list).
- AWS also launched a new type of infrastructure deployment called AWS Local Zones. Local Zones place compute, storage, database, and other select services closer to large population, industry, and IT centers, enabling you to deliver applications that require single-digit-millisecond latency to end users.
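To make the infrastructure-as-code workflow concrete, here is a minimal sketch (assuming boto3 is installed and AWS credentials are configured) that deploys a CloudFormation template containing one of the newly supported resource types, an EventBridge rule. The stack name, resource name, and schedule are hypothetical placeholders.

```python
import boto3

# Hypothetical minimal template using the newly supported
# AWS::Events::Rule (EventBridge) resource type.
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  NightlyRule:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: rate(1 day)
      State: ENABLED
"""

cloudformation = boto3.client("cloudformation")

# Create the stack; CloudFormation provisions the EventBridge rule
# and tracks it as part of the stack's state.
cloudformation.create_stack(
    StackName="eventbridge-demo",  # hypothetical stack name
    TemplateBody=TEMPLATE,
)

# Block until the stack reaches CREATE_COMPLETE (or fails).
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName="eventbridge-demo")
```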
Containerization
Containers provide several benefits: a consistent runtime environment, the ability to run virtually anywhere, a small footprint on disk, and low overhead compared to a virtual machine. The following new container features on AWS significantly simplify container orchestration:
- Amazon EKS (managed Kubernetes service) now runs on AWS Fargate (serverless container compute). EKS on Fargate makes it straightforward to run Kubernetes-based applications on AWS by removing the need to provision and manage infrastructure for pods (see the first sketch after this list).
- AWS Fargate Spot is a new capability on AWS Fargate that can run interruption-tolerant Amazon Elastic Container Service (Amazon ECS) tasks at up to a 70% discount off the Fargate price. It uses a concept similar to EC2 Spot Instances (see the second sketch after this list).
- Amazon SageMaker Operators for Kubernetes is a new capability that makes it easier for developers and data scientists using Kubernetes to train, tune, and deploy machine learning (ML) models in Amazon SageMaker, implementing ML workflows that span Kubernetes and SageMaker. Building machine learning infrastructure on Amazon EKS with Kubeflow also provides solid solutions for authentication, training-job monitoring, user profiles, and resource quota management.
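To sketch how EKS on Fargate is wired up, the snippet below creates a Fargate profile with boto3 so that pods in a matching namespace land on Fargate instead of EC2 worker nodes. The cluster name, role ARN, subnets, and namespace are hypothetical placeholders.

```python
import boto3

eks = boto3.client("eks")

# A Fargate profile tells EKS which pods to run on Fargate: any pod
# whose namespace (and optional labels) match a selector is scheduled
# onto Fargate rather than onto EC2 worker nodes.
eks.create_fargate_profile(
    fargateProfileName="default-namespace",  # hypothetical
    clusterName="demo-cluster",              # hypothetical
    podExecutionRoleArn="arn:aws:iam::123456789012:role/eks-fargate-pods",
    subnets=["subnet-0abc", "subnet-0def"],  # hypothetical private subnets
    selectors=[{"namespace": "default"}],
)
```

Fargate Spot works similarly but is requested through an ECS capacity provider strategy. A hedged sketch, assuming an existing cluster and task definition (all names hypothetical):

```python
import boto3

ecs = boto3.client("ecs")

# Run an interruption-tolerant task on Fargate Spot by weighting
# the FARGATE_SPOT capacity provider.
ecs.run_task(
    cluster="demo-cluster",
    taskDefinition="batch-worker:1",
    count=1,
    capacityProviderStrategy=[
        {"capacityProvider": "FARGATE_SPOT", "weight": 1},
    ],
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0abc"],
            "assignPublicIp": "DISABLED",
        }
    },
)
```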
Big Data Analytics
The leading public cloud providers such as Amazon, Google, and Microsoft all offer comprehensive big data platforms to turn data into actionable insights. AWS delivered more enhancements to its managed big data analytics services. Here are the big data analytics highlights from re:Invent 2019:
- AWS Lake Formation became generally available in August 2019. AWS Lake Formation and AWS Glue consolidate and simplify the steps of ingesting, encrypting, cataloging, transforming, and provisioning data. AWS also provides the Data Lake Foundation on AWS solution, a reference architecture automated by AWS CloudFormation templates that you can customize to meet your specific requirements.
- The AWS data warehouse service Redshift received several upgrades. RA3 instances let you scale compute and storage separately: the ra3.16xlarge instances have 48 vCPUs, 384 GiB of memory, and up to 64 TB of storage, and users can also leverage a new managed storage model that uses SSD-based storage backed by S3. AQUA is a new distributed and hardware-accelerated cache that AWS says allows Redshift to run up to 10x faster than any other cloud data warehouse. Data Lake Export allows customers to unload data from a Redshift cluster to S3 in Apache Parquet format, an efficient open columnar storage format optimized for analytics (see the sketch after this list). Federated Query lets customers analyze data across the Redshift cluster, the S3 data lake, and one or more Amazon Relational Database Service (RDS) for PostgreSQL and Amazon Aurora PostgreSQL databases.
- UltraWarm is now in preview for the Amazon Elasticsearch Service. UltraWarm is a performance-optimized warm storage tier that lets you store and interactively analyze your data using Elasticsearch and Kibana while reducing your cost per GB by up to 90% over existing Amazon Elasticsearch Service hot storage options.
- AWS released the Amazon Managed Apache Cassandra Service (MCS), a scalable, highly available, managed Apache Cassandra-compatible database service that implements the Cassandra 3.11 CQL API.
- Amazon RDS Proxy is now available in preview for RDS MySQL and Aurora MySQL. It sits between your application and its database to pool and share established database connections, improving database efficiency and application scalability. This fully managed, highly available database proxy aims to relieve the challenge of database connection management for serverless applications.
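As an illustration of Data Lake Export, here is a minimal sketch that connects to a Redshift cluster over the PostgreSQL wire protocol with psycopg2 and unloads query results to S3 as Parquet. The endpoint, credentials, table, bucket, and IAM role are hypothetical placeholders.

```python
import psycopg2

# Connect to the Redshift cluster; Redshift speaks the PostgreSQL
# wire protocol on port 5439 (connection details are hypothetical).
conn = psycopg2.connect(
    host="demo-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="admin",
    password="...",
)
conn.autocommit = True

# Data Lake Export: UNLOAD query results to S3 as Apache Parquet,
# partitioned by a column, readable by Athena, Spectrum, or EMR.
unload_sql = """
UNLOAD ('SELECT * FROM sales WHERE sale_date >= ''2019-01-01''')
TO 's3://demo-data-lake/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload'
FORMAT AS PARQUET
PARTITION BY (sale_date);
"""

with conn.cursor() as cur:
    cur.execute(unload_sql)
```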
Serverless Computing
Serverless computing is a cloud computing execution model that handles scaling, capacity planning, and maintenance operations for developers. We will continue to see serverless computing advance this year. Here are highlights of the serverless features that AWS announced:
- AWS released a handful of AWS Lambda features. Improved VPC networking addresses several limitations of the previous VPC networking model and improves scale and performance. Provisioned Concurrency keeps functions initialized and hyper-ready to respond in double-digit milliseconds (see the first sketch after this list).
- AWS Step Functions has expanded its integration with SageMaker to simplify machine learning workflows and added a new integration with Amazon EMR, making it faster to build and easier to monitor EMR big data processing workflows. The new Express Workflows option is designed for high-volume, short-duration use cases, using an at-least-once execution model to deliver high performance at low cost.
- The Amazon EventBridge schema registry and discovery feature is now in preview. EventBridge is a serverless event bus that makes it easy to connect applications using data from your own applications, integrated Software-as-a-Service (SaaS) applications, and AWS services. You can use EventBridge with other AWS serverless services such as AWS Lambda, Amazon API Gateway, and AWS Step Functions to build modern applications with increased agility and lower total cost of ownership (see the second sketch after this list).
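Here is a minimal sketch of Provisioned Concurrency using boto3, assuming a published function version; the function name and version are hypothetical.

```python
import boto3

lambda_client = boto3.client("lambda")

# Keep 50 execution environments initialized for a published version
# of the function so that invocations skip cold starts.
lambda_client.put_provisioned_concurrency_config(
    FunctionName="checkout-api",          # hypothetical function
    Qualifier="12",                       # a published version or alias
    ProvisionedConcurrentExecutions=50,
)
```

And a companion sketch that publishes a custom application event onto an EventBridge bus, where rules can route it to Lambda, Step Functions, or other targets; the bus name, source, and payload are hypothetical.

```python
import json
import boto3

events = boto3.client("events")

# Publish a custom event onto a serverless event bus; rules on the
# bus match on Source/DetailType and fan out to targets.
events.put_events(
    Entries=[{
        "EventBusName": "orders-bus",            # hypothetical bus
        "Source": "com.example.orders",
        "DetailType": "OrderPlaced",
        "Detail": json.dumps({"orderId": "1234", "total": 42.5}),
    }]
)
```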
Artificial Intelligence and Machine Learning
Machine learning development is complex and expensive. The technology is still young, with challenges in building and training machine learning models and a lack of easy-to-use, integrated tools for the entire machine learning workflow. So we will continue to see heavy development in this area. AWS launched several tools under SageMaker and Augmented AI to ease the challenges of ML development:
- SageMaker removes the heavy lifting from each step of the machine learning process to make it easier to develop high-quality models. The newly launched tools are:
  - Amazon SageMaker Studio is the first fully integrated development environment (IDE) for machine learning (ML). You can quickly upload data, create new notebooks, train and tune models, move back and forth between steps to adjust experiments, compare results, and deploy models to production all in one place, making you much more productive.
  - Amazon SageMaker Experiments helps you organize, track, compare, and evaluate machine learning (ML) experiments and model versions.
  - Amazon SageMaker Debugger automatically identifies complex issues developing in machine learning (ML) training jobs. It makes the training process more transparent by automatically capturing real-time metrics during training, such as training and validation, confusion matrices, and learning gradients, to help improve model accuracy.
  - Amazon SageMaker Model Monitor automatically monitors machine learning (ML) models in production and alerts you when data quality issues appear.
  - Amazon SageMaker Autopilot automatically creates high-quality ML models with full control and visibility. It inspects raw data, applies feature processors, picks the best set of algorithms, trains and tunes multiple models, tracks their performance, and then ranks the models based on performance (see the sketch after this list).
- Amazon Augmented AI (A2I) makes it easy to build and manage human reviews for ML applications through built-in workflows for common ML use cases, such as content moderation (with Amazon Rekognition) and text extraction (with Amazon Textract). You can also create your own workflows, triggered on business conditions from AWS Lambda or directly from code in your client application.
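To illustrate the Autopilot workflow described above, here is a minimal sketch that launches an AutoML job with boto3. The job name, S3 paths, target column, and role ARN are hypothetical placeholders, not values from any AWS example.

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Launch an Autopilot job: SageMaker inspects the raw tabular data,
# tries candidate feature processors and algorithms, then trains,
# tunes, and ranks multiple models by their objective metric.
sagemaker.create_auto_ml_job(
    AutoMLJobName="churn-autopilot",                       # hypothetical
    InputDataConfig=[{
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://demo-bucket/churn/train/",  # hypothetical
            }
        },
        "TargetAttributeName": "churned",                  # hypothetical label column
    }],
    OutputDataConfig={"S3OutputPath": "s3://demo-bucket/churn/output/"},
    RoleArn="arn:aws:iam::123456789012:role/sagemaker-autopilot",
)
```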
Cloud Security
After the Capital One breach, should big business fear the public cloud? The answer is no. But enterprises should take a deep look at their side of the shared responsibility model: "security in the cloud." The public cloud providers will continue to build security capabilities that go beyond basic identity and access management. Here are highlights of the newly launched AWS security features:
- AWS IAM Access Analyzer continuously monitors and analyzes resource policies to help administrators and security teams protect their resources from unintended access. It uses a form of mathematical analysis called automated reasoning to analyze access control policies attached to resources and determine which resources can be accessed publicly or from other accounts. It offers comprehensive, detailed findings through the AWS IAM, Amazon S3, and AWS Security Hub consoles as well as through its APIs (see the sketch after this list).
- Amazon Detective can analyze trillions of events from your AWS log data and use machine learning, statistical analysis, and graph theory to build a linked set of data that enables you to conduct faster and more efficient security investigations.
- AWS Nitro Enclaves makes it easy for customers to create isolated compute environments within Amazon Elastic Compute Cloud (EC2) instances to further protect their highly sensitive workloads. With the Nitro System, each host in the core compute platform is built with trusted Nitro computers that simulate the outside world and surround an otherwise untrusted CPU and memory that run customer workloads. Those trusted Nitro computers appear to the customer workload as I/O devices accessible across the PCIe bus.
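To show what working with IAM Access Analyzer looks like, here is a minimal sketch using boto3 to create an account-level analyzer and list its active findings; the analyzer name is a hypothetical placeholder.

```python
import boto3

analyzer = boto3.client("accessanalyzer")

# Create an account-level analyzer; it continuously monitors resource
# policies (S3 buckets, KMS keys, IAM roles, and more) in this account.
response = analyzer.create_analyzer(
    analyzerName="account-analyzer",  # hypothetical name
    type="ACCOUNT",
)

# List active findings: each finding names a resource that is
# accessible from outside the zone of trust (here, the account).
findings = analyzer.list_findings(
    analyzerArn=response["arn"],
    filter={"status": {"eq": ["ACTIVE"]}},
)
for finding in findings["findings"]:
    print(finding["resource"], finding.get("principal"))
```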