New Data-Intensive Cloud Storage from AWS
by UCStrategies Staff
Amazon Web Services announced last week improved storage capabilities that can handle data-intensive applications. The new instance type, dubbed High Storage Eight Extra Large, complements applications that work with large amounts of data, including data warehousing, Hadoop workloads, and log processing.
Amazon stated: “We know that these applications can generate or consume tremendous amounts of data and that you want to be able to run them on EC2 [Amazon Elastic Compute Cloud].”
Amazon Redshift, the data warehousing service AWS unveiled at its re:Invent conference in Las Vegas last month, will run on the new storage instances.
According to AWS, each storage instance gives users 35 EC2 Compute Units (ECUs) of compute capacity, 117 GB of RAM, and 48 TB of storage spread across 24 hard disk drives. The instance can deliver more than 2.4 GB/s of sequential I/O performance.
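For a rough sense of scale, the quoted figures can be broken down per drive. The even split across drives is an illustrative assumption, not something AWS stated:

```python
# Back-of-the-envelope breakdown of the High Storage Eight Extra Large
# specs quoted above. Assumes storage and I/O are spread evenly across
# the 24 drives (an assumption for illustration, not an AWS claim).
total_storage_tb = 48
num_drives = 24
seq_io_gb_per_s = 2.4  # sequential I/O, GB/s

per_drive_tb = total_storage_tb / num_drives           # capacity per drive
per_drive_mb_per_s = seq_io_gb_per_s * 1000 / num_drives  # throughput per drive

print(per_drive_tb, per_drive_mb_per_s)  # about 2 TB and 100 MB/s per drive
```

Those per-drive numbers are consistent with conventional hard disk drives of the era, which supports AWS's note that the instances use spinning disks rather than SSDs.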
Peter De Santis, vice president of Amazon EC2, stated: “As customers move every imaginable workload to AWS, we continue to provide them with additional instance families to meet the requirements of their applications.”
At present, Amazon's High Storage Eight Extra Large instances are available in the AWS U.S. East (Northern Virginia) Region, and will be deployed to other AWS Regions in the upcoming months.
On-demand pricing in the U.S. East Region is $4.60 per hour. Both one- and three-year reserved instances are also available for purchase (for light, medium, or heavy workloads).
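At that on-demand rate, the cost of running one instance continuously adds up quickly. The 30-day month below is an illustrative assumption; reserved-instance rates, which the article does not quote, would lower the effective hourly cost:

```python
# Illustrative on-demand cost at the quoted $4.60/hour U.S. East rate.
# A 30-day month is assumed for the estimate.
hourly_rate = 4.60
hours_per_day = 24
days_per_month = 30

monthly_cost = hourly_rate * hours_per_day * days_per_month
print(f"${monthly_cost:,.2f}")  # roughly $3,312 for a full month
```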
Furthermore, AWS announced on Friday that it is making its Data Pipeline available; this is an automated data flow service that helps businesses organize and route data between various repositories. The service was also unveiled at the re:Invent conference.