If it seems like artificial intelligence (AI) is everywhere lately, it is, but AI has been powering our everyday experiences for some time. When you ask Alexa to play a song, when you stride out—sandwich in hand—from an Amazon Just Walk Out-equipped store, or when you press play on a movie recommendation from Amazon Prime, you are tapping into AI. More specifically, you are interacting with machine learning (ML) models.
You have likely witnessed all the focus and attention on generative AI in recent weeks. Generative AI is a subset of machine learning powered by ultra-large ML models, including large language models (LLMs) and multi-modal models (e.g., text, images, video, and audio). Applications like ChatGPT and Stable Diffusion have captured everyone’s attention and imagination, and all that excitement is for good reason. Generative AI is poised to have a profound impact across industries, from health care and life sciences, to media and entertainment, education, financial services, and more.
At Amazon, we believe AI and ML are among the most transformational technologies of our time, capable of tackling some of humanity’s most challenging problems. That is why, for the last 20 years, Amazon has invested heavily in the development of AI and ML, infusing these capabilities into every business unit.
Today, our ML models are working on behalf of hundreds of millions of Amazon customers around the world, and providing tangible value by removing friction from supply chains, personalizing digital experiences, and making goods and services more accessible and affordable. More than 100,000 customers are using dozens of AWS AI and ML services offered in the cloud.
Amazon has much more to come, and it's coming fast. What follows are four innovations just announced to support generative AI applications:
Amazon Bedrock: Easily build generative AI applications
Amazon Bedrock is a new service for building and scaling generative AI applications: applications that can generate text, images, audio, and synthetic data in response to prompts. Amazon Bedrock gives customers easy access to foundation models (FMs), the ultra-large ML models that generative AI relies on, from top AI start-up model providers, including AI21 Labs, Anthropic, and Stability AI, as well as exclusive access to the Titan family of foundation models developed by AWS. No single model does everything. Amazon Bedrock opens up an array of foundation models from leading providers, so AWS customers have the flexibility and choice to use the best models for their specific needs.
The upshot: Amazon Bedrock is the easiest way to build and scale generative AI applications with FMs.
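To give a sense of what building on Amazon Bedrock looks like, here is a minimal sketch of invoking a hosted foundation model from Python with the AWS SDK (boto3). The model ID and request format shown are assumptions based on the Titan text models; the models and formats available to you depend on your account and Region, so treat this as an illustration rather than a recipe.

```python
import json
import boto3

# Minimal sketch: call a foundation model through Amazon Bedrock with boto3.
# The model ID and request body below are illustrative assumptions; check the
# Bedrock documentation for the models available in your account and Region.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan text model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Write a two-sentence product description for a reusable water bottle.",
        "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
    }),
)

# The response body is a stream of JSON; parse it and print the generated text.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

Because the service sits behind a single API, swapping the model ID is the main change needed to try a different provider's foundation model.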
General availability of Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips: Lowering the cost to run generative AI workloads
Ultra-large ML models require massive compute to run. Inf2 instances, powered by AWS Inferentia2 chips, offer the best energy efficiency and the lowest cost for running demanding generative AI inference workloads (such as running models and responding to queries in production) at scale on AWS.
The upshot: Lower costs and energy consumption make generative AI more accessible to a wider variety of customers.
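In practice, one way to target these chips is through the AWS Neuron SDK. The snippet below is a rough sketch, using a toy PyTorch model, of compiling a model for Inferentia2 with torch_neuronx on an Inf2 instance; a real deployment would start from your own trained model and serving stack.

```python
import torch
import torch_neuronx  # AWS Neuron SDK for PyTorch; available on Inf2 instances

# Rough sketch with a toy model: torch_neuronx.trace compiles the model for
# Inferentia2 NeuronCores, and the traced model then serves inference requests.
model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU())
model.eval()

example_input = torch.rand(1, 128)
neuron_model = torch_neuronx.trace(model, example_input)

# Run inference on Inferentia2 and save the compiled artifact for serving.
output = neuron_model(example_input)
torch.jit.save(neuron_model, "model_neuron.pt")
```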
New Trn1n instances, powered by AWS Trainium chips: Custom silicon to train models faster
Generative AI models need to be trained so they produce the right answer, image, insight, or other output the model is designed for. New Trn1n instances (the server resources where training compute happens, in this case on AWS's custom Trainium chips) offer massive networking bandwidth, which is key for training these models quickly and in a cost-efficient manner.
The upshot: Developers will be able to train models more quickly and at less expense, ultimately leading to more services powered by generative AI models.
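For orientation, requesting one of these instances looks much like requesting any other EC2 capacity. The sketch below uses boto3 with a placeholder AMI ID and key pair name (both assumptions); in practice you would pick an AWS Deep Learning AMI with the Neuron SDK preinstalled and a Region where Trn1n is available.

```python
import boto3

# Rough sketch: request a Trainium-powered Trn1n training instance with boto3.
# The AMI ID and key pair below are placeholders, not real values.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: Deep Learning AMI (Neuron)
    InstanceType="trn1n.32xlarge",    # Trainium chips with high network bandwidth
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",            # assumed existing EC2 key pair
)

print(response["Instances"][0]["InstanceId"])
```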
Free access to Amazon CodeWhisperer for individual developers: Real-time coding assistance
Imagine being a software developer with an AI-powered coding companion, making your coding faster and easier. Amazon CodeWhisperer does just that. It uses generative AI under the hood to provide code suggestions in real time, based on a user’s comments and their prior code. Individual developers can access Amazon CodeWhisperer for free, without any usage limits (paid tiers are also available for professional use with features like added enterprise security and administrative capabilities).
The upshot: What are you waiting for? It’s free! Get coding.
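To picture the workflow: a developer writes a natural-language comment or a function signature, and the coding companion proposes an implementation inline. The Python snippet below illustrates that pattern; the suggested function body is a hypothetical example for illustration, not verbatim CodeWhisperer output.

```python
import boto3


# Developer writes a comment describing the intent...
# Upload a local file to an Amazon S3 bucket and return the object URL.

# ...and the coding companion can suggest an implementation along these lines
# (hypothetical suggestion shown for illustration, not verbatim output).
def upload_to_s3(file_path: str, bucket: str, key: str) -> str:
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"
```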
These offerings are only the beginning. One of Amazon’s core principles is a commitment to long-term thinking. We believe these are the early stages of a technological revolution that will continue for decades to come. That’s why we will continue to shape the future of this technology—guided by the needs of our customers—toward the most promising, beneficial, and responsible AI applications.
To dive deeper and get all the technical details on this exciting news, be sure to read AWS VP of Database, Analytics, and Machine Learning Swami Sivasubramanian's blog.