AWS and Nvidia to partner on new AI supercomputing infrastructure

Amazon Web Services and Nvidia announced Tuesday that they are partnering on new AI supercomputing initiatives as part of their strategic collaboration.

Amazon Web Services (AWS) and Nvidia on Tuesday announced new initiatives in their strategic collaboration that will focus on adding supercomputing capabilities to the companies’ artificial intelligence (AI) infrastructure.

The announcement came at the AWS re:Invent conference and includes several notable projects. One major new initiative, known as Project Ceiba, is a supercomputer that will be integrated with several AWS services, giving Nvidia access to a comprehensive set of AWS capabilities, including Amazon Virtual Private Cloud (VPC) encrypted networking and high-performance block storage.

Project Ceiba will be used for research and development aimed at advancing AI for large language models (LLMs) and graphics (including image, video and 3D generation), as well as simulation, digital biology, robotics, self-driving cars, Earth-2 climate prediction and more.

AWS and Nvidia will also partner in powering Nvidia DGX Cloud, an AI supercomputing service that gives enterprises access to multi-node supercomputing to train complex LLMs and generative AI models. It will be integrated with Nvidia AI Enterprise software and provide customers with direct access to Nvidia’s AI experts.
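
For readers unfamiliar with what "multi-node supercomputing" means for model training, the sketch below shows a generic distributed data-parallel training loop in PyTorch. It is illustrative only: it does not use any DGX Cloud-specific interface, and the model, data and hyperparameters are placeholders.

    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        # Each process is started by a launcher such as torchrun, which sets
        # RANK, WORLD_SIZE and LOCAL_RANK; NCCL carries gradient traffic
        # between GPUs and across nodes.
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)

        # Placeholder model and data; a real LLM would also use model and
        # pipeline parallelism, not just plain data parallelism.
        model = torch.nn.Linear(4096, 4096).cuda(local_rank)
        model = DDP(model, device_ids=[local_rank])
        optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

        for step in range(100):
            batch = torch.randn(8, 4096, device=f"cuda:{local_rank}")
            loss = model(batch).pow(2).mean()
            loss.backward()          # gradients are all-reduced across all ranks
            optimizer.step()
            optimizer.zero_grad()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()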

Amazon will become the first cloud provider to offer Nvidia's GH200 Grace Hopper Superchips with multi-node NVLink technology on its Elastic Compute Cloud (EC2) platform. The Superchips will allow Amazon EC2 to provide up to 20 terabytes of memory to power terabyte-scale workloads.
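
As context, launching a GPU-backed EC2 instance is typically a single API call through boto3, the AWS SDK for Python. The snippet below is a sketch of that generic pattern; the AMI ID, key pair and instance type are placeholders, since the article does not name the instance family that will expose the GH200 Superchips.

    import boto3

    # Create an EC2 client; the region is arbitrary for this example.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Request a single GPU-backed instance. The AMI ID and instance type
    # below are placeholders, not the GH200-based offering.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder: e.g. a Deep Learning AMI
        InstanceType="p5.48xlarge",        # placeholder GPU instance type
        MinCount=1,
        MaxCount=1,
        KeyName="my-key-pair",             # placeholder key pair name
    )

    print(response["Instances"][0]["InstanceId"])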

Nvidia will also integrate its NeMo Retriever microservice into AWS to help users develop generative AI tools, such as chatbots and summarization tools, that leverage accelerated semantic retrieval.
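
Semantic retrieval, the pattern NeMo Retriever is built to accelerate, can be illustrated in a few lines: embed documents and a query as vectors, then rank documents by similarity. The sketch below uses the open-source sentence-transformers library as a stand-in; it is not the NeMo Retriever API, and the model name and sample documents are arbitrary.

    import numpy as np
    from sentence_transformers import SentenceTransformer

    # Any sentence-embedding model works for the illustration.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    documents = [
        "AWS re:Invent is an annual cloud computing conference.",
        "Grace Hopper Superchips pair a CPU and GPU over NVLink.",
        "BioNeMo helps train models for drug discovery.",
    ]
    doc_vecs = model.encode(documents, normalize_embeddings=True)

    query = "What hardware pairs a CPU and GPU over NVLink?"
    query_vec = model.encode([query], normalize_embeddings=True)[0]

    # Cosine similarity reduces to a dot product on normalized vectors.
    scores = doc_vecs @ query_vec
    best = int(np.argmax(scores))
    print(documents[best])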

Nvidia BioNeMo, which is available on Amazon SageMaker and is coming to Nvidia DGX Cloud on AWS, helps pharmaceutical companies speed up drug discovery by simplifying and accelerating the training of AI models on their own data.
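
For readers wondering what "available on Amazon SageMaker" looks like in practice, the sketch below shows the generic pattern for calling a model hosted on a SageMaker endpoint with boto3. The endpoint name and payload are hypothetical; this is not the BioNeMo interface, only the standard SageMaker invocation flow.

    import json
    import boto3

    runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

    # Hypothetical request body; real payloads depend on the deployed model.
    payload = {"sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"}

    response = runtime.invoke_endpoint(
        EndpointName="bionemo-demo-endpoint",   # hypothetical endpoint name
        ContentType="application/json",
        Body=json.dumps(payload),
    )

    result = json.loads(response["Body"].read())
    print(result)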

"Generative AI is transforming cloud workloads and putting accelerated computing at the foundation of diverse content generation," said Jensen Huang, founder and CEO of Nvidia. "Driven by a common mission to deliver cost-effective, state-of-the-art generative AI to every customer, Nvidia and AWS are collaborating across the entire computing stack, spanning AI infrastructure, acceleration libraries, foundation models, and generative AI services."

"AWS and Nvidia have collaborated for more than 13 years, beginning with the world’s first GPU cloud instance," said Amazon Web Services CEO Adam Selipsky. "We continue to innovate with Nvidia to make AWS the best place to run GPUs, combining next-gen Nvidia Grace Hopper Superchips with AWS’s EFA powerful networking, EC2 UltraClusters’ hyper-scale clustering, and Nitro’s advanced virtualization capabilities."
