Amazon is making its own AI chips to rival Nvidia



Amazon is reportedly developing its own AI processor to lessen its reliance on Nvidia, the company that dominates the AI chip business, and to put competitive pressure on it.

The e-commerce giant is reportedly preparing to debut the chips as it attempts to claim a larger share of the AI industry and consolidate its position in the AI race.

Amazon has already developed some in-house processors

According to the Financial Times, Amazon has already developed several in-house processors custom-built for its data center workloads.

Most of the details about the new AI processor are expected to be released next month at an event covering the firm's Trainium chip line-up.

Reports are that Anthropic, the AI startup in which Amazon has invested, is already using the chips developed by Annapurna Labs. Annapurna Labs is Amazon's in-house chip design unit, while the Anthropic partnership gives the e-commerce and cloud computing giant access to Anthropic's Claude foundation models.

The tech giant is pushing to reduce its dependence on Nvidia, which remains the only firm producing the most powerful and dependable AI processors. Nvidia's GPUs may be the best on the market, but high demand and short supply make them expensive and drive up the cost of AI development.

The shortage of top-performing AI processors has made them some of the most sought-after and expensive products in the world.

The Financial Times reported that, for Amazon, developing AI chips in-house is a strategy to reduce its dependence on Nvidia.

Amazon wants to reduce the cost of relying on other firms' chips

Amazon is, however, not new to custom chip production. Its acquisition of Annapurna Labs has enabled the firm to consistently offer alternatives that reduce the cost of relying on AMD and Intel processors for traditional data center workloads.

Graviton processors, designed by Annapurna, sit alongside Amazon's custom AI processors, branded Trainium. Amazon unveiled Trainium 2, which is designed for training large language models, in November 2023.

Just as with those alternative data center processors, Annapurna is now at the forefront of Amazon's effort to reduce its dependence on Nvidia's GPUs, currently the industry's hardware of choice.

Although the Financial Times report did not reveal much about the chips' features, it said the firm will use the upcoming event covering its Trainium 2 chips to make the announcement. Anthropic is already using Trainium 2, but the chip has been in short supply despite launching in 2023.

According to Amazon, more than 500,000 AWS customers were using its Graviton chips in 2023.

Other tech giants, including Google parent Alphabet and Facebook owner Meta, have also moved into developing AI chips, though some big tech firms, such as Apple, still rely on rivals' hardware. Earlier this year Meta launched the second generation of its Meta Training and Inference Accelerator (MTIA); these in-house chips likewise reduce dependence on Nvidia's GPUs.

The Microsoft-backed OpenAI is also reportedly exploring developing its own chips. In the same vein, Google launched its latest tensor processing unit (TPU) AI chip, Trillium, early this November.

The new Google TPUs are said to be four times faster in AI training performance and three times faster in inference than their predecessors.

Amazon, in the meantime, has extended its efforts beyond hardware development. In July, the tech firm announced that it is working on an AI chatbot called Metis, which is projected to operate with double the parameters of OpenAI's GPT-4.




