AWS’ Trainium2 chips for building LLMs are now generally available, with Trainium3 coming in late 2025

At its re:Invent conference, AWS today announced the general availability of its Trainium2 (T2) chips for training and deploying large language models (LLMs). These chips, which AWS first announced a year ago, are four times as fast as their predecessors, with a single Trainium2-powered EC2 instance with 16 T2 chips providing up to 20.8 […]

