If you ask anyone in the consumer electronics industry about the most important event of the year, they’ll point you without hesitation to the Consumer Electronics Show (CES), a trade show held every January in Las Vegas. That’s where manufacturers from all over the world come together to show that they are the ones setting the trends in innovation. The year 2025 is when artificial intelligence – already a staple of the show for years – moves beyond the realm of concepts and prototypes and becomes an integral part of useful products.
This year, the loudest buzz was about NVIDIA, one of the world’s largest manufacturers of processors and graphics cards. Its CEO, Jensen Huang, made a point of underscoring the company’s position as one of the biggest beneficiaries of the AI revolution. He introduced the new GeForce RTX 50 graphics cards, which many creators had been waiting for. The real sensation, however, turned out to be Project DIGITS – a supercomputer the size of a Mac Mini that lets you run language models with up to 200 billion parameters without massive infrastructure or cloud resources. At a price of $3,000, it is within reach of even small businesses and individual users, who until now were forced either to rent computing power in the cloud or to invest in expensive GPU clusters – a barrier for many innovative projects.