

Samsung Unveils Ultra-Slim Chips for Mobile Devices with Faster AI

Discover how Samsung’s ultra-slim LPDDR5X chips are revolutionising AI performance in mobile devices with faster processing and better heat management.


Chip Capacities and On-Device AI Processing

These new chips come in 12GB and 16GB capacities and are the thinnest 12-nanometer (nm) memory packages available. Unlike traditional memory modules, these chips are optimised to handle memory tasks directly on the device. This allows smartphones to interact with storage components more efficiently, significantly speeding up AI-related tasks.

Moreover, the faster processing helps enhance user experience when running AI-powered applications. This development is particularly important for apps requiring real-time data handling, such as voice assistants or live translation tools.

Design Impact on Device Performance

Samsung’s ultra-slim LPDDR5X design also creates extra room inside mobile devices. This space can house a larger processor dedicated to AI functions. As a result, performance increases and airflow improves, both vital benefits, given the heat generated by advanced AI applications.

Furthermore, the compact structure aids in the development of sleeker devices without compromising internal functionality. Therefore, Samsung’s innovation not only boosts performance but also supports modern device aesthetics.

 

Samsung has started mass production of these ultra-compact LPDDR5X DRAM chips. The fingernail-thin packages are designed to improve how mobile devices manage AI workloads and regulate heat.



These chips are 9% slimmer than Samsung’s earlier 12nm DRAM units. More importantly, they offer approximately 21% better heat resistance. This makes them ideal for compact devices that demand efficient thermal management.
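Samsung quotes only relative figures, so the absolute dimensions depend on the baseline. A quick sketch, assuming a hypothetical baseline package thickness (the 0.72 mm figure below is an illustration, not from the announcement), shows how a 9% reduction translates:

```python
# Hypothetical illustration of the quoted "9% slimmer" figure.
# The baseline thickness is an assumed value for demonstration only.
base_thickness_mm = 0.72          # assumed thickness of the earlier 12nm DRAM package
reduction = 0.09                  # "9% slimmer", per the announcement

new_thickness_mm = base_thickness_mm * (1 - reduction)
print(f"New package thickness: {new_thickness_mm:.3f} mm")  # prints 0.655 mm
```

The same relative-improvement reading applies to the 21% heat-resistance figure: it describes a proportional gain over the previous 12nm generation rather than an absolute thermal rating.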

Samsung continues to embed AI across its product range. For instance, its Galaxy AI software delivers multiple generative AI features to mobile users. One notable function, Circle to Search, allows users to circle any object in an image and instantly search the web for that item.

The new DRAM modules will enhance memory performance for users leveraging Galaxy AI features in future Samsung phones. Their compact form factor also enables faster processing in smaller devices like smartwatches or IoT gadgets.

“Samsung’s LPDDR5X DRAM sets a new benchmark for high-performance on-device AI. It delivers exceptional LPDDR speeds along with advanced thermal control in a tiny package,” said YongCheol Bae, executive vice president of Samsung’s memory product planning.

Looking ahead, Samsung plans to roll out 6-layer 24GB and 8-layer 32GB memory modules. These innovations aim to meet future demand in the low-power DRAM market. “We will continue to innovate in collaboration with our partners to offer solutions that shape the future of AI and mobile technology,” Bae added.
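As a quick consistency check on those planned capacities (an inference on my part, not stated in the announcement), both configurations work out to the same per-layer density, which suggests stacks built from identical 4GB (32Gb) dies:

```python
# Per-layer capacity implied by the two announced stack configurations.
modules = {6: 24, 8: 32}  # layers -> total capacity in GB, from the announcement

for layers, total_gb in modules.items():
    per_layer = total_gb / layers
    print(f"{layers}-layer / {total_gb}GB module -> {per_layer:.0f} GB per layer")
# Both configurations imply 4 GB per die layer.
```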

#SamsungAI #LPDDR5X #MobileAI #SamsungChips #GalaxyAI #OnDeviceAI #TechInnovation

 

