
AI in your smartphone? You’ll need faster storage.

Micron Technology | May 2018

How much intelligence can you hold in your hand? The answer is quite a lot.

2017 was the year artificial intelligence (AI) finally arrived in our smartphones. Not just in the cloud that smartphones connect to, but in the smartphones themselves.

Featuring on-device AI engines, these phones are designed to take in data from sensors efficiently, assimilate it, then store and process it locally on the device. Performing tasks such as facial recognition, activity prediction and enhanced data encryption, these phones must balance demands for additional storage and computational power against size constraints, cost effectiveness and battery life. Because the AI chips going into these phones must deliver speedy, accurate decisions from local data, they rely on faster and more innovative system memory and storage.

Real-World Use Cases for AI Smartphones

New user experiences have already emerged from the enhanced image, auditory and voice processing capabilities offered by the latest smartphones. The next wave of experiences will be apps that support new use cases for AI in smartphones, including language processing, human activity prediction, and enhanced data encryption, among others.

As facial recognition for user authentication catches on, innovators will use on-device AI to make authentication more sophisticated, yet more secure and more convenient. Previously, for instance, a photograph could be used to defeat facial recognition. Now, with multiple 3D-depth sensors and infrared cameras, smartphone user authentication is both more secure and faster.
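
To make the photo-spoof point concrete, here is a minimal sketch, assuming synthetic data and an illustrative 10 mm threshold, of why a depth channel alone defeats a flat photograph: a printed picture has almost no depth relief, while a real face spans centimeters. This is a toy illustration, not any phone vendor's actual anti-spoofing algorithm.

```python
import numpy as np

def looks_flat(depth_map_mm: np.ndarray, min_relief_mm: float = 10.0) -> bool:
    """Reject "faces" with almost no depth relief, such as a printed photo.

    depth_map_mm: per-pixel distances from a 3D-depth sensor, in millimeters.
    min_relief_mm: illustrative threshold; a real face spans several
                   centimeters of depth (nose tip vs. cheeks), a photo does not.
    """
    relief = np.percentile(depth_map_mm, 95) - np.percentile(depth_map_mm, 5)
    return relief < min_relief_mm

# Synthetic example: a flat photo held ~300 mm away vs. a real face.
photo = np.full((64, 64), 300.0) + np.random.normal(0, 0.5, (64, 64))
face = 300.0 + 40.0 * np.random.rand(64, 64)  # crude stand-in for facial relief

print(looks_flat(photo))  # True  -> reject as a spoof
print(looks_flat(face))   # False -> pass to the actual recognizer
```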

Natural language translation with on-device AI can enhance the speech recognition already present in most smartphones. Going further, local analysis and processing of phone and chat conversations can make a smartphone more responsive through intent prediction, where a person’s behavior is anticipated and a smart assistant recommends an action or a purchase. Future smartphone apps will surely move some buyer-assistance functionality from cloud-based bots to the smartphone itself, which is faster and more secure.
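
Intent prediction can be sketched in a few lines. The toy model below, with made-up event names, simply counts which action a user historically takes after a given message and suggests the most frequent one; production assistants use far richer on-device models, but the core idea is the same.

```python
from collections import Counter, defaultdict
from typing import Optional

class NextActionPredictor:
    """Tiny first-order model: count which action follows each context."""

    def __init__(self) -> None:
        self.counts: defaultdict = defaultdict(Counter)

    def observe(self, context: str, action: str) -> None:
        self.counts[context][action] += 1

    def suggest(self, context: str) -> Optional[str]:
        """Return the most frequent follow-up action seen for this context."""
        follow_ups = self.counts.get(context)
        return follow_ups.most_common(1)[0][0] if follow_ups else None

predictor = NextActionPredictor()
# Hypothetical on-device event log: (what the user just saw, what they did next).
for ctx, act in [("msg: 'running late'", "open_maps"),
                 ("msg: 'running late'", "open_maps"),
                 ("msg: 'dinner?'", "open_food_app")]:
    predictor.observe(ctx, act)

print(predictor.suggest("msg: 'running late'"))  # open_maps
```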

Integrating cloud-based AI with on-device AI expands the range of use cases even further. The University of California, Berkeley, for example, has an earthquake warning app called MyShake that uses your phone’s accelerometer (the same sensor that rotates the screen when you turn the phone sideways) and GPS to measure how much shaking is happening locally. By gathering reports from nearby MyShake users and performing consolidated analysis in the cloud, the application is meant to develop into both a personal seismometer and an early warning system.
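
The on-device half of that pipeline can be sketched simply. The detector below is a hedged stand-in for MyShake’s actual algorithm: it strips gravity out of raw accelerometer readings and reports to the cloud only when the residual motion crosses an illustrative threshold, so the phone is not constantly streaming sensor data.

```python
import math

G = 9.81  # standard gravity, m/s^2

def shaking_intensity(samples) -> float:
    """Rough shaking measure: RMS deviation of acceleration magnitude from 1 g.

    samples: iterable of (ax, ay, az) accelerometer readings in m/s^2.
    """
    residuals = [math.sqrt(ax*ax + ay*ay + az*az) - G for ax, ay, az in samples]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

still = [(0.02, -0.01, 9.80)] * 50                    # phone lying on a table
shaking = [(1.5, -0.8, 11.2), (-1.2, 0.9, 8.3)] * 25  # vigorous motion

THRESHOLD = 0.5  # illustrative, m/s^2; a real detector is tuned per device
for name, s in [("still", still), ("shaking", shaking)]:
    level = shaking_intensity(s)
    verdict = "report to cloud" if level > THRESHOLD else "ignore"
    print(f"{name}: {level:.2f} m/s^2 -> {verdict}")
```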

Smartphones Becoming Learning Machines

Powering the shift to local, on-device AI are new specialized AI processing chips, which technically perform machine learning rather than general AI. Machine learning, a subset of AI, is technology that lets machines learn for themselves over time, without manual programming, by responding to different types of data and gradually building repeatable patterns. Neural network systems help these machine-learning applications sort through data, enabling computers to classify it more quickly. In 2017, engineers learned how to add a new AI component to their systems-on-chip (SoCs) that improved the performance and efficiency of “smart” or AI-assistant tasks while remaining cost-, power- and size-effective.
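
“Learning patterns from data instead of manual programming” is easiest to see in miniature. The sketch below fits a single artificial neuron, the basic building block of those neural network systems, to labeled points; no classification rule is hand-coded, and nothing here reflects any particular SoC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled data: points above the line y = x are class 1, below are class 0.
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 1] > X[:, 0]).astype(float)

w, b = np.zeros(2), 0.0  # the "pattern" the neuron will learn

for _ in range(500):  # gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # the neuron's current predictions
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(X)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print(f"accuracy: {(preds == y).mean():.0%}")  # learned, not programmed
```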

AI Is Accelerating the Challenge of Mobile Size and Power

Of a smartphone’s components, the CPU/GPU, display and memory consume the most power, and the new AI engines now add their own power requirements on top. Consumer demand for more pixel-rich displays, and for more memory to support them, adds to the load, so battery life remains a major concern for manufacturers.

Commercial 5G network services are expected to arrive in selected cities later this year. This future of ubiquitous, ultra-high-speed wireless connectivity, with throughput up to 50 times faster and latency at least five times better than existing 4G networks, will unlock incredible possibilities for multimedia and video experiences. But mobile devices will need a more sophisticated memory subsystem to keep up with the speed and storage requirements without increasing power draw or footprint.
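
Taking the quoted multipliers at face value, and assuming purely illustrative 4G baselines of 100 Mbps throughput and 50 ms latency, the arithmetic looks like this:

```python
# Illustrative 4G baselines (assumptions); the 50x and 5x factors are from the text.
lte_peak_mbps = 100   # assumed typical 4G peak throughput
lte_latency_ms = 50   # assumed typical 4G round-trip latency

print(f"5G throughput: ~{lte_peak_mbps * 50 / 1000:.0f} Gbps")   # ~5 Gbps
print(f"5G latency:    ~{lte_latency_ms / 5:.0f} ms or better")  # ~10 ms
```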

[Image: data access speed]

A Dedicated AI Engine Needs Fast Storage

Local AI processing will add to the requirements for memory size and storage capacity. More importantly, as more AI-specific applications are written, the need for faster storage performance will grow dramatically.
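
Why random-read performance in particular? AI workloads tend to pull model weights and feature data from many scattered locations rather than one contiguous stream. The micro-benchmark below is a rough desktop sketch of that access-pattern difference; on a PC the OS page cache blunts the gap, but on mobile flash, scattered reads are exactly where a faster storage interface pays off.

```python
import os
import random
import tempfile
import time

BLOCK, BLOCKS = 4096, 4096  # 4 KiB blocks, 16 MiB test file

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * BLOCKS))
    path = f.name

def read_blocks(offsets) -> float:
    """Time reading one 4 KiB block at each offset, in the order given."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
scattered = random.sample(sequential, len(sequential))  # same blocks, shuffled

print(f"sequential reads: {read_blocks(sequential):.3f} s")
print(f"random reads:     {read_blocks(scattered):.3f} s")
os.remove(path)
```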

3D NAND is becoming the go-to storage solution for mobile devices that require high density and high capacity in a small footprint. The latest 64-layer versions of 3D NAND stack layers of data storage cells vertically to create storage devices with up to six times higher capacity than legacy 2D planar NAND technologies.

Additionally, the latest 3D NAND memory devices use the high-performance UFS storage interface, which can process read and write commands simultaneously (full duplex) and delivers faster random reads than the previous-generation e.MMC 5.1 interface, which handles commands one at a time. This combination of 3D NAND silicon and the speedy UFS interface packs more storage into a smaller die area, bringing significant cost savings, low power usage and high performance to mobile devices equipped with AI.
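
A back-of-the-envelope model shows why full duplex matters for mixed workloads. The numbers below are hypothetical; the point is only that when reads and writes must take turns, their times add, and when they can overlap, the slower stream sets the pace.

```python
def completion_time(read_s: float, write_s: float, full_duplex: bool) -> float:
    """Time to drain overlapping read and write work on one storage device.

    Half duplex (e.MMC-style): commands take turns, so the times add.
    Full duplex (UFS-style):   reads and writes proceed concurrently,
                               so the slower stream dominates.
    """
    return max(read_s, write_s) if full_duplex else read_s + write_s

# Hypothetical mixed workload: 3 s of reads and 2 s of writes outstanding.
print(completion_time(3.0, 2.0, full_duplex=False))  # 5.0 -> e.MMC-style
print(completion_time(3.0, 2.0, full_duplex=True))   # 3.0 -> UFS-style
```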

A Shining Future

Smart-assistant features and functionality on smartphones must support fast yet precise decisions drawn from constant flows of data. Slow storage and memory mean slow AI performance, longer wait times and a fast-draining battery. The good news is that memory and storage innovations are delivering faster I/O operations and near-real-time AI calculation, feeding the growing data needs of these AI engines to create a powerful user experience.