
Apple's latest iPhones are packed with AI smarts

Tom Simonite

AT A GLANCE, the three new iPhones unveiled Wednesday beside Apple’s glassy circular headquarters look much like last year’s iPhone X. Inside, the devices’ computational guts got an invisible but more significant upgrade.

Apple’s new phones are built around chip technology focused on helping the devices understand the world around them using artificial-intelligence algorithms. The company says the improvements allow the new devices to offer slicker camera effects and augmented-reality experiences.

For the first time, non-Apple developers will be allowed to run their own algorithms on Apple’s AI-specific hardware. That could enliven the App Store with rich new experiences for socializing, creating art, or getting things done. Machine-learning algorithms can help apps understand and respond to what’s happening in photos and video, for example. Combined with Apple’s support for augmented reality, more AI oomph could help your iPhone transform the world around you.

All three iPhones announced Wednesday include a new chip called the A12, designed in-house by Apple. It has a unit called a neural engine, dedicated to running the neural network software behind recent advances in the ability of machines to understand speech and images.

 

Apple introduced the iPhone’s first neural engine last year inside the iPhone X, 8, and 8 Plus, which were the first major smartphones with a dedicated chip core for neural networks.

The neural engine helped power the iPhone X’s Face ID facial-recognition unlock system and the Animoji feature, which transposes a person’s facial expressions onto cartoon animals. But the powerful new hardware wasn’t plugged into Core ML, the framework Apple provides for developers who want to bake AI smarts into their own apps.

On Wednesday Apple announced that the neural engine is now significantly more powerful. Last year’s debut model could crank through 600 billion operations per second; the new version reaches 5 trillion operations per second, more than eight times as fast. Some of that speedup may come from the smaller transistors inside the A12, with features as small as 7 nanometers, down from 10 nanometers in last year’s iPhone chip. That puts the A12 a generation ahead of other smartphone chips, although mobile chipmaker Qualcomm has said its own 7-nanometer technology will ship this year.

Apple says the new neural engine helps the new phones take better pictures. When a user presses the shutter button, the neural engine runs code that tries to quickly figure out the kind of scene being photographed, and to distinguish a person from the background. That information feeds into the iPhone camera’s portrait mode. On the more expensive XS models, portrait mode includes an impressive new function that makes it possible to change the depth of field on a photo after it has been taken, blurring or sharpening the background.

 

App developers can tap the power of Apple’s new neural engine through Core ML, a framework the company offers to help programmers deploy machine learning on Apple devices. The company says this allows developers to run machine-learning code nine times faster than on the iPhone X, while using a tenth of the energy.
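
To make that concrete, here is a minimal sketch of what Core ML code looks like from a developer’s perspective, using Apple’s Vision framework to run an image classifier on the device. The model name (SceneClassifier) and its labels are hypothetical placeholders for whatever trained model a developer bundles into an app; Core ML decides at run time whether to execute it on the CPU, the GPU, or the neural engine.

    import CoreML
    import Vision
    import UIKit

    // A minimal sketch of on-device inference with Core ML and Vision.
    // "SceneClassifier" is a hypothetical stand-in for any image-classification
    // model compiled into the app; Xcode generates a Swift class for it.
    func classifyScene(in image: UIImage) {
        guard let cgImage = image.cgImage,
              let model = try? VNCoreMLModel(for: SceneClassifier().model) else { return }

        // Wrap the model in a Vision request; Vision resizes and crops
        // the input image to whatever dimensions the model expects.
        let request = VNCoreMLRequest(model: model) { request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            print("Scene: \(top.identifier), confidence: \(top.confidence)")
        }

        // Perform the request; Core ML picks the best available hardware.
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }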

 

Sports-centric app developer Nex Team, based in San Jose, got early access to some of the new Apple devices for testing. The company’s CEO, David Lee, says being able to tap the new version of the neural engine has made his basketball app HomeCourt much better.

HomeCourt analyzes video from an iPhone or iPad to automatically track and log data including shots, misses, and a person’s location on court. On the iPhone X it would take a couple of seconds to present those stats to a user of the app, Lee says. “It’s now in real time, without delay,” he says. “When you’re on court, you don’t want to wait.”
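
HomeCourt’s internal code isn’t public, but the general pattern for this kind of app is well established: capture frames with AVFoundation and hand each one to a Vision request backed by a Core ML model. The sketch below illustrates that loop under stated assumptions; BallDetector is a hypothetical object-detection model, not Nex Team’s actual software.

    import AVFoundation
    import Vision

    // Minimal sketch: run a (hypothetical) Core ML detector on live camera frames.
    // Assumes camera permission is granted; a real app would throttle inference
    // and handle errors and threading more carefully.
    final class FrameAnalyzer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private var request: VNCoreMLRequest?

        func start() throws {
            // "BallDetector" stands in for any object-detection .mlmodel.
            let model = try VNCoreMLModel(for: BallDetector().model)
            request = VNCoreMLRequest(model: model) { req, _ in
                for case let det as VNRecognizedObjectObservation in req.results ?? [] {
                    // boundingBox is in normalized image coordinates.
                    print(det.labels.first?.identifier ?? "?", det.boundingBox)
                }
            }

            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))

            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(output)
            session.startRunning()
        }

        // Called for every captured frame; hand the pixel buffer to Vision.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
                  let request = request else { return }
            try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
                .perform([request])
        }
    }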

 

Apple is not the only phone maker adding specialized chip modules for machine learning and offering them to developers. Google included a specialized image processor that can run neural-network software in its Pixel 2 phones, released last October. It powers camera features that improve photos, and was made available to developers through a software update in January.

Artificial intelligence has become a key area of competition between Google, Apple, and rivals such as Amazon. In April, Apple hired Google’s top AI boss, John Giannandrea. He is now Apple’s chief of machine learning and AI strategy, and reports directly to Tim Cook. A search for “machine learning” on Apple’s jobs site Tuesday turned up more than 400 open positions worldwide, in areas ranging from Siri to health data to manufacturing. (Wired)
