snips
As companies like Facebook and Google continue to push neural networks onto smartphones, phone makers will start building dedicated hardware into these devices to run those networks even faster.
Facebook
Ultimately, we were able to provide AI inference on some mobile phones at less than 1/20th of a second.
Apple
By slimming down the neural network, iPhones and iPads can identify faces and locations in photos, or understand changes in a user’s heart rate, without relying on remote servers. Keeping these processes on the phone makes the features available even without a connection, and means the data never has to be encrypted and sent over wireless networks in the first place.
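The "slimming" presumably covers the usual compression tricks such as quantization and pruning; the article doesn't say which Apple uses. As a rough illustration rather than Apple's actual pipeline, here is a minimal sketch of 8-bit weight quantization in Python with NumPy (quantize_int8 and dequantize are my own hypothetical names):

```python
import numpy as np

def quantize_int8(w):
    # Map float32 weights onto int8 with a single per-tensor scale:
    # the stored weights shrink to a quarter of their size, which is
    # the basic "slimming" idea. (Illustrative sketch, not Apple's method.)
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights at inference time.
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int8(w)
print(w.nbytes, "->", q.nbytes)                # 262144 -> 65536 bytes
print(np.abs(w - dequantize(q, scale)).max())  # small reconstruction error
```

Cutting the weight footprint to a quarter like this is part of what lets a model fit in a phone's memory and run locally, avoiding the server round-trip entirely.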
In the cloud, the same inference work is handled by FPGA coprocessors.