APPLE, SAMSUNG, LG, XIAOMI, OTHERS PLAN TO ‘DOUBLE DOWN’ ON ARTIFICIAL INTELLIGENCE
By Kyle Wiggers — January 24, 2017 8:06 AM
AI and machine learning are increasingly becoming the focus of smartphone development efforts.
When it comes to smartphones, voice and app controls apparently aren’t enough anymore. According to DigiTimes, Apple, Samsung, Huawei, LG, and Xiaomi plan to “double down” on the development of artificial intelligence features on mobile in order to help “ramp up market shares” in the coming months.
Apple is developing an “enhanced Siri,” the report claims, reportedly via its acquisition of machine learning and AI startup Turi. DigiTimes didn’t go into specifics, but says the improved voice assistant will feature “improved voice recognition,” enhanced “reliability,” and “better contextual understanding” of common requests. It will debut on an upcoming iPhone, either this year’s iPhone 8 or next year’s model.
An improved Siri platform would no doubt benefit the company’s long-rumored home speaker. The hardware voice assistant, which reportedly features a microphone and speaker array like Google’s Home and Amazon’s Echo, is expected to be able to play music, control smart-home devices compatible with Apple’s HomeKit platform, and set reminders and alarms.
Samsung, for its part, is expected to launch the Galaxy S8 in the coming months alongside an AI assistant code-named Bixby. It’s powered by technology the company acquired in its purchase of Viv, an artificial intelligence startup founded by much of the same engineering team responsible for Apple’s Siri, and it’s reportedly capable of conversational language and third-party integrations that perform tasks, dictate messages, and respond to queries on demand. According to more recent reports, Bixby’s processing chops will power image analysis and object identification.
Huawei, the world’s third-largest smartphone maker by volume, is reportedly prepping an expansion of its third- and first-party AI efforts. It recently launched the Mate 9 in the United States with Amazon’s Alexa voice assistant pre-installed.
And it took the wraps off the Honor Magic in December, an AI-powered smartphone that intelligently recommends movies, launches driving modes, shows information like an incoming Uber’s license plate number, and reveals text message notifications only when it recognizes the owner’s likeness.
More: Samsung Galaxy S8 rumors and news leaks
LG is said to be in talks with both Google and Amazon to incorporate elements of the companies’ respective AI assistants into several of its upcoming products. It introduced Alexa-powered home appliances earlier this year, and its next flagship smartphone, the LG G6, will reportedly be the first beyond the Pixel to feature the Google Assistant.
Chinese electronics maker Xiaomi launched an AI discovery lab in November, and is said to have invested “heavily” in machine learning. “Our artificial intelligence technology will be everywhere,” co-founder and vice-president Wong Kong Kat told the South China Morning Post on Wednesday. “Even a chair can be smart enough to understand you and move to where you would be seated.”
Even startups are investing resources in AI research and development. Andy Rubin, Google’s former head of Android, is reportedly aiming to launch a smartphone in mid-2017 with unspecified machine-learning features. “In order for AI to blossom and fulfill consumer needs, it has to be about data,” he said in June.
Read more: http://www.digitaltrends.com/mobile/smartphone-ai-news/
This is a big-picture item.
Some are saying inference = FPGA when it is best done in the cloud.
If inference is done on the device instead, will the same hold true and DRIVE eFPGA IP up, up, up in value?
Some will try it to find out.
-
Brand vendors to launch smartphones with AI applications to ramp up sales in 2017 [Members only]
When it comes to smartphones, voice and app controls apparently aren’t enough anymore
http://www.digitimes.com/tornado/v4/searchend.asp
Anybody who can grab this?
It’s nice that it’s a race with so many entrants... that EOS with a built-in LPSD must be pretty appealing...
-
AI Attracts Embedded Chip Veteran
Tools hotter than chips, says ex-Tensilica CEO
Rick Merritt
2/6/2017 00:00 AM EST
The AI field is so rich that Chris Rowen, the former Tensilica CEO, has yet to determine whether he will mainly invest in startups, incubate a few, or launch one of his own in underserved or undiscovered areas he sees.
“My goal is to be smarter than the smart money,” said Rowen, who so far has just $10 million to spend but many years running companies focused on embedded systems, now generally called the Internet of Things.
Rowen sees some of the biggest opportunities in machine-learning tools. For example, he envisions tools to squeeze down the size of neural networks, generate code for them, and map them into fixed-point math routines or integrate them into a broad application framework. “Everyone has the same problems enabling rapid deployment,” he said.
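To make the fixed-point mapping idea concrete, here is a minimal Python sketch (my own illustration under simple assumptions, not Rowen's tools or any vendor's flow): it quantizes a trained layer's float32 weights to int8 with a single per-tensor scale, then runs the layer with integer multiplies.

# Sketch of the "map into fixed-point math" step: symmetric int8
# quantization of one dense layer. Illustration only.
import numpy as np

def quantize_int8(w):
    # scale maps the largest-magnitude value onto the int8 range [-127, 127]
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_dense(x_q, x_scale, w_q, w_scale):
    # accumulate in int32, then rescale back to float for the next layer
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32).T
    return acc.astype(np.float32) * (x_scale * w_scale)

# toy example: a 4-unit dense layer on a 3-element input
w = np.random.randn(4, 3).astype(np.float32)
x = np.random.randn(3).astype(np.float32)
w_q, w_s = quantize_int8(w)
x_q, x_s = quantize_int8(x)
print("float result:", x @ w.T)
print("int8  result:", int8_dense(x_q, x_s, w_q, w_s))

The point is just that the heavy multiply-accumulate work moves onto small integers, which is what makes a network cheap enough for embedded targets; real tool flows add per-channel scales, calibration, and retraining on top of this.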
The huge data sets and models used by web giants such as Google could be reduced by a factor of 500 to fit the needs of embedded systems, according to talks at a summit Rowen hosted at Cadence.
“There was lots of evidence of rapid progress adapting what was cloud-based technology by rethinking the algorithms and optimizing the networks,” he said, noting that the event drew nearly 200 people. “It was the biggest event ever held at the Cadence campus by a wide margin.”
Among AI market sectors, the autonomous vehicle segment is well served. Underserved areas include the human/machine interface, surveillance, and augmented personal devices, he believes. One area that Rowen does not see as a good target, ironically, is silicon.
“Embedded silicon for machine learning is probably more of a big-company game than a startup game because silicon generally is a big company game,” he said. “It’s highly likely that specialized inference engines will appear as accelerator subsystems like video codecs and audio DSPs in an SoC,” he added.
Cadence and Synopsys already offer such IP blocks, and other established chip companies have offerings or are gearing them up, he noted. Silicon competition will play out in scaling performance and meeting specific market requirements.
“Mainstream players are fully invested and producing some very good stuff; some companies will design their own inference engines, but they will have to compete with mainstream suppliers who have a pretty good handle on the issues,” Rowen said.
Yup, a ubiquitous NNLE... QUIK has ALL the bits and pieces to make one... they can run the inference on the FPGA part. Make note that Brian Faith made a new point: eFPGA will move to HIGH PERFORMANCE later on. That’s very cool and is something new to put into the equation. I like my name better: NNLE. ;-)
-
Neural net apps demand significant hardware performance. (Image: Cognite)
Prospectors in the AI gold rush have their challenges. Training models can be “computationally painful,” but Amazon now offers GPU-based cloud services and others will follow, Rowen said.
The bigger problem is finding the large, structured data sets needed to train neural nets. Humongous data sets are among the core assets of web giants like Amazon, Facebook, and Google.
Embedded systems vendors will have to be innovative finding and structuring data sets. Some may create “multiple independent models and derive correct labeling from results the two agree on,” he said.
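As a rough sketch of that agreement trick (my own toy Python example, not anything from the article), two independently trained models label an unlabeled pool, and only the samples they agree on are kept as pseudo-labeled training data:

# Agreement-based pseudo-labeling: keep only the unlabeled samples
# where two independent models produce the same label.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# synthetic stand-in for "a small labeled seed set plus a large unlabeled pool"
X, y = make_classification(n_samples=5200, n_features=8, random_state=0)
X_seed, y_seed = X[:200], y[:200]
X_unlabeled = X[200:]

model_a = LogisticRegression(max_iter=1000).fit(X_seed, y_seed)
model_b = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_seed, y_seed)

pred_a = model_a.predict(X_unlabeled)
pred_b = model_b.predict(X_unlabeled)
agree = pred_a == pred_b

# samples where the two models agree become new training data
X_pseudo, y_pseudo = X_unlabeled[agree], pred_a[agree]
print(f"pseudo-labeled {agree.sum()} of {len(X_unlabeled)} samples")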
“It’s a data-driven technology that’s quite different from what other companies are used to, so people will have to go through a fundamental rethinking of the skills they need,” he added.
Pioneers also need to watch out so that they don’t get run over by fast-moving algorithms.
“This field is moving, at many times, the speed of other academic research areas because there is so much excitement that researchers are pouring into the space,” he said.
In another sign of the frenzy, Rowen recently found a list of 225 AI-related startups — just in the U.K. He has already posted a global list of nearly 200 that he has found. Many in the U.K. list are doing data analytics not necessarily related to AI, and only about a quarter are in embedded markets.
Overall, “the level of enthusiasm exceeds the level of experience by a large margin, so people have to get out there and get some battle scars to see how machine learning does or doesn’t apply to their problem,” he said. “It’s clear that it’s a big hammer that could drive a lot of nails, but it doesn’t solve everything.”
Whatever you think eFPGA IP is worth today... if inference can be embedded via eFPGA, it is worth A LOT MORE.
-
Let’s look at this snip:
“...specialized inference engines will appear as accelerator subsystems like video codecs and audio DSPs in an SoC,” he added.
On the right side of Moore’s law, with differentiation of the SoC for several years, QUIK will integrate this inference engine... it will be one of several heterogeneous cores.
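For what that could look like at a block level, here is a purely conceptual Python sketch of routing an always-on audio pipeline across heterogeneous cores; the stage and core names (LPSD, DSP, eFPGA, MCU) are my shorthand, not a real QuickLogic API:

# Conceptual model of "the inference engine as one of several
# heterogeneous cores": route each pipeline stage to the block
# best suited for it, falling back to the MCU.
STAGES = [
    ("wake_on_sound",      "LPSD"),   # low-power sound detector gates everything
    ("feature_extraction", "DSP"),    # e.g. filter-bank / MFCC front end
    ("nn_inference",       "eFPGA"),  # the "NNLE" accelerator subsystem
    ("post_processing",    "MCU"),    # thresholding, app logic, host I/O
]

def dispatch(stage, available_cores):
    """Return the core that should run a stage, defaulting to the MCU."""
    preferred = dict(STAGES)[stage]
    return preferred if preferred in available_cores else "MCU"

if __name__ == "__main__":
    cores = {"LPSD", "DSP", "eFPGA", "MCU"}
    for stage, _ in STAGES:
        print(f"{stage:>18} -> {dispatch(stage, cores)}")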