Friday, February 10, 2017


Please throw the next post away as a useless digression, but consider there might be one grain of value to it.

Pay attention to the details of Bixby on the Galaxy S8.

The geek sites hold one detail that is so CRUCIAL (if true):

There is a BIX button...you turn it on....you turn it off....


A dead giveaway that it runs on the AP, because it KILLS the battery. You can have Bixby, but you turn him on...then ask something...and when done you turn it off...

IF that's the way it IS, then it really is bad... Alexa, on the other hand, is always on; just say "Alexa." Who wants to push a button to say Hey Bix?

So for a moment consider a Tier 1 wearable that has a DORMANT, always-on capability. No need to push a button... WAAAAY better than the BIX button.
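
Just to make the contrast concrete for myself, a toy sketch in Python... every name in it is made up by me, not how Samsung or Amazon actually do it. The point is only that a dormant, milliwatt-class listener can gate the battery hog, so the AP sleeps until the wake word arrives.

import time

def low_power_listen(frame):
    # Stand-in for an always-on, milliwatt-class sound detector (an LPSD-style block).
    # Here it just checks for a wake word in a simulated audio "frame".
    return "hey bix" in frame.lower()

def run_full_assistant(frame):
    # Stand-in for the big, battery-hungry recognizer running on the AP.
    print("AP woken up, handling request:", frame)

simulated_audio = ["silence", "silence", "Hey Bix, what's the weather?", "silence"]

for frame in simulated_audio:
    if low_power_listen(frame):       # the dormant stage is always on and costs little
        run_full_assistant(frame)     # the expensive part runs only when triggered
    time.sleep(0.01)                  # the rest of the time the AP can stay asleep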

Samsung needs a test vehicle to help them lose that Bix button...

Based on the geek sites... so IF the wearable can be always on... it can move through Samsung...

Also, I have such a keen focus on S. Johnson's concept of the "Adjacent Possible" that honestly I understand people like G. Gilder... Futurists do NOT divine from God; they just read a TON and divine from the adjacent possible. They are like Carver Mead...they have enough background that they can listen to the technology.

QUIK has ALL THE BITS AND PIECES TO BUILD A NNLE (neural network learning engine), what one guy calls an inference engine.

May they live to actually build this thing...

IF they do, the adjacent possible says that whatever you think QUIK is worth...?

Multiply it...that's why I have been here, am here now, and will be here tomorrow.

The adjacent possible on the next Q?

" Are u kidding Jfieb." The street just don't matter to adjacent possible."

And for me it does resonate.

Thursday, February 9, 2017

  1. jfieb


    A repost from Oct of '15





    QUIK will add layers of intelligence on top of what we have now.
    Sensory shows us the benefit of keeping some intelligence local.


    ******************************************

    This idea is looming large now. To have the right bits and pieces on the bench is so important; if you don't
    have them, you just open the wallet and fork it over.

    It IS NOW, for those working at the benches of AI; it's the flux capacitor of AI.

    QUIK got itself on the right side of Moore's Law just in time...get established as an SoC and integrate a NNLE/inference maker onto that SoC.

    If you create embedded intelligence it is worth more...higher margin. M & A value UP.
I'll try to say it again, as it helps me to put things on one page to see it.

CH


"Reduced integration risk: With its sensor hub, QUIK has positioned itself on the
right side of Moore’s Law. Sensor hubs are emerging as a permanent element of
mobile/IoT architectures that should not only resist integration, but be able to
integrate other functions."


A snip on embedded AI:

“specialized inference engines will appear as accelerator subsystems like video codecs and audio DSPs in an SoC,” he added.

Xilinx on inference


Once training is complete, the mode changes to “inference,” where the system applies its knowledge to situations in the real world. According to Xilinx, the big potential market in AI is inference...

Embedded inference will run in an inference engine, a heterogeneous core on an SoC. Will that core be built on FPGA/eFPGA...
the adjacent possible says that they will surely try it... and IF it works, the value of this IP goes


UP.
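
To picture what that heterogeneous inference core would actually spend its cycles on, a toy NumPy sketch of the fixed-point multiply-accumulate math that dominates inference... generic textbook stuff from me, NOT QUIK's design, and the layer sizes and scale are invented.

import numpy as np

rng = np.random.default_rng(0)

# A tiny fully connected layer that has already been quantized to 8-bit integers.
weights_q = rng.integers(-128, 128, size=(16, 8), dtype=np.int8)
inputs_q  = rng.integers(-128, 128, size=8, dtype=np.int8)
scale     = 0.02   # invented scale that maps int8 values back to real numbers

# The core operation: integer dot products, accumulated in 32 bits.
acc = weights_q.astype(np.int32) @ inputs_q.astype(np.int32)

# Rescale to real-valued activations and apply a ReLU.
activations = np.maximum(acc * (scale * scale), 0.0)
print(activations)

This kind of regular, parallel integer math is exactly what maps well onto FPGA/eFPGA fabric.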

It has also been Dr. Saxe's adjacent possible since the fall of '15. BUT NOBODY saw the extent of the rise of AI/neural networks/inference/FPGA.

Is this factored in today?

Nada, QUIK has its plate full of important items and HAS NOT said anything. No hype, but it is out there.


For me, one key takeaway is the power of reading and using S. Johnson's lens of the adjacent possible, as it sort of tells you way ahead of time.


Open question: the high-performance ArcticPro 2...what is it aimed at? A competitor to Xilinx, i.e. the data center, or do they extrapolate to high function on the device...say the embedded inference engines of tomorrow?

My vote is........

Wednesday, February 8, 2017


  1. APPLE, SAMSUNG, LG, XIAOMI, OTHERS PLAN TO ‘DOUBLE DOWN’ ON ARTIFICIAL INTELLIGENCE
    By Kyle Wiggers — January 24, 2017 8:06 AM


    AI and machine learning are increasingly becoming the focus of smartphone development efforts.

    When it comes to smartphones, voice and app controls apparently aren’t enough anymore. According to DigiTimes, Apple, Samsung, Huawei, LG, and Xiaomi plan to “double down” on the development of artificial intelligence features on mobile in order to help “ramp up market shares” in the coming months.

    Apple is developing an “enhanced Siri,” the report claims, reportedly via its acquisition of machine learning and AI startup Turi. DigiTimes didn’t go into specifics, but says the improved voice assistant will feature “improved voice recognition,” enhanced “reliability,” and “better contextual understanding” of common requests. It will debut on an upcoming iPhone, either this year’s iPhone 8 or next year’s model.


    An improved Siri platform would no doubt benefit the company’s long-rumored home speaker. The hardware voice assistant, which reportedly features a microphone and speaker array like Google’s Home and Amazon’s Echo, is expected to be able to play music, control smart-home devices compatible with Apple’s HomeKit platform, and set reminders and alarms.

    Samsung, for its part, is expected to launch the Galaxy S8 in the coming months alongside an AI assistant code-named Bixby. It’s powered by technology the company acquired in its purchase of Viv, an artificial intelligence startup founded by much of the same engineering team responsible for Apple’s Siri, and it’s reportedly capable of conversational language and third-party integrations that perform tasks, dictate messages, and respond to queries on demand. According to more recent reports, Bixby’s processing chops will power image analysis and object identification.

    Huawei, the world’s third-largest smartphone maker by volume, is reportedly prepping an expansion of its third- and first-party AI efforts. It recently launched the Mate 9 in the United States with Amazon’s Alexa voice assistant pre-installed.

    And it took the wraps off the Honor Magic in December, an AI-powered smartphone that intelligently recommends movies, launches driving modes, shows information like an incoming Uber’s license plate number, and reveals text message notifications only when it recognizes the owner’s likeness.

    More: Samsung Galaxy S8 rumors and news leaks

    LG is said to be in talks with both Google and Amazon to incorporate elements of the companies’ respective AI assistants into several of its upcoming products. It introduced Alexa-powered home appliances earlier this year, and its next flagship smartphone, the LG G6, will reportedly be the first beyond the Pixel to feature the Google Assistant.

    Chinese electronics maker Xiaomi launched an AI discovery lab in November, and is said to have invested “heavily” in machine learning. “Our artificial intelligence technology will be everywhere,” co-founder and vice-president Wong Kong Kat told the South China Morning Post on Wednesday. “Even a chair can be smart enough to understand you and move to where you would be seated.”

    Even startups are investing resources in AI research and development. Andy Rubin, Google’s former head of Android, is reportedly aiming to launch a smartphone in mid-2017 with unspecified machine-learning features. “In order for AI to blossom and fulfill consumer needs, it has to be about data,” he said in June.



    Read more: http://www.digitaltrends.com/mobile/smartphone-ai-news/#ixzz4Y8CXjbcp



    This is a big-picture item.
    Some are saying Inference = FPGA if it is done best in the cloud.
    If inference is done on the device, will it be the same and DRIVE eFPGA IP up, up, up in value?

    Some will try it to find out. :)
  2. jfieb



    Brand vendors to launch smartphones with AI applications to ramp up sales in 2017 [Members only]


    When it comes to smartphones, voice and app controls apparently aren’t enough anymore



    http://www.digitimes.com/tornado/v4/searchend.asp

    Anybody who can grab this?

    It's nice that it's a race with so many entrants...that Eos with a built-in LPSD must be pretty appealing...
  3. jfieb




    AI Attracts Embedded Chip Veteran

    Tools hotter than chips, says ex-Tensilica CEO

    Rick Merritt

    2/6/2017 00:00 AM EST


    The AI field is so rich that Rowen has yet to determine whether he will mainly invest in startups, incubate a few, or launch one of his own in underserved or undiscovered areas he sees.

    “My goal is to be smarter than the smart money,” said Rowen, who so far has just $10 million to spend but many years running companies focused on embedded systems, now generally called the Internet of Things.


    Rowen sees some of the biggest opportunities in machine-learning tools. For example, he envisions tools to squeeze down the size of neural networks, generate code for them, and map them into fixed-point math routines or integrate them into a broad application framework. “Everyone has the same problems enabling rapid deployment,” he said.

    The huge data sets and models used by web giants such as Google could be reduced by a factor of 500 to fit the needs of embedded systems, according to talks at a summit Rowen hosted at Cadence.





    “There was lots of evidence of rapid progress adapting what was cloud-based technology by rethinking the algorithms and optimizing the networks,” he said, noting that the event drew nearly 200 people. “It was the biggest event ever held at the Cadence campus by a wide margin.”

    Among AI market sectors, the autonomous vehicle segment is well served. Underserved areas include the human/machine interface, surveillance, and augmented personal devices, he believes. One area that Rowen does not see as a good target, ironically, is silicon.

    “Embedded silicon for machine learning is probably more of a big-company game than a startup game because silicon generally is a big company game,” he said. “It’s highly likely that specialized inference engines will appear as accelerator subsystems like video codecs and audio DSPs in an SoC,” he added.

    Cadence and Synopsys already offer such IP blocks, and other established chip companies have or are gearing up offerings, he noted. Silicon competition will play out in scaling performance and meeting specific market requirements.

    “Mainstream players are fully invested and producing some very good stuff; some companies will design their own inference engines, but they will have to compete with mainstream
    suppliers who have a pretty good handle on the issues,” Rowen said.



    Yup, a ubiquitous NNLE...QUIK has ALL the BITS and pieces to make one...they can run the inference on the FPGA part. Make note that Brian Faith made a new point...eFPGA will move to HIGH PERFORMANCE later on. That's very cool and is something new to put into the equation. I like my name, NNLE, better. ;-)
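
    To see what Rowen's "squeeze it down and map it into fixed-point routines" tooling amounts to, a rough sketch I put together in Python/NumPy... real tools do far more (calibration data, per-channel scales, retraining), so treat the numbers as illustrative only.

import numpy as np

rng = np.random.default_rng(1)
float_weights = rng.normal(0.0, 0.3, size=(16, 8)).astype(np.float32)

# Pick a single scale so the largest-magnitude weight maps onto the int8 range.
scale = np.abs(float_weights).max() / 127.0
int8_weights = np.clip(np.round(float_weights / scale), -128, 127).astype(np.int8)

# Reconstruct to see how much accuracy the squeeze cost.
reconstructed = int8_weights.astype(np.float32) * scale
print("max quantization error:", float(np.abs(reconstructed - float_weights).max()))
print("storage shrinks 4x just going from float32 to int8")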

  4. jfieb


    pg 2


    [Image: Neural net apps demand significant hardware performance. (Image: Cognite)]


    Prospectors in the AI gold rush have their challenges. Training models can be “computationally painful,” but Amazon now offers GPU-based cloud services and others will follow, Rowen said.

    The bigger problem is finding the large, structured data sets needed to train neural nets. Humongous data sets are among the core assets of web giants like Amazon, Facebook, and Google.

    Embedded systems vendors will have to be innovative finding and structuring data sets. Some may create “multiple independent models and derive correct labeling from results the two agree on,” he said.

    “It’s a data-driven technology that’s quite different from what other companies are used to, so people will have to go through a fundamental rethinking of the skills they need,” he added.

    Pioneers also need to watch out so that they don’t get run over by fast-moving algorithms.

    “This field is moving, at many times, the speed of other academic research areas because there is so much excitement that researchers are pouring into the space,” he said.

    In another sign of the frenzy, Rowen recently found a list of 225 AI-related startups — just in the U.K. He has already posted a global list of nearly 200 that he has found. Many in the U.K. list are doing data analytics not necessarily related to AI, and only about a quarter are in embedded markets.

    Overall, “the level of enthusiasm exceeds the level of experience by a large margin, so people have to get out there and get some battle scars to see how machine learning does or doesn’t apply to their problem,” he said. “It’s clear that it’s a big hammer that could drive a lot of nails, but it doesn’t solve everything.”
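
    A purely illustrative sketch of that "two models agree" labeling idea from a few paragraphs up... the models below are toy rules I made up, not anything Rowen, Cadence, or QUIK has published.

import numpy as np

rng = np.random.default_rng(2)
unlabeled = rng.normal(size=(100, 2))             # pretend feature vectors with no labels

def model_a(x):
    # Toy classifier #1: threshold on the first feature, with its own noise.
    return (x[:, 0] + 0.1 * rng.normal(size=len(x)) > 0).astype(int)

def model_b(x):
    # Toy classifier #2: same idea, independent noise, so it sometimes disagrees.
    return (x[:, 0] + 0.1 * rng.normal(size=len(x)) > 0).astype(int)

labels_a, labels_b = model_a(unlabeled), model_b(unlabeled)
agree = labels_a == labels_b                       # keep only samples both models agree on
pseudo_x, pseudo_y = unlabeled[agree], labels_a[agree]
print(f"kept {agree.sum()} of {len(unlabeled)} samples as pseudo-labeled training data")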


    Whatever you think eFPGA IP is worth today...if inference can embed via eFPGA, it is worth A LOT MORE.
  5. jfieb


    Let's look at this snip:

    “specialized inference engines will appear as accelerator subsystems like video codecs and audio DSPs in an SoC,” he added.

    On the right side of Moore's Law, with differentiation of the SoC for several years, QUIK will integrate this inference engine...one of the heterogeneous cores it will be.
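
    To close, a toy sketch of the dispatch pattern behind "one of the heterogeneous cores"... inference goes to a dedicated accelerator (eFPGA-based or otherwise) when one is there, and falls back to the CPU when it is not. Every name is invented by me; none of this is QUIK code.

import numpy as np

def cpu_infer(weights, x):
    # Plain floating-point path on the application processor.
    return np.maximum(weights @ x, 0.0)

def accelerator_infer(weights, x):
    # Stand-in for an eFPGA/NNLE offload; here it simply mimics the CPU result.
    return np.maximum(weights @ x, 0.0)

def run_inference(weights, x, have_accelerator):
    # The SoC picks whichever engine is available for this workload.
    engine = accelerator_infer if have_accelerator else cpu_infer
    return engine(weights, x)

w = np.full((4, 3), 0.5)
x = np.array([1.0, -2.0, 3.0])
print(run_inference(w, x, have_accelerator=True))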