Nice read by AMD on their vision...
http://www.datacenterdynamics.com/c...the-data-center-must-change/97475.fullarticle
“The second thing is, like I said, we believe that for the next generation machine intelligence, it’s going to be heterogeneous processors. So it’ll be programmable CPUs, programmable GPUs, FPGAs, and special function stuff. We believe that the interconnect is a very important thing, that how these things all work with each other is going to be super important.”
We can watch to see whether QUIK's eFPGA can be a part of the heterogeneous processors of tomorrow, on the edge, not the data center.
For the casual reader.
A few visionaries, like Todd Mozer and Dr. Saxe, see some layer of intelligence on the device.
One BIG dog said clearly that they are doing inference on the device, and hopes to do a LOT more of that.
QUIK does not speak of this as part of its current activity... but we can see:
the Sensory algos are based on neural networks, so QUIK has real experience with them.
It will be fun to read who works with the eFPGAs.
A blast from the past..
Nov '15
Dr. Saxe... on the Edge. He has had it in focus for some time now.
http://embedded-computing.com/26534-five-minutes-with-timothy-saxe-sr-vpcto-quicklogic/
If you enjoy this topic then this is a great read...
https://www.wired.com/2017/02/ai-learn-like-humans-little-uncertainty/
AI Is About to Learn More Like Humans—with a Little Uncertainty
I can't grab the text to read it at Wired.
One item of commentary.
It goes like this. No one really saw the rise of AI moving forcefully to the front in '16.
So there is training: GPUs in the cloud (Nvidia).
There is inference... some are saying very loudly that FPGAs do it best.
There is Facebook saying it wants to do inference ON the device.
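The training/inference split above can be sketched in a few lines. This is only a toy illustration (not any vendor's implementation): the point is that inference is just a forward pass, so once weights are frozen in the cloud they can be quantized to 8-bit integers and evaluated with multiply-accumulates, exactly the kind of fixed-point arithmetic a small low-power FPGA fabric maps onto well.

```python
import numpy as np

def quantize(x, scale=127.0):
    """Map float weights/activations in [-1, 1] to int8."""
    return np.clip(np.round(x * scale), -127, 127).astype(np.int8)

# Pretend these weights were trained in the cloud at full float precision.
weights = np.array([0.5, -0.25, 0.75])
inputs = np.array([0.2, 0.4, -0.6])

float_out = float(np.dot(weights, inputs))  # reference float result

# On-device inference: int8 multiply-accumulates, summed in a wide
# integer accumulator, then rescaled back to the float range.
q_w = quantize(weights).astype(np.int32)
q_x = quantize(inputs).astype(np.int32)
int_out = int(np.dot(q_w, q_x)) / (127.0 * 127.0)

print(float_out, int_out)  # the quantized result closely tracks the float one
```

No backpropagation, no gradient storage, no floating point: that asymmetry is why people argue inference, unlike training, can live on the device.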
The Wired article is great at showing how fast this is moving, and that there may be new approaches based on Bayesian probabilities, which resonates with me.
But when it comes to the value of IP as it relates to intelligence, the valuation is very different.
If you need a bit/piece to try on the workbench,
you just open the wallet and buy it.
The usual metrics of valuation do NOT matter (so much) here.
So no ONE so far has made the jump, in any writing, to
eFPGA perhaps enabling inference on the device.
One of the Tier One semis who license eFPGA will try it.
IF IT works, whatever you think the IP is worth, it's worth more.
This is just a WILD card for now.
It is the Adjacent Possible now, and a WILD card that is
higher in value than many you will see.
That's what the Adjacent Possible has to say.
One key thing as I track this forward.
.... IF anyone can build a ubiquitous NNLE (neural network learning engine) that can reside on a mobile device, it is Dr. Saxe; they have ALL the bits and pieces needed to do so. The latest tidbit of info, on the FPGA as a very good place to run inference,
is new this year... someone just has to try it on their benches; I mean running the inference on an FPGA designed for low power, so it can be done on the device. With the IP $$, I am happy that Dr. Saxe will get to make that Eos S4.
It IS largely unseen and unknown.
A card so high and wild
no reason to deal another (a paraphrase of a Leonard Cohen line, and he's gone now).
I will track along; this can be skipped by those focused on wearables and smartphones, i.e. where the puck will be in '17.
Like any dealer he was watching for the card
that is so high and wild
he'll never need to deal another
He was just some Joseph looking for a manger
He was just some Joseph looking for a manger
http://www.azlyrics.com/lyrics/leonardcohen/strangersong.html
Inference on the device is just such a card.
so high and wild
he'll never need to deal another.
Who says so?
The Adjacent Possible, and the AP knows it all pretty much.
No, don't expect anyone to write about it now. And if it comes to pass, it will be too late to write about it.
So just hold what you have and see what happens.
That is what I will do.
What a great tool S. Johnson has given me, and anyone who cares to use his lens. You just don't need anything else. I won't forget it the rest of the way down the road.