Thursday, December 15, 2016

This snip:

In 2017, we should see the beginnings of specialized hardware for “inference” – using those graphics-trained neural networks in a real time environment. Somewhat to our surprise, our checks show that Xilinx is emerging as a leader in this segment.

I make a LOT of use of Steven Johnson's writing as a lens, with some emphasis on the Adjacent Possible as well as other concepts of his. I have been working on this stuff for many months now. Some commentary:

This has just come out of nowhere this past year and moved swiftly into Steven Johnson's area of expertise. Strictly from his work and what is at stake here, i.e. intelligence, the value of getting this on your bench is a NO LIMIT item. You just get it; you open the wallet and fork it over.

FPGA = inference.
DSP = training.
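
To make that mapping concrete, here is a rough, purely illustrative Python sketch of my own (nothing to do with actual QUIK or XLNX silicon): training is floating-point math run over and over with gradients, while inference is one fixed forward pass that can be quantized down to 8-bit integer multiply-accumulates, which is exactly the kind of repetitive arithmetic FPGA fabric is good at doing on-device.

# Illustrative toy example only: why training and inference split onto different hardware.
import numpy as np

# --- Training side (offline, big hardware): float weights, gradient descent ---
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                # toy input data
y = X @ np.array([0.5, -1.0, 2.0, 0.1])      # toy linear target
w = np.zeros(4)
for _ in range(300):
    grad = X.T @ (X @ w - y) / len(X)        # gradient of mean-squared error
    w -= 0.1 * grad                          # floating-point weight update

# --- Inference side (on-device): quantize once, then integer-only forward pass ---
W_SCALE = np.abs(w).max() / 127.0
X_SCALE = 0.05
w_q = np.round(w / W_SCALE).astype(np.int8)  # 8-bit weights, FPGA/edge friendly

def infer(x):
    """Forward pass only: 8-bit multiply-accumulate, one rescale at the end."""
    x_q = np.round(x / X_SCALE).astype(np.int8)
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32)   # integer MAC, no gradients
    return acc * W_SCALE * X_SCALE                      # back to real units

print(infer(X[0]), "vs trained-model output", X[0] @ w)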

Facebook is doing inference right now ON THE DEVICE. How? (Don't know.)

The adjacent possible and the urgency say they will try an SoC, with FPGAs on it, to run really fast inference ON the device.
IF it works, the excited talk they are putting on XLNX will also apply to QUIK IP.

GloFo came to QUIK... the interest in FPGA IP on an IoT device may start out for the bigger market TAM, but be prepared that it may shift to on-device INFERENCE.
If this comes to be noticed, the value of QUIK IP moves up in a very dramatic fashion. QUIK does NOT have to invest in the "inference engine"; others may do it. If Eos sells enough, QUIK has the bits and pieces to make the very best neural network learning/inference engine that will exist for on-device applications.

Thanks for this item, Danielle, it's great reading.

Again I can only imagine what someone like Dr. Saxe is thinking over now.
QUIK will not take its eye off of the current focus, and I realize not many want to hear of this adjacent possible until they sell something. It IS real, and this emergence of FPGA = inference is a VERY big deal, and the value of their IP is a LOT more than we have considered.
