Friday, January 19, 2018

Consider that intelligence on the edge is discussed in so many places. The original AI crystallization occurred when Goog beat the best Go players.
What was neat is that almost no one saw it coming.


CNNs - convolutional neural networks - should be the way it happens on the edge.

Why? Small footprint.
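
To make "small footprint" concrete, here is a rough back-of-envelope sketch (the layer sizes below are made up for illustration, not any shipped model): a tiny three-layer CNN quantized to int8 comes in around tens of kilobytes, small enough to live entirely on an edge device.

```python
# Back-of-envelope footprint of a small CNN, assuming int8 (1-byte) weights.
# The layer shapes are illustrative only.
layers = [
    # (name, in_channels, out_channels, kernel_h, kernel_w)
    ("conv1", 1,  16, 3, 3),
    ("conv2", 16, 32, 3, 3),
    ("conv3", 32, 64, 3, 3),
]
fc_in, fc_out = 64, 10          # final fully connected layer: 64 features -> 10 classes

conv_params = sum(cin * cout * kh * kw + cout      # weights + biases per conv layer
                  for _, cin, cout, kh, kw in layers)
fc_params = fc_in * fc_out + fc_out

total = conv_params + fc_params
print(f"parameters: {total:,}")                    # ~24,000 parameters
print(f"int8 size:  {total / 1024:.1f} KiB")       # ~23 KiB, fits comfortably on-chip
```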

Intelligence on the edge WILL be ubiquitous. If it's ubiquitous, it will get hardened...

Some proof of that?

Yes, here is a snip of text...


Very, very important...

Most notably, Apple's neural engine and HiSilicon's neural processing unit lead the pack with already-shipping silicon. These new IP blocks are hardware accelerators for convolutional neural network inferencing. As opposed to what we call "deep learning," which is the training aspect of CNNs, inferencing is the execution of already-trained models.

Use-cases such as image classification are very latency- and performance-sensitive, so the industry has evolved towards edge device inferencing, meaning a device such as a smartphone locally has a trained neural network model and does the inferencing and classification locally on the device, without having to upload images or other content to the cloud. This vastly improves latency and makes use-cases such as instantaneous camera scene recognition viable (as used on Huawei's Mate 10 camera).


As with other IP blocks (image and video encoders/decoders), which accelerate and offload workloads from more general-purpose blocks such as the CPU and GPU in a much more specialized and thus faster and more efficient way, neural network inference can also be offloaded. This is what we call neural network accelerators, such as CEVA's NeuPro.
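
If it helps to see what "edge device inferencing" from the snip looks like in practice, here is a minimal sketch using TensorFlow Lite as a stand-in runtime (the model file name is hypothetical; any already-trained, converted image classifier would do). The point is simply that the trained model is loaded and executed entirely on the device, with no cloud round trip.

```python
# Minimal sketch of on-device inferencing with an already-trained CNN,
# using TensorFlow Lite. "scene_classifier.tflite" is a hypothetical model file.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="scene_classifier.tflite")
interpreter.allocate_tensors()

input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]

# A camera frame would normally go here; random data stands in for it.
frame = np.random.rand(*input_detail["shape"]).astype(input_detail["dtype"])

interpreter.set_tensor(input_detail["index"], frame)
interpreter.invoke()                                   # runs locally on the device
scores = interpreter.get_tensor(output_detail["index"])
print("predicted class:", int(np.argmax(scores)))
```

On phones with a dedicated NPU or DSP, the same call graph is simply delegated to the accelerator block instead of running on the CPU.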



If you are still with the technology here... make the jump, you can do it.
Nobody has been talking about running inference on the edge on eFPGA.
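
For anyone wondering what would actually get mapped onto the fabric: trained-model inference boils down to enormous numbers of low-precision multiply-accumulate (MAC) operations, exactly the kind of regular, parallel arithmetic that FPGA lookup tables and DSP slices are good at. A toy sketch of that inner loop (pure Python, purely illustrative):

```python
# Toy illustration of the inner loop of quantized inference: an int8
# multiply-accumulate (MAC) reduction. On FPGA/eFPGA fabric this loop is
# unrolled into many MAC units running in parallel.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.integers(-128, 127, size=1024, dtype=np.int8)
activations = rng.integers(-128, 127, size=1024, dtype=np.int8)

# Accumulate in 32 bits, as hardware accumulators do, to avoid overflow.
acc = np.sum(weights.astype(np.int32) * activations.astype(np.int32))
print("dot product:", int(acc))
```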


They run inference in the cloud on FPGAs, so why not on the edge?

One reason is that the CEVAs of the world DO NOT have it on their bench to even try it.

Someone WILL try it. Maybe QUIK is having some fun on their bench, as that is what they have.
Is it significant that QUIK has NOT said one word about this?

Did they say anything about making the FFE?
They can keep a secret very well.



Inference on the edge puts QUIK IP at a much higher valuation than we think of. It is a multiple of the current market value.
I have my subjective multiplier, but it's my own and won't help someone else. :)


QUIK, can you run inference on eFPGA on the S4?

Thanks in advance for the consideration. jfieb.
Current issue of Time magazine.


Baidu's Robin Li is Helping China Win the 21st Century



When he was interviewed for school in the USA...

In 1992, when the Baidu CEO was a tongue-tied Chinese student applying for a computer-graphics graduate program in the U.S., the interviewing professor asked him, "Do you have computers in China?" It left the young man stunned.



I don't blame him for being stunned; who would ever forget that? How much have we really changed?


One short snip...




“We want to use this global stage to show international partners how we innovate at China speed, and how we enable our partners to create very innovative products,” Jing says.


On voice...


"Our vision is that humans can interact with all devices using human language," says Li. "The difference between humans and animals is that humans can use tools."







For the casual reader?

Digressive reading has always been important. In '18 that will be all over the place, and voice moves from one market into another.
For me it is FAR more rewarding than sitting around watching the price of a stock bounce around and taking cues from that.
I learn a TON and may also turn up something as good as QUIK.


There are reasons that voice will move farther and faster in China - the Baidu phrase "China speed" is real.


Things of real importance for the QUIK faithful?

We can make a short list of the BIG ecosystems and ideally we want QUIK to be on a device or two for ALL of them.

1. Alexa - yes, we have that one.

2. Goog - not yet.

3. WeChat/Tencent - yes, we have that one.

4. Baidu - not yet.


Now Goog and Baidu have one thing in common: they want 3rd parties to make the multitude of devices that have always-on listening in them.
SO I will track along and highlight when we have a 3rd-party device in one of these HUGE platforms.



Consider that our growing number of QUIK-based devices in China lends legitimacy for the other 3rd-party device makers.
It will be easier to make a decision, i.e., with ALL these others on board, it's just a good decision.

So expect that we will be on Baidu devices sometime yet this year.
From a WeChat slide deck on this thread...


[WeChat slide image]

So though they are hold-to-talk from inception, they may well lose the button one of these days and go always on... Key point?

Input into devices has NOT been good for China up till now. Voice should resonate even more in China, and move farther and faster. How many in the West would understand this simple key point alone?








In the USA the thought process is too focused on, and perhaps biased by, the old thinking that Silicon Valley leads in all things high tech.





Baidu phrase I learned this week?

"China speed".

What we have seen so far from just China will be underappreciated because of the Western bias in understanding.
The slots we have may just be the tip of something much bigger. Voice will move farther and faster in Asia...

QUIK, did you 2x the pipeline at CES?

What have I been reflecting on?

Smartphones - as the China-speed always-listening devices take root, they start to make the smartphone look a little dated. Not so up to date, maybe antiquated from this point of view.
The adjacent possible - it is easy to look across the bench and see an always-listening hearable -
"Hey, why NOT give it a try on a smartphone?" Just to see what you get? The MORE pervasive always-listening devices become, the more likely it is that a flagship WILL one day roll with it.

So I will track along.


The first might be someone who has a vision of voice as a platform on all things, pervasively... NTT DOCOMO is a carrier with just that vision.

Monday, January 15, 2018


THURSDAY, OCTOBER 19, 2017
UC Berkeley researchers report challenges, solutions facing artificial intelligence
Electrical Engineering and Computer Sciences faculty members and researchers from UC Berkeley’s Real-Time Intelligent Secure Execution Lab, or RISELab, released a report Monday outlining challenges facing the progress of artificial intelligence technologies as well as some ways to address them.

RISELab, which launched in January, is the most recent installment of the campus computer science division’s ongoing tradition of five-year collaborative research labs. This particular lab is focused on machine learning, security and structuring computers, according to RISELab member and campus professor of computer science David Patterson.

Despite its youth, RISELab has many expert faculty members and leading researchers who have conducted research in computer science for most of their careers. Patterson, for instance, has dedicated his 40-year career to computer science development and has participated in about eight labs on campus.

In their report, consisting of about six months of work, RISELab’s researchers defined several major challenges facing technologies with artificial intelligence, or AI. One challenge lies within the hardware of computers as computer processor performance declines, according to EECS professor and RISELab director Ion Stoica.

“The amount of data that we have is increasing exponentially, but the capabilities of processors aren’t as fast as they should be,” Stoica said.

The “laws of physics,” RISELab co-founder and campus assistant EECS professor Joseph Gonzalez added, will not allow processors to be made faster, and there is a need for more resources to meet the demands of computation. According to Patterson, the decreasing efficiency of computer processors impedes AI development, and professionals and researchers will have to find a way to use machine learning to process larger amounts of data.

Gonzalez also identified AI security as a major concern, citing the growth of “big-data technology,” or technology that deals with unusually large sets of data. Stoica added that although using more data in computers is better for creating personalized accommodations — such as how Netflix uses big data to suggest movies for users — there is also an impact on the user’s privacy.

“As AI becomes more and more important, and has more impact on our lives, we need the systems to be not only intelligent, but also robust, explainable and secure,” Stoica said.

The possible solutions to combat these concerns listed in the report include optimizing processors to serve only one particular task, which would increase their productivity and speed, according to Stoica.


EECS and statistics campus professor Michael Jordan commented on the relevance of the report’s conclusions, stating that RISELab is part of a worldwide effort to improve AI development.

“We’re gonna see examples of artificial intelligence in all kinds of products, and without proper care, we would be compromising personal security,” Patterson said. “We should build things that work well, that people can enjoy and that are secure.”


    Meanwhile at Berkeley...


    https://rise.cs.berkeley.edu/


    RISELab Kicks Off

    Berkeley’s computer science division has an ongoing tradition of 5-year collaborative research labs. In the fall of 2016 we closed out the most recent of the series: the AMPLab. We think it was a pretty big deal, and many agreed.

    One great thing about Berkeley is the endless supply of energy and ideas that flows through the place — always bringing changes, building on what came before. In that spirit, we’re fired up to announce the Berkeley RISELab, where we will focus intensely for five years on systems that provide Real-time Intelligence with Secure Execution.

    Context
    RISELab represents the next chapter in the ongoing story of data-intensive systems at Berkeley; a proactive step to move beyond Big Data analytics into a more immersive world. The RISE agenda begins by recognizing that there are big changes afoot:

    1. Sensors are everywhere. We carry them in our pockets, we embed them in our homes, we pass them on the street. Our world will be quantified, in fine detail, in real time.
    2. AI is for real. Big data and cheap compute finally made some of the big ideas of AI a practical reality. There’s a ton more to be done, but learning and prediction are now practical tools in the computing toolbox.
    3. The world is programmable. Our vehicles, houses, workplaces and medical devices are increasingly networked and programmable. The effects of computation are extending to include our homes, cities, airspace, and bloodstreams.
    In short, the loop between data generation, computation, and actuation is closing. And this is no longer a niche scenario: it’s going to be a standard mode of technology going forward.

    Mission
    Our mission in the RISELab is to develop technologies that enable applications to interact intelligently and securely with their environment in real time.

    As in previous labs, we’re all in — working on everything from basic research to software development, all in the Berkeley tradition of open publication and open source software. We’ll use this space to lay out our ideas and progress as we go.

    Sponsors
    A final note: we’re extremely fortunate at Berkeley to be supported by — and working with — some of the world’s biggest and most innovative companies. The RISELab’s 11 founding sponsors are quite the crew: Amazon Web Services, Ant Financial, Capital One, Ericsson, GE Digital, Google, Huawei, IBM, Intel, Microsoft Research and VMware. Thanks to all.

    We RISE.

    — Ion, Joe, Joey, Raluca and team

    FEATURED PROJECT
    Clipper
    Machine learning is being deployed in a growing number of applications which demand real-time, accurate, and robust predictions under heavy query load. However, most machine learning frameworks and systems only address model training and not deployment.

    Clipper is a general-purpose low-latency prediction serving system. Interposed between end-user applications and a wide range of machine learning frameworks, Clipper introduces a modular architecture to simplify model deployment across frameworks. Furthermore, by introducing caching, batching, and adaptive model selection techniques, Clipper reduces prediction latency and improves prediction throughput, accuracy, and robustness without modifying the underlying machine learning frameworks.
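
As a rough illustration of the caching and batching ideas described above (this is a concept sketch only, not Clipper's actual API):

```python
# Illustrative sketch (not Clipper's real interface) of two prediction-serving
# ideas: caching repeated queries and batching cache misses into one model call.
from typing import Callable, Dict, Hashable, List, Sequence

class PredictionServer:
    def __init__(self, model_fn: Callable[[Sequence[Hashable]], List[float]]):
        self.model_fn = model_fn                  # any framework's batch-predict function
        self.cache: Dict[Hashable, float] = {}

    def predict_batch(self, queries: Sequence[Hashable]) -> List[float]:
        misses = [q for q in queries if q not in self.cache]
        if misses:
            # One batched call to the underlying framework covers all misses.
            for q, y in zip(misses, self.model_fn(misses)):
                self.cache[q] = y
        return [self.cache[q] for q in queries]

# Usage with a stand-in "model" whose score is just the query string's length.
server = PredictionServer(lambda batch: [float(len(q)) for q in batch])
print(server.predict_batch(["cat.jpg", "dog.jpg", "cat.jpg"]))   # [7.0, 7.0, 7.0]
```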
    RISE Camp 2017
    November 2017

    Welcome to our Fall 2017 Newsletter! This is the place in which we keep you informed about our recent progress.


    We had a great RISECamp at the beginning of the semester! Here is the web page, and here are the videos that we feel show the enormous success of the camp. People seemed to have a great experience learning in real time about what we are doing. You may ask yourself: How does one actually put on such an event? Making this happen smoothly and seamlessly can be quite complex, so to that end, Jey Kottalam has put together this nice blog describing some of the technical issues that went on behind the scenes.

    Two other items are of particular interest: one is that the RISELab faculty have put together a vision paper, which should give you an idea of some of our thoughts going forward, and the other is that RISELab faculty Michael Mahoney and Michael Jordan, along with statistics professors Bin Yu and Fernando Perez and EECS professor Richard Karp, were awarded an NSF grant to create a Foundations of Data Analysis (FODA) Institute here at UC Berkeley. Details can be found here. The FODA Institute will focus on theoretical foundations of data science at the intersection of computer science theory, statistics theory, and applied mathematics. Having a mix of foundational theory inspired by and leading to real applications and implementations is one of the great aspects of the RISELab! Thank you for your continued support.

    Professor Michael Mahoney
    RISELab Faculty


  Notice the mathematics. AI is more math.