Thursday, November 5, 2015

tiers of sensor intelligence

Discussion in 'Main Forum' started by jfieb, Oct 29, 2015.

  1. A long-term project; it can be skipped as a digression.

    Rick's Churchill quote.


    “Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.” -Winston Churchill

    A great quote that works very well. Consider sensor fusion intelligence.


    I have used the rice terraces as a mental model for several years now, but the same quote applies.
    Rick's nice statement in the software segment speaks of the margins on software IP and how good they are.

    We have an MCU now above the FFE. QUIK will use it to the MAX for intelligence.

    Consider that Sensory is ahead of the pack because of one major decision they made years ago:
    keep audio local, deeply embedded on the device, so it becomes a UI even if you have no connection.
    That meant they focused on power well before most others. It is a great fit: QUIK and Sensory.


    On math units... they are not equal. I won't go into all the headache-inducing details, but a good year of reading told me the following....

    Kalman filters (a kind of math) are the basis of sensor fusion. To be low power you need a math unit, and so most, but not all, sensor fusion hubs use the M4 for its floating-point math unit......BUT Kalman filters, in an ideal implementation, use a FIXED-point math unit. So even though no one told me, the technology told me that the FFE is a fixed-point unit made for running Kalman filters, BETTER than an M4's floating point.

    The EOS then has 2 - count 'em - math units: one for basic fusion in the FFE, and the FP unit in the MCU.
    A run-of-the-mill MCU has the floating point only, not 2 math units as the EOS must have.
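    To make the fixed-point idea concrete, here is a toy 1-D Kalman filter written entirely in Q16.16 integer arithmetic. This is a sketch of the general technique only, not QUIK's FFE code (the FFE's actual word widths and firmware are not public), but it shows why an integer multiply unit is all a Kalman update really needs:

    ```java
    // Toy 1-D Kalman filter in Q16.16 fixed-point: no floating-point unit required.
    // Illustrative only; noise constants q and r are made-up values.
    public class FixedPointKalman {
        public static final int ONE = 1 << 16;                // 1.0 in Q16.16

        public static int toFix(double v) { return (int) Math.round(v * ONE); }
        public static double toDouble(int v) { return (double) v / ONE; }

        // Fixed-point multiply/divide using a 64-bit intermediate to avoid overflow.
        public static int mul(int a, int b) { return (int) (((long) a * b) >> 16); }
        public static int div(int a, int b) { return (int) (((long) a << 16) / b); }

        public int x = 0;                       // state estimate
        public int p = toFix(1.0);              // estimate variance
        public final int q = toFix(0.001);      // process noise (assumed)
        public final int r = toFix(0.1);        // measurement noise (assumed)

        // One predict + correct step for a new measurement z (Q16.16).
        public int update(int z) {
            p += q;                             // predict: variance grows
            int k = div(p, p + r);              // Kalman gain
            x += mul(k, z - x);                 // correct toward measurement
            p = mul(ONE - k, p);                // variance shrinks
            return x;
        }

        public static void main(String[] args) {
            FixedPointKalman kf = new FixedPointKalman();
            int est = 0;
            for (int i = 0; i < 50; i++) est = kf.update(toFix(5.0));
            System.out.println(toDouble(est));  // converges toward 5.0
        }
    }
    ```

    Every operation above is an integer add, shift, multiply, or divide, which is the point: a well-scaled fixed-point unit can run this update loop without ever touching floating point.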


    To maintain margins QUIK will layer on intelligence just like the rice terraces. So this thread will be my efforts to explore what these layers of intelligence might look like.....

    This is just to plant the seed that the roadmap will be very interesting, as it is the beginning of intelligence above the fusion. Expect that QUIK is ON this already. A lot will be done in the cloud, but Audience has shown us that there can be real benefit from intelligence that still resides ON the device.

    I will be looking for that- more intelligence that resides on the device, and am using the audio engine as a mental model.

    So if this is something that sounds interesting, read along.
    This is how I have fun as a part owner of the QUIK biz.
    If you find it gives you a furrowed brow, or causes a headache, it is not mandatory reading
    & can be skipped while waiting for what sounds like a very nice CES.
     
  2. jfieb

    QUIK will add layers of intelligence on top of what we have now.
    Sensory shows us the benefit of keeping some intelligence local.
     
  3. jfieb


    Sensory shows us the benefit of keeping some intelligence local


    Local machine learning.....on sensor data!

    I was uncertain if we could keep it local
    Android Marshmallow adds 'high fidelity sensor support' flag for developers
    Google is empowering developers with new tools to better deal with wonky sensors, which have long been a sore spot of Android fragmentation.

    According to the Android 6.0 Compatibility Definition Document, devices whose sensors are accurate to within very strict tolerances can set a new flag: android.hardware.sensor.hifi_sensors. Devices whose accelerometer, gyroscope, compass, barometers, step detectors, etc. all deliver data with high accuracy and broad range must set this flag. This is a boon for developers, who can look for a single value and know it can rely on the sensor data being accurate (or at least, put up a warning message to users that their device might deliver a sub-par experience).

    Currently, developers can look at various flags to determine if a device has a particular sensor or not, but they have no way of knowing if it delivers precise, low-latency data.

    The Android compatibility document also lays out power requirements so hardware manufacturers can build in sensors that will work as Google intends. Devices don't have to meet the new requirements; they're entirely optional. But the existence of a standardized way to tell developers they can rely on sensor data being accurate and fast will be a big help in cutting down one of the prominent pain points in Android fragmentation.

    Why this matters: One of the reasons certain apps appear first, or only, on iOS is because iPhones deliver very consistent sensor data from one model to the next. Android phones, in being so diverse, often give wildly different results. What's more, developers can't rely on the results being delivered in a timely, low-latency fashion. Google is making it easier, if optional, for developers to know if a device provides high-quality, reliable, fast sensor data. In time, you may start seeing applications that are especially sensor-dependent (like running trackers or motion-sensitive games) throw up a warning on devices that don't set the android.hardware.sensor.hifi_sensors flag.
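    The developer-side check the article describes is a one-liner on a real device: PackageManager.hasSystemFeature(PackageManager.FEATURE_HIFI_SENSORS). The sketch below mocks the device's declared feature list with a plain Map so it runs outside Android; the gating logic is the same:

    ```java
    import java.util.HashMap;
    import java.util.Map;

    // Sketch of gating a sensor-heavy feature on the Marshmallow hifi flag.
    // On an actual Android device you would instead call:
    //   getPackageManager().hasSystemFeature(PackageManager.FEATURE_HIFI_SENSORS)
    // Here a Map stands in for the device's declared system features.
    public class HifiSensorCheck {
        public static final String FEATURE_HIFI_SENSORS =
                "android.hardware.sensor.hifi_sensors";

        public static boolean supportsHifiSensors(Map<String, Boolean> deviceFeatures) {
            return deviceFeatures.getOrDefault(FEATURE_HIFI_SENSORS, false);
        }

        public static void main(String[] args) {
            Map<String, Boolean> features = new HashMap<>();
            features.put(FEATURE_HIFI_SENSORS, true);   // device meets the CDD tolerances
            if (supportsHifiSensors(features)) {
                System.out.println("High-fidelity sensor data available");
            } else {
                System.out.println("Warn user: sensor data may be sub-par");
            }
        }
    }
    ```

    One string, one boolean: that is the whole contract, which is exactly why it cuts through the fragmentation problem the article describes.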
     
  4. jfieb


    Can be skipped, but I want a solid foundation.
    For what, some ask?

    For EOS 2. The intelligence goes up


    STANFORD UNIVERSITY
    Machine Learning

    About this Course
    Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it. Many researchers also think it is the best way to make progress towards human-level AI. In this class, you will learn about the most effective machine learning techniques, and gain practice implementing them and getting them to work for yourself. More importantly, you'll learn about not only the theoretical underpinnings of learning, but also gain the practical know-how needed to quickly and powerfully apply these techniques to new problems. Finally, you'll learn about some of Silicon Valley's best practices in innovation as it pertains to machine learning and AI. This course provides a broad introduction to machine learning, datamining, and statistical pattern recognition. Topics include: (i) Supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks). (ii) Unsupervised learning (clustering, dimensionality reduction, recommender systems, deep learning). (iii) Best practices in machine learning (bias/variance theory; innovation process in machine learning and AI). The course will also draw from numerous case studies and applications, so that you'll also learn how to apply learning algorithms to building smart robots (perception, control), text understanding (web search, anti-spam), computer vision, medical informatics, audio, database mining, and other areas.

    In what form will local (deeply embedded) learning occur on a mobile device?

    Sensory shows us the benefit of keeping some intelligence local

    + Dr. Saxe snip of text.

    What's good?

    Machine learning will be a margin-maintaining adjacent possibility. It's a room that QUIK gets to explore, starting now in quiet mode, with some things to read in the coming months.
     
  5. jfieb


    Meanwhile, later this month...

    “An Ounce of Preprocessing Is Worth a Pound of Computing”

    Wearable Sensors and Electronics 2015
    Annual Conference and Exhibition
    November 16 – 17, 2015
    Santa Clara, California



    This unique event has an exclusive focus on sensors and electronics for wearable applications. Following the PC and smartphone waves of development, wearables are the "next big thing". And what makes various wearable designs unique is the ability to pack advanced sensors and electronics into very small form factors. Sensors are truly the enabling technology for these applications -- wearables simply cannot exist without sensors.

    So what are the next-generation wearable sensing technologies? And what are the applications that are driving the need for new types of sensors? These are the key questions that will drive the discussion at this second annual conference, Wearable Sensors and Electronics 2015, which will feature talks from the leading wearable sensors and electronics technology experts. The future is bright for wearables -- attend this event to identify emerging technology and application trends, exchange ideas, form new companies, and network with your industry peers!

    Conference Topics

    • Worldwide wearable device trends: market drivers, demographic factors, emerging markets and applications, disruptive technologies, government policy effects.
    • Business aspects: competitive forces and dynamics, pricing trends, mergers and acquisitions, analyst forecasts and projections, manufacturing developments, technology transfer, regulatory compliance, ecosystems and hubs, company formation.
    • Technology trends and developments: wearable device architecture, sensor hubs, ultra-low power systems and components, energy harvesting, micro batteries and energy storage, supercapacitors, sensor fusion, software algorithms, context awareness, virtual sensors, connectivity with smartphones.
    No wonder QUIK is going to be there.
    • Emerging applications: digital health, body area networks, medical diagnostics and screening, genomics, safety and security, environmental, virtual reality, indoor navigation, quantified self, usage paid insurance.
    • Emerging manufacturing techniques and materials: flexible and printed electronics, smart glass, streamlined assembly techniques.
    • Emerging types of sensors: touch, pressure, thermal, radiation, humidity, chemical, high- performance image and IR, air and pollution, magnetic, water, radar, high performance inertial, high performance microphones, microphone arrays.
    • Emerging types of actuators: high performance micro speakers, optical zoom, micro shutters, energy harvesters.
    Who Should Attend
    • CEOs
    • CTOs
    • Entrepreneurs
    • VPs of engineering
    • VPs of marketing
    • VPs of business development
    • MEMS foundry managers
    • Product managers and engineers
    • Marketing, sales and business development managers
    • MEMS and sensors production managers and engineers
    • MEMS and sensors design managers and engineers
    • Business development professionals
    • Industry analysts
    • Investment bankers
    • MEMS and sensors consultants
    • Media representatives
