Saturday, August 30, 2014

A snip from the Mouser blog

With off-the-shelf sensor fusion and sensor hub chips, it is now possible to efficiently interface to a variety of digital sensors, as well as other pathways. The burden of creating your own algorithms has been eliminated.

a snip from QUIK 10Q filed June 29



The most recent example of a Catalog CSSP, announced during early 2014, was an Android KitKat compatible, context aware, ultra-low power sensor hub. We are placing a greater emphasis on developing and marketing Catalog CSSPs in the future.


So the S2 catalog items will be coming, and what shape will they take?

1.  10 axis fusion for Indoor location.

2.  More context and gesture.

3. Some surprises.

Does this mean anything is wrong with the Tier 1 in-house algos?  No, they are separate sorts of things; Brian Faith spoke to this in the last cc.  I read somewhere that an MCU company may have 150-200 different MCU types for various segments of the market.

QUIK's S2 should allow some sensor fusion to happen that was not possible on the S1, and we will find out what the catalog offers.


Mouser has a nice read that I will take a look at.  From the Mouser blog



Sensor Fusion Comes of Age

By Morrie Goldman, Mouser Electronics 

Over time, sensors have morphed from simple analog and mechanical constructs to chip-based digital devices that connect to a machine to monitor the machine’s health as well as environmental conditions. Similarly, sensor fusion — multiple types of sensors working together to solve a problem — has combined the threads of many other technologies to create something very new and exciting.
The idea of using a computational device to sort out the data from multiple sensors and combine information to draw a conclusion has been around since at least the 1950s. But it was exceptionally difficult to do. Around 1960, several mathematicians developed sets of algorithms in an effort to have a machine draw a conclusion based on input from multiple sensors. These filters also removed meaningless data from noise or other sources. Of course, it wasn’t long before the military decided that this technology would be useful in their applications. Being able to process inputs from multiple sources and compare it with stored data would allow the military to better track and identify potential airborne targets and even compute the certainty of the results. With better computers and sensors, the technology was advancing, but there were still complex and expensive problems to solve.


Commentary: he is referring to Kalman filters here.
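To make the Kalman idea concrete, here is a minimal one-dimensional sketch in Python. The numbers are toy values I made up, not any production filter, but it shows the core loop the article describes: predict, weigh the new measurement against your uncertainty, update.

```python
# Minimal 1-D Kalman filter sketch: fuse a stream of noisy readings of a
# roughly constant quantity into one estimate. Toy values throughout.

def kalman_1d(measurements, meas_var, process_var=1e-4):
    """Return the filtered estimate after each measurement."""
    x, p = measurements[0], 1.0   # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += process_var          # predict: uncertainty grows a little
        k = p / (p + meas_var)    # Kalman gain: how much to trust the data
        x += k * (z - x)          # update the estimate toward the measurement
        p *= (1 - k)              # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

readings = [10.2, 9.8, 10.1, 9.9, 10.3, 10.0]
est = kalman_1d(readings, meas_var=0.5)
print(round(est[-1], 2))  # settles near the underlying value of ~10
```

The same predict/update structure, generalized to vectors and matrices, is what sits underneath multi-sensor fusion.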

Potential Applications

When the microprocessor first became available, people described it as a solution in search of problems. The same case can be made for sensor fusion. If you have the power and intelligence to monitor multiple sensors, analyze the data in real time, and either provide a simple direction or control an action, heretofore-unthought-of applications can be almost limitless. The following examples just scratch the surface:
  • Health monitoring — including healthy athletics, patient monitoring, and research
  • Monitoring the elderly — wellness monitoring to reduce the burden of staffing
  • Automotive, transportation systems — monitoring and controlling efficiency and safety functions
  • Public safety — identification of potential hazardous conditions with much greater accuracy than simple fire and security systems
  • Entertainment — gaming, including controllers and virtual reality headsets
  • Weather — intelligent weather forecasting stations, that not only warn of changing conditions, but control systems to prepare for a storm (for example, closing storm shutters, closing valves, etc.)
  • HVAC/Air Quality — intelligent control of room temperature, humidity, air quality, system maintenance, etc.
While all of these types of capabilities have existed in some form for many years, the ability of a system to observe multiple sensors and come to an intelligent conclusion, and even initiate action, is revolutionary.
Figure 1: Activity and other health monitors were among the first consumer products to embrace sensor fusion technology.
A Convergence of Technologies

Fortunately, as they have done in other areas of electronics, a number of IC manufacturers have taken on the task of doing the heavy lifting. With off-the-shelf sensor fusion and sensor hub chips, it is now possible to efficiently interface to a variety of digital sensors, as well as other pathways. The burden of creating your own algorithms has been eliminated.
While their terminology does vary a bit, a number of IC manufacturers have either adapted existing lines of products or created entirely new ones to tackle sensor fusion tasks. The processing is done by a specialized controller chip, which may be identified as an MCU, a sensor hub or a sensor fusion processor. We are already seeing this technology applied in the consumer market in smart phones, activity monitors and other devices.
The latest generation of smart phones from Apple, Samsung, and others contain powerful and diverse sensing capabilities, even without the need for external interfacing. These include a three-axis magnetometer, a three-axis accelerometer, and a three-axis gyroscope. This combined capability is often referred to as 9-DoF, nine degrees of freedom.
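As an illustration of why those sensor types complement each other, here is a toy complementary-filter sketch in Python (made-up data, not any vendor's fusion code): the gyro gives a fast but drifty angle, the accelerometer a noisy but drift-free one, and blending the two yields a stable estimate.

```python
# Illustrative complementary-filter sketch (made-up data, not any
# vendor's fusion code): blend a gyro rate (fast but drifty) with an
# accelerometer tilt angle (noisy but drift-free) to estimate pitch.

def complementary_pitch(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """gyro_rates in deg/s; accel_angles in deg; returns fused pitch in deg."""
    angle = accel_angles[0]
    for rate, acc in zip(gyro_rates, accel_angles):
        # integrate the gyro for the fast part, lean on the
        # accelerometer for the slow, drift-free part
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
    return angle

# pretend the device is held steady at ~30 degrees of pitch
gyro = [0.0] * 200
accel = [30.0] * 200
print(round(complementary_pitch(gyro, accel), 1))  # stays pinned near 30
```

Add a magnetometer for heading and you have the 9-DoF picture the article describes.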
For the most part, these functions are “always on” in a cell phone. If the processing of data from these sensors was managed by the phone’s central microcontroller, battery life would be significantly shortened. Instead, highly efficient dedicated MCU chips process the data as sensor hubs, using a fraction of the power. The NXP ARM M3 series of MCUs is one example. According to Chipworks, a product teardown specialist, as reported by EETimes, Apple uses a customized version of the NXP chip to monitor its sensors in the iPhone 5S. “The M7 controls functions from a variety of discrete sensors including a gyroscope, an accelerometer, and a compass.” Samsung takes on the same task with a microcontroller from Atmel, the Core 8-bit AVR MCU.
With such powerful on-board sensing technology, apps are appearing that take advantage of the 9-DoF cell phone hardware to provide health and activity monitoring, or to function in concert with GPS and external data to provide even more information for the user. Now, add to those already-diverse sensor inputs data from an external device that communicates via Bluetooth and the capabilities seem limitless. The goal of the chip manufacturers is to make it practical for engineers to design systems that provide real-time sensor data which can be used to provide the desired contextual awareness with minimal power consumption and maximum battery life. Beyond smart phones, highly optimized solutions can address such applications as tablets, Ultrabooks, IoT-enabled devices, gaming, healthcare, environmental monitoring, and wearable computing.
Development boards are available that allow design engineers to easily get their feet wet in this technology. One such example is the ATAVRSBIN2 by Atmel. Atmel has embraced sensor fusion with a wide variety of products, which they call “the Complete Sensor Ecosystem.” Atmel identified that the simultaneous analysis and fusion of data from different sensors and sensor types was not a task it could handle solo. To get past these complexities, the company partnered with a number of leading sensor manufacturers and sensor fusion specialists to provide a complete, easy-to-implement Sensor Hub Solution.


Commentary:  We want something just about like this, in the mix and match store.
A current trend combines an MCU with three or more MEMS sensors in a single package. One example is STMicroelectronics’ LIS331EB, which combines a high-precision three-axis, digital accelerometer with a microcontroller in a single 3 x 3 x 1 mm-package. The microcontroller is an ultra-low-power ARM Cortex-M0, with 64-Kbyte Flash, 128-Kbyte RAM, embedded timers, 2x I²C (master/slave) and SPI (master/slave). The LIS331EB can also internally process data sensed by external sensors (for a total of nine), such as for gyroscope, magnetometer, and pressure sensors. Functioning as a sensor hub, it fuses together all inputs with the iNEMO Engine software. STMicroelectronics’ iNEMO engine sensor fusion software suite applies a set of adaptive prediction and filtering algorithms to make sense of (or fuse) the complex information coming from multiple sensors.
Freescale also offers a product line of devices that combine MCUs and sensors in a single package. Their FXLC95000 Xtrinsic Motion-Sensing Platform integrates a MEMS accelerometer and a 32-bit ColdFire MCU. Similar to the STMicroelectronics device, the FXLC95000 can simultaneously manage data from internal and external sensors. Freescale was the first company to market an MCU with a sensing hub embedded that is also programmable for customer-specific applications and algorithms. Up to 16 sensor inputs can be managed by a single device, allowing calibration, compensation and sensor functions to be offloaded from the application processor. It functions with either Freescale or third-party drivers.
Figure 2: Freescale’s Xtrinsic FXLC9500 32-bit MCU Sensor Fusion Hub with Accelerometer enables scalable, autonomous, high precision multi sensor hub solutions with local compute and sensors management in an open architecture.
Other manufacturers who are well-entrenched include Bosch, Fairchild, Honeywell, MicroChip, and TI.

Fusion Meets the Cloud

While quite a lot of functionality can be achieved at a local level, interaction with the Cloud is where the fun really begins. Remote sensor data can be processed by a sensor fusion device and sent to the Cloud for recording, further analysis, or even to order an action.
For example, an unattended pump operating in a remote location is always at some risk of failing. A few years ago, a remote sensor may have been in place to identify if it were running hot or had even failed. Now, the same pump can also be monitored for vibration, exhaust chemistry, bearing noise, and the external conditions around it. A predetermined program could empower the sensor fusion controller to shut down the pump or even cycle its operation until a technician can arrive. The system would also know in advance whether it is likely that the entire pump must be replaced or just a component. Here, a sensor fusion solution could eliminate downtime as well as costly emergency service calls, and even collect data to analyze how well the pump is working over time. The same general idea applies to monitoring an aircraft engine in flight, a building elevator, or just about anything mechanical.
Another application of the Cloud is for the sensor fusion to take place there, instead of on site. With open-source sensor fusion software available, individual sensor data can be transmitted to a server, where the processing would take place.

Conclusion

Sensor fusion is a technology that has come of age, and at just the right time to take advantage of developments in sensors, wireless communication, and other technologies. Once out of reach of all but the most advanced government labs, the technology is now available off-the-shelf, at prices that even fit into the BOM budget for many consumer products.
Now closely linked to mobile technology and the rapid development of lower-cost digital sensors, sensor fusion is poised for explosive growth. For the design engineer, it is a good time to apply some creative thinking and to start experimenting!
Morrie Goldman is a veteran electronics industry marketer and technical writer based in the Chicago area. A former editor and writer of electronic text content for a major educational company, he has also authored numerous articles for trade publications and websites. He holds an Advanced Class amateur radio license and has had military electronics training.



Very impressive blog that Mr. Goldman wrote; keep the adjacent possible in your mind.  This is the coral reef that QUIK has put itself into.  I can't wait to read about the S2.

So a question: will QUIK ever have a sensor partner they work with for a one-chip sensor + fusion solution, say 10-axis indoor location, all of it together and integrated?


Part of the title of this blog is based upon the Steven Johnson book entitled "Where Good Ideas Come From," where one of the chapters is entitled "The Adjacent Possible."  There are many other blogs that speak to his ideas, and I will take a snip from one....

It's the best book I have read in several years, as far as being very interesting, and it explains an awful lot of what is going on in MEMS, sensors, algorithms, and the evolution of mobile devices......

http://www.theguardian.com/science/2010/oct/19/steven-johnson-good-ideas


At the core of his alternative history is the notion of the "adjacent possible", one of those ideas that seems, at first, like common sense, then gradually reveals itself as an entirely new way of looking at almost everything. Coined by the biologist Stuart Kauffman, it refers to the fact that at any given time – in science and technology, but perhaps also in culture and politics – only certain kinds of next steps are feasible. "The history of cultural progress," Johnson writes, "is, almost without exception, a story of one door leading to another door, exploring the palace one room at a time."
Think of playing chess: at any point in the game, several ingenious moves may be possible, but countless others won't be. Likewise with inventions: the printing press was only possible – and perhaps only thinkable – once moveable type, paper and ink all existed. YouTube, when it was launched in 2005, was a brilliant idea; had it been launched in 1995, before broadband and cheap video cameras were widespread, it would have been a terrible one. Or take culture: to 1950s viewers, Johnson argues, complex TV shows such as Lost or The Wire would have been borderline incomprehensible, like some kind of avant-garde art, because certain ways of engaging with the medium hadn't yet been learned. And all this applies, too, to the most basic innovation: life itself. At some point, back in the primordial soup, a bunch of fatty acids gave rise to a cell membrane, which made possible the simplest organisms, and so on. What those acids couldn't do was spontaneously form into a fish, or a mouse: it wasn't part of their adjacent possible.
If this seems completely obvious, consider, Johnson says, how it explains the otherwise spooky phenomenon of the "multiple" – the way certain inventions or discoveries occur in several places simultaneously, apparently by chance. Sun-spots were discovered in 1611 by four different scientists in four different countries; electrical batteries were invented twice, separately, one year apart. (Similar things happened in the earliest days of the steam engine and telephone.) People have tried to explain this using vague terms such as the "zeitgeist", or of certain ideas just being "in the air". But there's a simpler possibility, which is that the innovation in question had simply become part of the adjacent possible. Good ideas, as Johnson puts it, "are built out of a collection of existing parts", both literally and metaphorically speaking. Take the isolation of oxygen as a component of air, which was another multiple. It couldn't have happened before the invention of ultra-sensitive weighing scales. But it also couldn't have happened before the birth of the idea that air is something, rather than nothing, and that it might be made up of gases.
What all this means, in practical terms, is that the best way to encourage (or to have) new ideas isn't to fetishise the "spark of genius", to retreat to a mountain cabin in order to "be creative", or to blabber interminably about "blue-sky", "out-of-the-box" thinking. Rather, it's to expand the range of your possible next moves – the perimeter of your potential – by exposing yourself to as much serendipity, as much argument and conversation, as many rival and related ideas as possible; to borrow, to repurpose, to recombine. This is one way of explaining the creativity generated by cities, by Europe's 17th-century coffee-houses, and by the internet. Good ideas happen in networks; in one rather brain-bending sense, you could even say that "good ideas are networks". Or as Johnson also puts it: "Chance favours the connected mind."



After reading this I came to the conclusion that QUIK, via its work in secret to make the S1, has moved itself into a coral reef of creativity, where they have some of the bits and pieces and are in the milieu of the adjacent possible.  How to track to see if it's true:

1.  The news items of the devices we are in and any phrasing around that.

2.  Involvement in really ground-breaking devices that may not have huge volume, say a Thalmic wrist band.  Any hints of beta work in think tanks, like Google Nest.

3.  The news items of the algo work and what they enable.

4.  General GEEK noise/serious talk of always on and context aware; QUIK's success IS linked to this.


Internet of Things Journal


MEMS Industry Group Hosts 10th Annual MEMS Executive Congress US

Tech Business Leaders Explore MEMS and Sensors in Consumer, Environmental, Healthcare/Medical, Industrial Apps
PITTSBURGH, PA -- (Marketwired) -- 08/28/14 -- MEMS Industry Group (MIG) will host MEMS Executive Congress® US 2014, the annual business conference and networking event for the MEMS and sensors industry, November 5-7, 2014 in Scottsdale, AZ.
Spanning environmental sensors for safe drinking water to personalized healthcare, cybersecurity for connected systems, and spectral imaging in wearable devices, micro-electromechanical systems (MEMS) and sensors enable human-machine interactions in unprecedented ways.
"This year's MEMS Executive Congress US speakers reflect some of the most fascinating uses of MEMS and sensors in commercial applications," said Karen Lightman, executive director, MEMS Industry Group. "Ayasdi is tracking disease via smartphone. MoboSens uses a smartphone sensor to monitor nitrate in drinking water. VTT's microspectrometers are likely to be used for skin cancer analysis, among many other uses. GE is advancing MRI. Wurldtech's technology protects critical infrastructure for oil and gas, smart grid and medical devices. And those are just a few of our speakers who will enlighten the C-level audience about maximizing the potential of MEMS and sensors -- the theme of this year's MEMS Executive Congress. Equally exciting, Congress attendees will also hear from entrepreneurs, academic innovators and one of the world's largest foundries, TSMC."
Keynotes
  • "The Next Big Thing" -- opening keynote by George Liu, director, Taiwan Semiconductor Manufacturing Company (TSMC)
  • "Getting to a Trillion Sensors: Why MEMS Engineers Have to Become Ecosystem Co-creators" -- closing keynote by Francis Gouillart, CEO, Experience Co-Creation Partnership
Featured Speakers
  • Opening and Closing Remarks -- Karen Lightman, executive director, MEMS Industry Group
  • "Managing the Challenges of Introducing MEMS into a Healthcare Product Line" -- a featured presentation by Tim Nustad, general manager and CTO, MR Healthcare Systems
  • "The Paradigm Shift of the Internet of Things and its Impact on and in the MEMS and Sensors Industry" -- a featured presentation by Chris Wasden, professor of innovation & executive director, Sorenson Center for Discovery and Innovation at University of Utah; and former managing director of global healthcare innovation, PricewaterhouseCoopers
  • "Tracking the State of Disease Using Mobile Devices" -- a featured presentation by Pek Lum, CTO, Ayasdi
  • "A View of the MEMS Industry in 2014: Technologies, Trends and Talent" -- a featured presentation by Tom Kenny, professor of mechanical engineering, Stanford University
  • "MoboSens -Smartphone Micro Sensor for Accurate and Quantitative Environmental Water Analysis" -- a featured presentation by Manas Gartia, scientific research officer, MoboSens
  • "Security Imperative: How to Build Stronger Security Requirements into the Design Phase of Product Development" -- a featured presentation by Nate Kube, CTO, Wurldtech
  • "MEMS Microspectrometers -- Powering New Sensing Applications" -- a featured presentation by Anna Rissanen, research team leader, MOEMS and BioMEMS Instruments, VTT Technical Research Centre of Finland
Panels
  • "How MEMS Impacts the Healthcare Ecosystem and Enables Personalized Health" -- a panel discussion moderated by Mark Winter, CEO, CareSpan, with panelists:
    • Dr. Anita Goel, MD, PhD, chairman and CEO, Nanobiosym; physicist, Harvard-MIT MD-PHD
    • Stephen Whalley, chief strategy officer, MEMS Industry Group, and former director of sensors, Intel Corporation
  • "MEMS Market Spotlight," a panel discussion with:
    • JĂ©rĂ©mie Bouchaud, senior principal analyst, MEMS & Sensors and Industrial Electronics, IHS
    • Jean-Christophe Eloy, CEO and president, Yole DĂ©veloppement
Featured Events

For the third year in a row, MIG's popular MEMS and Sensors Technology Showcase will give Congress attendees an up-close and personal experience with some of the most compelling MEMS- and sensors-enabled products ever invented.
MIG's second annual Elevator Pitch Session will give early-stage MEMS and sensors companies a platform for reaching potential investors.
MIG's annual Best in MEMS and Sensors Innovation Awards will celebrate outstanding achievements in the MEMS and sensors industry.
For the complete agenda, visit: http://us2014.memscongress.com/agenda/.

Friday, August 29, 2014

My homework for today?

A little knowledge of the M4 and sensor fusion...


Diya Soubra is a CPU Product Marketing Manager for Cortex-M ARM Processors at ARM. He has 20 years of experience in the semiconductor industry, during which he held various positions in engineering, product marketing and business management. Just prior to joining ARM, he worked with hi tech start-ups to develop their business. Before that he was in charge of product marketing for devices for VoIP and broadband gateways. He has also developed various software and hardware products for communication protocols and infrastructure systems while working for Rockwell Semiconductor, Conexant Systems and then Mindspeed Technologies. He received a B.S. in Electrical Engineering from the University of Nebraska at Lincoln, a Master of Science in Engineering from the University of Texas at Austin and his MBA from the Edinburgh Business School. He holds 1 patent.


Samsung's Gear 2?

Yup, it was an ARM M4. It must just burn the battery, but they must want some math unit to run some serious algos on it?



Energy efficient digital signal control

The Cortex-M4 processor has been designed with a large variety of highly efficient signal processing features applicable to digital signal control markets.  The Cortex-M4 processor features extended single-cycle multiply accumulate (MAC) instructions, optimized SIMD arithmetic, saturating arithmetic instructions and an optional single precision Floating Point Unit (FPU). 

Commentary: QUIK wants expertise in both fixed point AND floating point.  So the FFE is the fixed-point unit (?) and the floating-point one will come with the M4 part of the SoC, is my conclusion.
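To illustrate the fixed-vs-floating distinction (my own sketch of the concept, not QUIK's actual FFE design), here is the same dot product done in Q15 fixed point, the style an integer-only block would use, and in floating point, as the M4's FPU would do it:

```python
# Same multiply-accumulate two ways: Q15 fixed point (integer-only
# hardware style) vs. floating point. Values chosen for illustration.

def q15(x):
    """Convert a float in [-1, 1) to Q15 fixed point (16-bit int)."""
    return max(-32768, min(32767, int(round(x * 32768))))

def dot_q15(a, b):
    # integer multiply-accumulate; >> 15 rescales each Q30 product to Q15
    acc = 0
    for xa, xb in zip(a, b):
        acc += (q15(xa) * q15(xb)) >> 15
    return acc / 32768.0  # back to float only to compare results

def dot_float(a, b):
    return sum(xa * xb for xa, xb in zip(a, b))

a = [0.5, -0.25, 0.125, 0.75]
b = [0.5, 0.5, 0.5, -0.5]
print(dot_float(a, b))          # exact: -0.1875
print(round(dot_q15(a, b), 4))  # agrees to within Q15 quantization error
```

The fixed-point path needs only integer adds, multiplies, and shifts, which is why an FFE-style engine can run it so cheaply, while anything needing wide dynamic range wants the FPU.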


These features build upon the innovative technology that characterizes the ARM Cortex-M processor family.

Responsiveness and low power

In common with the other members of the Cortex-M family of processors, the Cortex-M4 has integrated sleep modes and optional state retention capabilities which enable high performance at a low level of power consumption. The processor executes the Thumb®-2 instruction set for optimal performance and code size, including hardware division, single cycle multiply, and bit-field manipulation. The Cortex-M4 Nested Vectored Interrupt Controller is highly configurable at design time to deliver up to 240 system interrupts with individual priorities, dynamic reprioritization and integrated system clock.   


QUIK's bits and pieces will allow it to be off most of the time, but still be always on and fusing data.  Other approaches will be the penumbral M4: a mic is left on, or one other sensor is left on, and they will call this "always on" (mic) also, but it won't know where you are or what you are doing.  There will be "always on" devices that don't know context and will not be aware in a useful sense.....

Easy-to-use technology

The Cortex-M4 makes signal processing algorithm development easy through an excellent ecosystem of software tools and the Cortex Microcontroller Software Interface Standard (CMSIS).


It's 32-bit.
Maximizes software reuse?

So if they do a beta on an MCU it will seem like what they know, very familiar?

Here is the ARM IoT page


From Sensor to Server

The Internet of Things (IoT) is the collection of billions of end devices, from the tiniest of ultra-efficient connected end nodes to high-performance gateways and cloud platforms, intelligently connected and interoperating with servers and services. The breadth and diversity of ARM's technology, from silicon IP to software IP, combined with its partnership approach and ecosystem, meet the needs of the rapidly evolving secured interconnectivity of IoT, and provide the quickest path to market with connected chips and platforms. ARM drives and simplifies current and future IoT applications and services to become truly ubiquitous and intelligent.


So, this is NOT so far away.  It must be the S3?  Does anyone think the S3 will NOT be the one with the ARM core?  Thanks in advance for any thoughts.


Connected objects for IoT pose a strong technical challenge with regards to power consumption.
The standard deployment scenario is a battery operated object that needs to last a few weeks if not months in the field without any service call or recharge.
Hence, unlike smart phones that need a recharge every day, or standard mobile phones that need a recharge every week, IoT objects have to operate for long periods without any recharge.

One idea that comes to mind is a variation on the big.LITTLE scheme.
In that scheme, demanding tasks run on the big processor, everything else runs on the little processor. Total power is reduced dramatically since most operations require only the little processor.

Assuming an IoT connected object with an RF block, a Cortex-M4 and a few sensors, we can achieve the same dramatic power reduction by changing the clock frequency of the processor.
For most of the time, the processor would run at low clock frequency to service the sensors. Once every few minutes or based on specific events, the processor would change the clock setting to switch into high performance mode in order to transmit secure packages over the RF link. Overall power consumption is dramatically reduced since the processor would run at low clock frequency most of the time.

This sounds simple in theory, but the design has to take into account the whole system design including all the buses and peripherals attached to the CPU. Asynchronous bridges may be required to create distinct clocks to protect peripherals. Also, the change over must be clean to retain the integrity of the clock signal.
I have not done a full system design on this idea yet but I am sure there are other considerations such as settling time for the PLL after a change in frequency.

Has anyone seen a product on the market yet with a Cortex-M device that uses this technique?
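A quick back-of-envelope sketch of that duty-cycling claim (all numbers invented for illustration, not from any datasheet): average current falls roughly in proportion to the fraction of time spent in the high-power mode.

```python
# Back-of-envelope duty-cycle arithmetic for the scheme described above.
# All current draws and times are made up for illustration.

def avg_current_ma(active_ma, sleep_ma, active_s, period_s):
    """Average current for an MCU that bursts active, then drops to low power."""
    duty = active_s / period_s
    return active_ma * duty + sleep_ma * (1 - duty)

# e.g. 10 mA while transmitting for 0.5 s, 0.05 mA otherwise,
# waking once every 5 minutes (300 s)
avg = avg_current_ma(active_ma=10.0, sleep_ma=0.05, active_s=0.5, period_s=300.0)
battery_mah = 220.0                       # roughly a small coin cell
hours = battery_mah / avg
print(round(avg, 4), round(hours / 24))   # average mA, runtime in days
```

The point of the arithmetic: with a sub-1% duty cycle, the sleep-mode floor, not the active burst, dominates battery life, which is exactly why an ultra-low-power always-on block in front of the M4 matters.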



So they are trying any way they can to make it work.  QUIK has the best bits and pieces for the needs of the IoT SoC?

That is the inference to gain subjective probabilities on?

QUIK had 7 jobs open and 5 of them were in software.  2 have been filled recently and any hint of what they were for will soon be gone.  Since this blog is NOT read very much I am free to discuss them a little.

Software test and release engineer

Primary role: to fix bugs that might come up in the future and get the fixes out.

I take this as a foreshadowing of good things, i.e., there are devices (referenced in the cc) that need to have good service provided to them.


and

Senior staff for SoC development.

They were to have a minimum of three ARM projects under their belts.
It will be an ARM M4 core above the FFE.

It will be a platform, i.e., the one for smartphones will be different in some ways from the one for wearables.


Most wearables are tethered so far, in that they use the smartphone's radios for the actual connection to the cloud, so we may not need it yet for the IoT SoC.

Security is discussed more and more now, and ARM has their TrustZone that we may use for encryption.
Good encryption HAS to be hardware.

5 jobs remain; 4 are in software.  One is for very advanced location, min. 10 yrs experience.

2 are for context and gesture... a few snips...

"...general purpose processor such as ARM M4."


It makes sense to me that they would go for the M4; others will try to save a little battery by using the M0+ to M3, but these have no math units and the savings are small, a LOT smaller than you would think.

The FFE will save the M4 power, and so it will sleep a LOT more than in others' approach to an M4.

The inference is that QUIK has the bits and pieces to make a much better tiered intelligent SoC than most others in the race.

These 2 require a min of 17 yrs of experience.

The remaining software job is for a software application engineer... 8 yrs min experience; they will help fix bugs also.


35 man-years will be added to the mix.

QUIK is allocating very well; several jobs are to fix software glitches after a device is released.  It supports the conclusion there will be devices out there fairly soon, in support of the cc.  The flexibility of the S1 will get put to use, in contrast to an ASIC, where some glitch might just require a subsystem get shut down?


In-house work on algos: my recurring question is always... who is in this ecosystem?  Are they good at WiFi?  Do they have GPS, or beacons?  We just don't do this one ALL alone.

It's margin, it's value, it's good learning from Apical: own your IP where you can.










Here is the text for the Sensor part of CES 2015...


See Me, Touch Me, Feel Me: 2015 International CES to Feature Latest Innovations in Sensor Technology

CEA to partner with MEMS Industry Group to introduce new gesture and motion technology focused marketplace

Arlington, VA – 07/31/2014 – The Consumer Electronics Association (CEA)® today announced the Sensors Marketplace, presented by the MEMS Industry Group, a showcase of the latest innovations in gesture and motion technology, at the 2015 International CES®. Owned and produced by CEA, the International CES is the world’s gathering place for all who thrive on the business of consumer technologies. The 2015 CES is scheduled to run January 6-9, 2015, in Las Vegas, Nevada.

The Sensors Marketplace will showcase leading innovation in the motion technology field, including MEMS, the smallest motion-sensing devices. The area will highlight gesture and motion recognition technology

QUIK will you be ready for this highlight? Hope so

, and applications in automotive, aerospace, medicine, robotics and more, that will demonstrate how sensors enhance the technology experience and the world around us. It will also contribute to the overall story of the Internet of Things (IoT) ecosystem at the 2015 CES. The Sensors Marketplace will be located at CES Tech West within the Sands Expo, and sensor technology also will be highlighted across the 2015 CES show floor within various Marketplaces, including Fitness & Technology, Health & Wellness, Smart Home and Wearables.

“Sensors are playing an increasingly important role in consumer electronics, as evidenced by the growing number of products containing the technology – from smartphones and fitness trackers to clothing, toys and even cars,” said Karen Chupka, senior vice president, International CES and corporate business strategy, CEA. “We’re excited to bring this experience to 2015 CES attendees, to showcase how integral  motion and gesture recognition technologies are as they provide crucial real-time data and increased personalization that is improving the way we live, work and play.”

Sensor technology exhibits at the 2015 CES are expected to span 2,500 net square feet of space in the heart of CES Tech West, which includes the Sands Expo (Sands), The Venetian, The Palazzo, Wynn Las Vegas and Encore at Wynn (Wynn/Encore). Major sensor and gesture recognition exhibitors include: Hillcrest Laboratories, InvenSense, Inc., MEMS Industry Group, SoftKinetic and Tobii Technology AB.

CES Tech West also will be home to other major destination areas including the newest innovations in lifestyle technologies, including sports, fitness and health tech, 3D printing, smart home, startups and other high-growth technologies.

The 2015 CES will feature more than 3,500 exhibitors unveiling the latest consumer technology products and services across the entire ecosystem of consumer technologies. Companies interested in exhibiting in the Sensors Marketplace should contact Tira Baror at tbaror@CE.org.