My Favorites from CES 2017

I spent a couple of days at CES 2017 last week.  Two days is tight for CES and I missed a few things that were on my to-visit list, but it was still totally worth the trip.  A selection of my photos is on Flickr, but here are some highlights:

Most Guts – Prosthesis

IMG_9517 2.jpg

This is an exoskeleton designed for racing.  It has four mechanical limbs controlled by the racer within, who lies mostly prone, one limb per leg and arm (I uploaded a video of the movement).  The project is now part of Furrion Robotics; the vision of the designer (Jonathan Tippett) is a racing league.  The current version was completed just before CES 2017 and still does not have the actuators on it, but here is a 1/3 scale version of the left arm actuator:

IMG_9523.JPG

It’s very impressive and feels properly nutty… in photos and even more in real life.  I created an album just for it.

Also check out the original website, the Indiegogo project, and the Gizmag/New Atlas article.

Geekiest Demo – Qualcomm Drive Data Platform

Qualcomm had a very impressive demo of their Drive Data Platform (DDP).  They use a Snapdragon 820Am with a camera, a fast (Cat 12) modem, and a neural network app using the Snapdragon Neural Processing Engine (SNPE).   Here is the (simple) camera setup:

IMG_9564.jpg

DDP uses the camera and SNPE to recognize the objects around the car in real time, especially buildings and the like that bounce the GPS signals and limit the accuracy of the GPS.  DDP can then filter out all but the direct signals from the satellites and determine the location of the car very accurately and quickly.   Here is a screenshot showing all the buildings being detected in real time by DDP.

IMG_9562.jpg

The (very accurate) location can then be used, again with computer vision, to determine the location of other elements in the camera's field of vision, like the lanes on the freeway.  This information can be shared with the map system.  And it can be crowdsourced!

Repeat all this, and you can get very accurate maps very quickly.  I asked how large a sample they would need to do this, and the presenter indicated that a large manufacturer would be able to do it by itself.   Think about the implications for the mapping ecosystem!
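To make the filtering idea concrete, here is a toy sketch of how I understood it (the data shapes and names are mine, invented for illustration; the real DDP pipeline is far more sophisticated and the azimuth math here ignores wraparound): model each detected building as a blocked patch of sky, then keep only satellites with an unobstructed line of sight before computing the position fix.

```javascript
// Toy sketch of line-of-sight satellite filtering, as I understood the demo.
// All data shapes are hypothetical, invented for this illustration.

// A building is modeled as an azimuth interval [fromAz, toAz] (degrees)
// that blocks the sky up to maxElevation degrees.
function isLineOfSight(sat, buildings) {
  return !buildings.some(
    (b) =>
      sat.azimuth >= b.fromAz &&
      sat.azimuth <= b.toAz &&
      sat.elevation <= b.maxElevation
  );
}

// Keep only direct-signal satellites for the position fix.
function filterSatellites(sats, buildings) {
  return sats.filter((s) => isLineOfSight(s, buildings));
}

// Example: one building blocking the eastern sky up to 40° elevation.
const buildings = [{ fromAz: 60, toAz: 120, maxElevation: 40 }];
const sats = [
  { id: "G01", azimuth: 90, elevation: 25 },  // behind the building: multipath risk
  { id: "G07", azimuth: 90, elevation: 55 },  // above the building: direct
  { id: "G12", azimuth: 200, elevation: 15 }, // clear sky: direct
];
const direct = filterSatellites(sats, buildings); // G07 and G12 survive
```

The interesting part is that the "sky map" of blocked regions comes from the camera and SNPE in real time, not from a pre-built 3D city model.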

Here is the Qualcomm Blog Post on the DDP.  I’m sorry I didn’t take a video of the whole demo, but it was very impressive.  I think Qualcomm is doing a bunch of things right.

AI at CES

The main reason I went to CES this year was to check out all the work on AI / ML / Neural Networks.  I believe that in a couple of years we will all be using these things routinely in our apps.  Some of it will be at the edge, some in the cloud.  The DDP is an example of this.  The Snapdragon Neural Processing Engine (SNPE) was being demoed by itself elsewhere in the Qualcomm booth and it was very impressive.  It's designed so it can leverage the CPU, the GPU, and the DSP on the Snapdragon; very neat and fast.  Very interesting times ahead.

Coolest Area – Eureka Park

The first floor of the Sands Expo contains Eureka Park.  I think CES did a great job with it this year.  It had areas for different types of startups: early-stage, mid-stage, university-backed, different non-US locations, Indiegogo, etc.  It was very busy, with some very interesting ideas and some not so much, chaotic and fun.  Quite different from the more organized floors elsewhere at CES.  Here is the map:

IMG_9586.JPG

The area had a bunch of things.  Something that caught my attention was BioMindR – they leverage wireless signals and machine learning to continuously sense hydration, glucose, and fluid levels without contact.  It reminded me, at a very different level, of the work at MIT that is behind Emerald.

IMG_9600.jpg

There were many other interesting booths.  I particularly enjoyed talking with the Sensel folks, but there were many more.  The Beon camera was also fun – I’m not convinced by it as a wrist wearable, but there should be a good fit for it somewhere.

Et Cetera…

I’ll leave you with some more pictures:

Can you figure out how this works?  Click on the image to see the video clip.  The hint: the demo was in the Nidec booth, and they specialize in motors, bearings, robotic transporters, etc.  Their motors are in the Autel Robotics drones (very nice; they announced a deal with FLIR on dual thermal/visual cameras), and in many others.

742804DD-E8D6-45F3-AA39-85F0D3DF3A1B.png

And, from the very large and well-visited Xiaomi Mi booth, to show that they are serious, Mr. Hugo Barra:

IMG_9488.jpg

Xiaomi was testing the waters for coming to the US.  The prices were amazing.  I would not buy one of their phones, but they had plenty of other things I’d consider purchasing, especially this foldable electric bicycle – listed at the booth for $430!

IMG_9486.JPG

Ah, and the LG 4K OLED monitors were excellent.  I’m not a monitor guy, but they were very, very impressive!

Bluetooth 5.0 is beginning to show up, and so is Thread.  Nordic’s nRF52840 is “ready” for both standards.  It’s hard for me to predict the traction for Thread, but I expect we will see Bluetooth 5.0 everywhere soon.

Worth The Trip

Totally worth the trip to Las Vegas.  As in previous trips, the best part is always the conversations with the people at the booths.  Special thanks to the people from Qualcomm, Intel, Pikazo, Nordic, Beon, Nidec, Shenzhen Minew, Prosthesis, AppMyHome, Sensel, Orangie, RetailNext, Autel, ChargePoint, and many more.

And check the Flickr album if you want to see more pics and additional commentary.

NativeScript and Modern Sensors and Instruments

NativeScriptAndInstruments-Images.jpg

Modern applications are increasingly leveraging a multiplicity of sensors to connect the physical realm to the traditional software realm to gain multiple benefits, from efficiencies of operation, to security and safety, to new functionality.

Here are some real-life examples:

  • Sensors like RFID tags allow Zara, the largest apparel retailer in the world, to track their inventory accurately and in real time, from the time it arrives on the trucks to the time it leaves with a customer, providing asset control, inventory availability, smart sales recommendations, connection to social media, automatic reorders, and more.
  • An industrial equipment vendor like TVH can use data from GPS and OBDII sensors to track cars – their use, speed, acceleration and deceleration, gas consumption, and battery status – and reduce operating costs, from energy to maintenance to insurance.
  • Kingslake, a Progress customer in Sri Lanka, provides an application for managing the transportation of employees to factories using independent mini-bus operators. The employees use NFC tags to identify themselves to the operators, who read them using rugged Android devices that can then use GPS data and connect to the backend to validate routes, provide accountability, and even notify the factory if an employee won’t show up for their shift.
  • An event, like our own ProgressNEXT or the CSUMB Capstone Festival, can use a combination of RFID tags and readers and iBeacons, mediated through Mobile Applications, to provide personalized event information as well as tracking, logging and authorizing event data.

In the above examples, the server-side services can be created on top of platforms like Rollbase, Modulus, and the Telerik Platform, using data services like Progress OpenEdge and many SaaS services, and typically will interact with the clients through internet standards like HTTPS and WebSockets.  The sensors themselves are part of, or interact with, mobile devices like smartphones and smartwatches, and fixed devices like readers and instruments – smart door locks, access control readers, etc.

There are a few key OS platforms for these devices.  The smallest sensors still run on “traditional” real-time operating systems.  A number of handhelds and instruments used to run on Windows CE, but Microsoft has ended its life, transitioning to Windows 10 IoT.  Many mobile devices are increasingly based on Android, while fixed devices are moving toward Linux – and Android Things was just announced.

NativeScript is particularly well suited to this space.  NativeScript today runs well on iOS and Android (see note below) and provides very efficient and timely access to any platform feature, including new libraries created to access the sensors.  The NativeScript metadata-driven machinery exposes these libraries at the JavaScript level, and they can then be wrapped through a Plugin into a cleaner abstraction.  This means that JavaScript and CSS developers can write applications leveraging these sensors.
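As a rough sketch of that layering (all names are hypothetical; the native side is stubbed so the shape is visible): the Plugin wraps the vendor's reader API behind a small JavaScript class, and the app code only ever sees the clean abstraction.

```javascript
// Hypothetical plugin shape. In a real NativeScript plugin, the stub
// below would be replaced by calls into the vendor's Android/iOS
// library, which NativeScript's metadata machinery exposes to JS.

// Stand-in for the native reader API (invented for illustration).
const nativeReader = {
  start(onTag) {
    // Pretend the hardware reported three tag reads.
    ["EPC-0001", "EPC-0002", "EPC-0001"].forEach(onTag);
  },
};

// The plugin's clean abstraction: events plus a read-count map.
class RfidReader {
  constructor() {
    this.seen = new Map(); // EPC -> read count
    this.listeners = [];
  }
  onTag(cb) {
    this.listeners.push(cb);
  }
  start() {
    nativeReader.start((epc) => {
      this.seen.set(epc, (this.seen.get(epc) || 0) + 1);
      this.listeners.forEach((cb) => cb(epc));
    });
  }
}

// App code never touches the native layer directly.
const reader = new RfidReader();
const tags = [];
reader.onTag((epc) => tags.push(epc));
reader.start();
```

The same pattern – native library, thin plugin, JavaScript abstraction – repeats across the projects listed below.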

Note – Beyond iOS and Android, Progress/Telerik has also been doing work on the NativeScript port for Windows Universal (for tablets and desktops), and there is a separate project on porting it to Linux leveraging the GitHub Electron Shell.

Here are some links to recent projects that are built on this setup:

  • The Invengo XC-1003 is an Android (KitKat)-based mobile RFID reader. The RFID reader is available to Android apps, but Mehfuz wrote a NativeScript Plugin on top of it, and then a simple App using it.  Check out the blog post, the video and the GitHub repo.  The App mimics a retail situation and actually uses 4 plugins: an RFID Reader, an iBeacon Reader (also written by Mehfuz), Google Maps, and SQLite for local storage, as well as checking with a (Node.js-based) service for content.
  • An App for Fleet Management that uses the Bluetooth APIs to talk to an OBDII BLE sensor (like this one), and which can then present the car data, or push it to the cloud. Check out the blog post, video and GitHub repo.
  • We also have done several variations of an App to support events:
    • A simple Meetup App where RFID tags are attached to Badge Holders and then are used to track attendance and to run a raffle. Check the blog post.
    • Another, more complex version of the RFID setup was used at the ProgressNEXT 2016 event in Las Vegas. The ProgressNEXT application did more sophisticated server-side processing using the AMTech IoT platform.  See the post.
    • The latest version used both RFID tags (using the AMTech Platform) and iBeacons (via a plugin) and was used in the CSUMB Capstone festival.
  • Finally, the setup is identical to the one used by Kingslake in the employee transportation project mentioned at the beginning of this writeup. Kingslake is currently using a native Android app for the client, but it would be straightforward to replace it with a more capable NativeScript solution.
  • Our TODO pile includes two other projects:
    • A Plugin for the Keonn AdvanReader 10, which is a flexible USB-connected RFID reader that can be used from an Android device. This would be very suitable for retail.
    • Running a NativeScript application directly inside a ThingMagic Sargas (blog, product page). This would require the advanced build of NativeScript that is built on top of the GitHub Electron Shell.
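For the Fleet Management app above, the data decoding itself is standardized: OBD-II mode-01 PIDs and their formulas come from SAE J1979.  A minimal sketch of decoding two common PIDs from raw response bytes (the BLE transport and the NativeScript plumbing are not shown; `decodePid` is my name for it):

```javascript
// Decode two standard OBD-II mode-01 PIDs (formulas per SAE J1979).
// A response to "01 0C" (engine RPM) comes back as "41 0C A B";
// a response to "01 0D" (vehicle speed) as "41 0D A".

function decodePid(hexResponse) {
  const bytes = hexResponse
    .trim()
    .split(/\s+/)
    .map((h) => parseInt(h, 16));
  const [mode, pid, a, b] = bytes; // mode is 0x41 for mode-01 responses
  switch (pid) {
    case 0x0c: // engine RPM = ((256 * A) + B) / 4
      return { pid: "rpm", value: (256 * a + b) / 4 };
    case 0x0d: // vehicle speed = A km/h
      return { pid: "speed_kmh", value: a };
    default:
      throw new Error(`PID 0x${pid.toString(16)} not handled in this sketch`);
  }
}

const rpm = decodePid("41 0C 1A F8"); // (256 * 0x1A + 0xF8) / 4 = 1726 rpm
const speed = decodePid("41 0D 4B");  // 0x4B = 75 km/h
```

In the real app these decoded values are what gets presented in the UI or pushed to the cloud.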

The future belongs to this class of applications: fully connected in real time to our world, with substantial computational power – able to run the new AI algorithms, including Machine Learning – and connected to the internet and our enterprise assets.  NativeScript is an excellent tool to use in this environment.

Placemeter and Traffic in The Willows

[I started this project back in November 2015.  The project started nicely but never reached full speed and I didn’t find a natural time to write about it… until now]

In Fall 2015 I discovered Placemeter [Washington Post, TechCrunch, website], a startup in NYC that used computer vision to track pedestrian and vehicular traffic.  At that point our neighborhood was seeing a lot of cut-through car traffic, so I started a project: use Placemeter’s services to track traffic at several key locations in the neighborhood, correlate the traffic to infer flow, and then use this to have a more educated conversation with our city council.


PlacemeterFront.png

Placemeter has two different operating modes.  In both modes, different areas of the field of vision are marked for analysis: either as a turnstile, to track objects going through it in both directions, or as a polygon, to track objects going in and out of the area delimited by the polygon.   Placemeter applies CV algorithms to count the objects, classify them (pedestrians, cars, trucks, bicycles), and analyze their speed.
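A toy version of the turnstile idea (mine, not Placemeter's code): given the tracked positions of an object over time and a counting line, record a crossing in each direction whenever the track steps over the line.

```javascript
// Toy turnstile counter: counts directional crossings of a vertical
// line x = lineX by a tracked object's centroid positions.
// Placemeter's real CV pipeline also classifies objects and
// estimates speed; none of that is modeled here.

function countCrossings(track, lineX) {
  let leftToRight = 0;
  let rightToLeft = 0;
  for (let i = 1; i < track.length; i++) {
    const prev = track[i - 1].x;
    const curr = track[i].x;
    if (prev < lineX && curr >= lineX) leftToRight++;
    if (prev >= lineX && curr < lineX) rightToLeft++;
  }
  return { leftToRight, rightToLeft };
}

// One object wandering back and forth across the line x = 100.
const track = [
  { x: 80, y: 50 },
  { x: 110, y: 52 }, // crosses left -> right
  { x: 130, y: 55 },
  { x: 90, y: 53 },  // crosses right -> left
];
const counts = countCrossings(track, 100);
```

The hard part, of course, is not the counting but the detection and tracking that produce the tracks in the first place.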

Continue reading “Placemeter and Traffic in The Willows”

Linux Inside … ThingMagic Sargas

img_8545

Our ThingMagic Sargas just arrived.  This is a small (87 mm x 80 mm x ) fixed RFID reader that packs a nice punch.  It has 2 high-performance UHF RFID antenna ports capable of reading 750 tags a second at distances over 9 meters, plus an Ethernet port, BLE, USB, 4 GPIOs, micro-SD, and HDMI.  Inside there is a 1 GHz ARM Cortex-A8 running Linux (Debian) where you can run your own code.

Continue reading “Linux Inside … ThingMagic Sargas”