Joe uses his broadcaster’s voice in the latest DevOptics Deliver screencast. Check it out at the CloudBees YouTube Channel. And ping me if you are interested in the product, or sign up for a demo here.
This last week was Jenkins World 2017, at the Marriott Marquis in San Francisco, and CloudBees announced several new products, including one I’ve been working on for the last few months: CloudBees DevOptics.
CloudBees DevOptics provides an enterprise view of the Software Delivery Process, correlating and integrating data from different groups and tools, in a way that will let you gather real-time actionable insights into your process.
Last week we announced CloudBees DevOptics Deliver, which focuses on tracking the flow of changes through your Software Delivery Process. We leverage the ubiquity of Jenkins as the premier automation engine. Jenkins is everywhere – in all companies – and everywhere – in all places inside all companies.
CloudBees DevOptics is a SaaS service that uses a Data Collector Plugin for Jenkins to gather information on your software process, which is then processed and presented in a way that is useful to you.
We gave an initial view of this during the Jenkins World keynote and later at the booth. The reaction was very positive – as in, I was aphonic at the end of the day! Here is a picture of me waving my hands around during an explanation…
I joined CloudBees earlier this year to work on this product. I think it is a great opportunity, and I am having a blast! More details in future posts here and in the CloudBees blog, and, in the meantime, … you can request a demo.
It’s been a long run since the first Jenkins User Conference, back in October 2011, at the Marines’ Memorial Hotel and Club.
Ping me if you are around, attending, or just want to chat. At this point I’m planning to be there for the actual conference, on Wed and Thu; whether I’ll be there Mon or Tue, workshop days, will depend on the usual Gods of Software 🙂
See you around!
I’m overdue for an update on my work status. I started working at CloudBees in March, but I’ve been going full speed since then and have not had much time to write. Jenkins World is at the end of the month, so we are not exactly slowing down, but I want to get my update out before then.
Progress Software announced a New Strategic Plan during its fiscal year-end report (link) and: “Progress intends to reduce headcount by approximately 450 employees, totaling over 20% of the Company’s workforce.” So, I went looking, found different options and… CloudBees, I choose you!
There are many reasons for choosing CloudBees…
- CloudBees is the commercial home for Jenkins (home, wikipedia), the ubiquitous Automation Server,
- Software is Eating the World, and Jenkins is the main automation engine driving this. There are many great opportunities around CloudBees and Jenkins…
- I know, have worked with, and am friends with many of the CloudBees folks. Kohsuke, the creator of Hudson/Jenkins, used to work for me; I worked with Harpreet, Vivek, Dave, Alyssa, John (and others?) at Sun, and with Steve at Oracle; and over the years I’ve been on the other side of many chats with Sacha and others.
- I had an opportunity to try a new role, engineering manager, with a great team, on a great product…
So, here I am, at CloudBees. I work from the San Jose office, by the SJC airport.
Kohsuke and Harpreet are also based there, and so is Womby (Why Wombats?).
We have a great team! And we have some openings. We need:
Ping me if you want to work with me and us. As somebody said: Kick Butt and Have Fun!
I spent a couple of days at CES 2017 last week. Two days is tight for CES, and I missed a few things that were on my to-visit list, but it was still totally worth the trip. A selection of my photos is on Flickr, but here are some highlights:
Most Guts – Prosthesis
This is an exoskeleton designed for racing. It has four mechanical limbs controlled by the racer within, in a mostly prone position, one limb per arm and leg (I uploaded a video of the movement). The project is now part of Furrion Robotics; the vision of the designer (Jonathan Tippett) is a racing league. The current version was completed just before CES 2017 and still does not have the actuators on it, but here is a 1/3-scale version of the left arm actuator:
It’s very impressive and feels properly nutty… in photos and even more in real life. I created an album just for it.
Geekiest Demo – Qualcomm Drive Data Platform
Qualcomm had a very impressive demo of their Drive Data Platform. They use a Snapdragon 820Am with a camera, a fast (Cat 12) modem, and a neural network app using the Snapdragon Neural Processing Engine. Here is the (simple) camera setup:
DDP uses the camera and SNPE to recognize the objects around the car in real time, especially buildings and the like that bounce GPS signals and limit GPS accuracy. DDP can then filter out all but the direct signals from the satellites and determine the location of the car very accurately and quickly. Here is a screenshot showing all the buildings being detected in real time by DDP.
The (very accurate) location can then be used, again with computer vision, to determine the location of other elements in the camera’s field of view, like the lanes on the freeway. This information can be shared with the map system. And it can be crowdsourced!
Repeat all this, and you can get very accurate maps very quickly. I asked how large a sample they would need to do this and the presenter indicated that a large manufacturer would be able to do this by itself. Think about the implications for the mapping ecosystem!
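The multipath-filtering idea described above can be sketched in a few lines. This is a minimal, illustrative model – the interfaces, names, and geometry here are my own assumptions, not Qualcomm’s actual DDP code: treat each detected building as an azimuth/elevation mask around the car, and keep only the satellites whose line of sight clears every mask.

```typescript
// Hypothetical model of line-of-sight filtering, NOT Qualcomm's DDP API.
interface Satellite {
  id: string;
  azimuthDeg: number;   // direction to the satellite, 0-360
  elevationDeg: number; // angle above the horizon
}

interface Building {
  azimuthStartDeg: number; // azimuth span the building occupies
  azimuthEndDeg: number;
  maxElevationDeg: number; // apparent height of the building from the car
}

// Keep only satellites that no detected building occludes: a satellite
// is blocked when its azimuth falls inside a building's span and its
// elevation is below the building's apparent top.
function lineOfSightSatellites(
  sats: Satellite[],
  buildings: Building[]
): Satellite[] {
  return sats.filter(sat =>
    !buildings.some(b =>
      sat.azimuthDeg >= b.azimuthStartDeg &&
      sat.azimuthDeg <= b.azimuthEndDeg &&
      sat.elevationDeg < b.maxElevationDeg
    )
  );
}
```

With the occluded (likely reflected) satellites removed, an ordinary position fix over the remaining pseudoranges no longer carries the multipath error – which is the effect the demo showed.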
Here is the Qualcomm Blog Post on the DDP. I’m sorry I didn’t take a video of the whole demo, but it was very impressive. I think Qualcomm is doing a bunch of things right.
AI at CES
The main reason I went to CES this year was to check out all the work on AI / ML / Neural Networks. I believe that in a couple of years we will all be using these things routinely in our apps. Some of it will be at the edge, some will be in the cloud. The DDP is an example of this. The Snapdragon Neural Processing Engine (SNPE) was being demoed by itself elsewhere in the Qualcomm booth and it was very impressive. It’s designed so it can leverage the CPU, the GPU, and the DSP on the Snapdragon – very neat and fast. Very interesting times ahead.
Coolest Area – Eureka Park
The first floor of the Sands Expo contains Eureka Park. I think CES did a great job with it this year. It had areas for different types of startups: early-stage, mid-stage, university-backed, different non-US locations, Indiegogo, etc. It was very busy, with some very interesting ideas and some not so interesting, chaotic and fun. Quite different from the more organized floors elsewhere at CES. Here is the map:
The area had a bunch of things. Something that caught my attention was BioMindR – they leverage wireless signals and machine learning to continuously sense hydration, glucose, and fluid levels without contact. It reminded me, at a very different level, of the work at MIT that is behind Emerald.
There were many other interesting booths. I particularly enjoyed talking with the Sensel folks, but there were many more. The Beon camera was also fun – I’m not convinced by it as a wrist wearable, but there should be a good fit for it somewhere.
I’ll leave you with some more pictures:
Can you figure out how this works? Click on the image to see the video clip. The hint: the demo was in the Nidec booth; they specialize in motors, bearings, robotic transporters, etc. Their motors are in the Autel Robotics drones (very nice – they announced a deal with FLIR on dual thermal/visual cameras), and in many others.
And, from the very large and well-visited Xiaomi Mi booth, to show that they are serious, Mr. Hugo Barra:
Xiaomi was testing the waters for coming to the US. The prices were amazing. I would not buy one of their phones, but they had plenty of other things I’d consider purchasing, especially this electric foldable bicycle – listed at the booth for $430!
Ah, and the LG 4K OLED monitors were excellent. I’m not a monitor guy but they were very very impressive!
Bluetooth 5.0 is beginning to show up, and so is Thread. Nordic’s nRF52840 is “ready” for both standards. It’s hard for me to predict the traction for Thread, but I expect we will see Bluetooth 5.0 everywhere soon.
Worth The Trip
Totally worth the trip to Las Vegas. As in previous trips, the best part is always the conversations with the people at the booths. Special thanks to the people from Qualcomm, Intel, Pikazo, Nordic, Beon, Nidec, Shenzhen Minew, Prosthesis, AppMyHome, Sensel, Orangie, RetailNext, Autel, ChargePoint, and many more.
And check the Flickr album if you want to see more pictures and additional commentary.
Modern applications are increasingly leveraging a multiplicity of sensors to connect the physical realm to the traditional software realm to gain multiple benefits, from efficiencies of operation, to security and safety, to new functionality.
Here are some real-life examples:
- Sensors like RFID tags allow Zara, the largest apparel retailer in the world, to track their inventory accurately and in real time, from the time it arrives in the trucks to the time it leaves with a customer, providing asset control, inventory availability, smart sales recommendations, connection to social media, automatic reorders, and more.
- An industrial equipment vendor like TVH can use data from GPS and OBD-II sensors to track vehicles – their use, speed, acceleration and deceleration, fuel consumption, battery status – and reduce operating costs, from energy to maintenance to insurance.
- Kingslake, a Progress customer in Sri Lanka, provides an application for managing the transportation of employees to factories using independent mini-bus operators. The employees use NFC tags to identify themselves to the operators, who read them using rugged Android devices that can then use GPS data and connect to the backend to validate routes, provide accountability, and even notify the factory if an employee won’t show up for their shift.
- An event, like our own ProgressNEXT or the CSUMB Capstone Festival, can use a combination of RFID tags and readers and iBeacons, mediated through Mobile Applications, to provide personalized event information as well as tracking, logging and authorizing event data.
In the above examples, the server-side services can be created on top of platforms like Rollbase, Modulus, and the Telerik Platform, using data services like Progress OpenEdge and many SaaS services, and typically will interact with the clients through internet standards like HTTPS and WebSockets. The sensors themselves are part of, or interact with, mobile devices like smartphones and smart watches, and fixed devices like readers and instruments: smart door locks, access control readers, etc.
There are a few key OS platforms for these devices. The smallest sensors still run on “traditional” real-time operating systems. A number of handhelds and instruments used to run on Windows CE, but Microsoft has ended its life, transitioning to Windows 10 IoT. Meanwhile, many mobile devices are increasingly based on Android, while fixed devices are moving towards Linux – and Android Things was just announced.
Note – Beyond iOS and Android, Progress/Telerik has also been doing work on the NativeScript port for Windows Universal (for tablets and desktops), and there is a separate project on porting it to Linux leveraging the GitHub Electron Shell.
Here are some links to recent projects that are built on this setup:
- The Invengo XC-1003 is an Android (KitKat)-based mobile RFID reader. The RFID reader is available to Android apps, but Mehfuz wrote a NativeScript plugin on top of it, and then a simple app using it. Check out the blog post, the video, and the GitHub repo. The app mimics a retail situation and actually uses four plugins: an RFID reader, an iBeacon reader (also written by Mehfuz), Google Maps, and SQLite for local storage, as well as checking with a (Node.js-based) service for content.
- An app for fleet management that uses the Bluetooth APIs to talk to an OBD-II BLE sensor (like this one), and can then present the car data or push it to the cloud. Check out the blog post, video, and GitHub repo.
- We also have done several variations of an app to support events:
- A simple Meetup App where RFID tags are attached to Badge Holders and then are used to track attendance and to run a raffle. Check the blog post.
- Another, more complex version of the RFID setup was used at the ProgressNEXT 2016 event in Las Vegas. The ProgressNEXT application did more sophisticated server-side processing using the AMTech IoT platform. See the post.
- The latest version used both RFID tags (using the AMTech Platform) and iBeacons (via a plugin) and was used at the CSUMB Capstone Festival.
- Finally, the setup is identical to the one used by Kingslake in the employee transportation project mentioned at the beginning of this writeup. Kingslake currently uses a native app for the client running on the Android smartphone, but it would be straightforward to replace it with a more capable NativeScript solution.
- Our TODO pile includes two other projects:
- A Plugin for the Keonn AdvanReader 10, which is a flexible USB-connected RFID reader that can be used from an Android device. This would be very suitable for retail.
- Running a NativeScript application directly inside a ThingMagic Sargas (blog, product page). This would require using the advanced build of NativeScript that is built on top of the GitHub Electron Shell.
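The plugin-plus-app pattern running through the projects above can be sketched in TypeScript. The interface and class names here are hypothetical – they are not the actual Invengo plugin API – but the shape is the point: the native plugin surfaces tag-read callbacks, and the app logic deduplicates them into an inventory. Writing the app side against a small interface also makes it testable with a mock reader, no hardware required.

```typescript
// Hypothetical reader interface; the real Invengo plugin API differs.
interface RfidReader {
  onTag(cb: (epc: string) => void): void; // register a tag-read callback
  startInventory(): void;                 // begin scanning for tags
}

// In-memory stand-in for the hardware plugin, useful for testing:
// "reads" a fixed list of EPCs, duplicates and all.
class MockReader implements RfidReader {
  private cb: ((epc: string) => void) | null = null;
  constructor(private tags: string[]) {}
  onTag(cb: (epc: string) => void): void {
    this.cb = cb;
  }
  startInventory(): void {
    for (const t of this.tags) {
      if (this.cb) this.cb(t);
    }
  }
}

// App-side logic: RFID readers report the same tag many times per
// second, so deduplicate the reads into an inventory set.
function collectInventory(reader: RfidReader): Set<string> {
  const seen = new Set<string>();
  reader.onTag(epc => seen.add(epc));
  reader.startInventory();
  return seen;
}
```

On the device, the `MockReader` would be replaced by the plugin-backed reader; the `collectInventory` logic stays the same, which is what makes these apps quick to port between the retail, event, and fleet scenarios.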
The future belongs to this class of applications: fully connected in real time to our world, with substantial computational power – able to run the new AI algorithms, including Machine Learning – and connected to the internet and our enterprise assets. NativeScript is an excellent tool to use in this environment.