CloudBees DevOptics at DevOps World | Jenkins World 2018 (from CloudBees Blog)

This is an indirect link to the official post.

Last month I attended our annual DevOps World | Jenkins World 2018 in San Francisco. The event has gotten bigger every year, and – spoiler alert! – next year the conference will be at the San Francisco Moscone Center.

CloudBees DevOptics was an integral part of the event and this post will highlight new updates and features to the product. But first, I’ll give you a recap of what happened in San Francisco…

The rest of the post is HERE.


It’s Friday Night – Do You Know What Your Code is Doing?

(this is a reprint from the post at Blogs.CloudBees.com from Sept 6th, 2017)

Marc Andreessen wrote back in 2011 that Software is Eating the World, and nowadays business execs everywhere – from GE to Ford to ABB – are saying that their companies are really Software Companies.


Hold that thought…

Now look at a typical process for creating, testing, and releasing Software.  What you see is a multiplicity of individuals and groups collaborating to coordinate how ideas are converted into code that is tested, integrated, deployed, measured, and evolved, in faster and faster cycles.  And companies – all companies – are investing heavily to improve these processes to gain a competitive advantage, or just to remain competitive!


Next, peek at these Software Processes and you see automation and improved information flow and coordination as the key enablers.  And our friend Jenkins is one of a handful of tools that are Everywhere (in all companies) and Everywhere (in all places inside a company).

My employer, CloudBees, is the home for Enterprise Jenkins.  CloudBees Jenkins Enterprise and CloudBees Jenkins Team address running Jenkins in your Enterprise…


But!

What is missing is a way to connect together all these automation engines, and the other agents in these Software Processes, so you can gain insight into and improve your Software Process…

So, today, CloudBees is announcing CloudBees DevOptics!


CloudBees DevOptics connects all the islands of software development and provides you with a global view of your software process.   We, of course, connect to all your CloudBees Jenkins instances, but we will also connect to Open Source Jenkins instances; and more.

[Screenshot: DevOptics gate view]

The screenshot above shows a very simple Software Process involving 3 Jenkins instances, used to create two Components (Plugins A and B) and then a combined Artifact that integrates those two components.

DevOptics collects the information from these 3 instances – they may be anywhere in your organization – and makes sense of the changes that are flowing through them.   The screenshot shows the tickets that have gone through the system – in this case filtered to the last 14 days – and where those changes are within these 3 “Gates”.  Some changes are still in the Components, others have gone through the integration – or perhaps are being tested right now.

As you can see, DevOptics can show you details about these changes.  We connect back to the defect tracking system – JIRA in this case – to get the current data on those tickets.  We also provide (though it is not shown above) a connection back to the code repository (GitHub in this case) and, of course, to the automation engine itself, the Jenkins job.

In future blogs I will talk more about DevOptics.  In the meantime, you can check out today’s keynote announcement (video) and also the booth demo we are doing at JenkinsWorld (video).


Today is our announcement; we will launch later in the year.  Go and contact our friendly CloudBees Sales Team to engage with us. DevOptics is a SaaS product, so we will iterate quickly to add features and to accommodate feedback and new customer needs.

And remember, all Companies are Software Companies!  Regardless of the industry your company is in, you can benefit from CloudBees DevOptics! 🙂

Yes, we are taking over the world.  Go Team!

 

Jenkins World 2017

It’s been a long run since the first Jenkins User Conference, back in October 2011, at the Marines’ Memorial Hotel and Club.

The 2017 edition, now called Jenkins World, is at the end of this month, now at the much bigger SF Marriott Marquis Hotel, at 780 Mission Street.


Ping me if you are around, attending, or just want to chat.  At this point I’m planning to be there for the actual conference, on Wed and Thu; whether I’ll be there Mon or Tue, workshop days, will depend on the usual Gods of Software 🙂

See you around!

And now at CloudBees

I’m overdue for an update on my work status.  I started working at CloudBees in March, but I’ve been going full speed since then and have not had much time to write.  Jenkins World is at the end of the month, so we are not exactly slowing down, but I want to get my update out before then.

Progress Software announced a New Strategic Plan during its FY Year End report (link) and: “Progress intends to reduce headcount by approximately 450 employees, totaling over 20% of the Company’s workforce.”  So, I went looking, found different options and… CloudBees, I choose you!

There are many reasons for choosing CloudBees…

  • CloudBees is the commercial home for Jenkins (home, wikipedia), the ubiquitous Automation Server.
  • Software is Eating the World, and Jenkins is the main automation engine driving this.  There are many great opportunities around CloudBees and Jenkins…
  • I know, have worked with, and am friends with many of the CloudBees folks.  Kohsuke, the creator of Hudson/Jenkins, used to work for me.  I have worked with Harpreet, Vivek, Dave, Alyssa, John (and others?) at Sun, with Steve at Oracle, and over the years I’ve been on the other side of many chats with Sacha and others.

and…

  • I had an opportunity to try a new role, engineering manager, with a great team, on a great product…

So, here I am, at CloudBees.  I work from the San Jose office, by the SJC airport.


Kohsuke and Harpreet are also based there, and so is Womby (Why Wombats?).


We have a great team!  And we have some openings.

Ping me if you want to work with me and the team.  As somebody said: Kick Butt and Have Fun!


My Favorites from CES 2017

I spent a couple of days at CES 2017 last week.  Two days is tight for CES, and I missed a few things that were on my to-visit list, but it was still totally worth the trip.  A selection of my photos is on Flickr, but here are some highlights:

Most Guts – Prosthesis

This is an exoskeleton designed for racing.  It has four mechanical limbs, one per arm and leg, controlled by the racer within, in a mostly prone position (I uploaded a video of the movement).  The project is now part of Furrion Robotics; the vision of the designer (Jonathan Tippett) is a racing league.  The current version was completed just before CES 2017 and still does not have the actuators on it, but here is a 1/3-scale version of the left-arm actuator:

[Photo: 1/3-scale left-arm actuator]

It’s very impressive and feels properly nutty… in photos and even more in real life.  I created an album just for it.

Also check out the original web site, the Indiegogo project, and the Gizmag/New Atlas article.

Geekiest Demo – Qualcomm Drive Data Platform

Qualcomm had a very impressive demo of their Drive Data Platform (DDP).  They use a Snapdragon 820Am with a camera, a fast (Cat 12) modem, and a neural network app using the Snapdragon Neural Processing Engine (SNPE).   Here is the (simple) camera setup:

[Photo: camera setup]

DDP uses the camera and SNPE to recognize the objects around the car in real time, especially buildings and the like that bounce the GPS signals and limit the accuracy of the GPS.  DDP can then filter out all but the direct signals from the satellites and determine the location of the car very accurately and quickly.   Here is a screenshot showing all the buildings that are being detected in real time by DDP.

[Screenshot: buildings detected in real time]

The (very accurate) location can then be used, again with computer vision, to determine the location of other elements in the field of vision of the camera, like the lanes on the freeway.  This information can be shared with the map system.  And it can be crowdsourced!

Repeat all this, and you can get very accurate maps very quickly.  I asked how large a sample they would need to do this and the presenter indicated that a large manufacturer would be able to do this by itself.   Think about the implications for the mapping ecosystem!

Here is the Qualcomm Blog Post on the DDP.  I’m sorry I didn’t take a video of the whole demo, but it was very impressive.  I think Qualcomm is doing a bunch of things right.

AI at CES

The main reason I went to CES this year was to check out all the work on AI / ML / Neural Networks.  I believe that in a couple of years we all will be using these things routinely in our apps.  Some of it will be at the edge, some will be in the cloud.  The DDP is an example of this.  The Snapdragon Neural Processing Engine (SNPE) was being demoed by itself elsewhere in the Qualcomm booth and it was very impressive.  It’s designed to leverage the CPU, the GPU and the DSP on the Snapdragon – very neat and fast.  Very interesting times ahead.

Coolest Area – Eureka Park

The first floor of the Sands Expo contains Eureka Park.  I think CES did a great job with it this year.  It had areas for different types of startups: early-stage, mid-stage, University-backed, different non-US locations, Indiegogo, etc.  It was very busy, with some very interesting ideas and some not so much – chaotic and fun.  Quite different from the more organized floors elsewhere at CES.  Here is the map:

[Photo: Eureka Park map]

The area had a bunch of things.  Something that caught my attention was BioMindR – they leverage wireless signals and machine learning to continuously monitor hydration, glucose and fluid levels without contacts.  It reminded me, at a very different level, of the work at MIT that is behind Emerald.


There were many other interesting booths.  I particularly enjoyed talking with the Sensel folks, but there were many more.  The Beon camera was also fun – I’m not convinced by it as a wrist wearable, but there should be a good fit for it somewhere.

Et Cetera…

I’ll leave you with some more pictures:

Can you figure out how this works?  Click on the image to see the video clip.  The hint is: the demo was in the Nidec booth; they specialize in motors, bearings, robotic transporters, etc.  Their motors are in the Autel Robotics drones (very nice – they announced a deal with FLIR on dual thermal/visual cameras), and in many others.

[Image: Nidec booth demo]

And, from the very large and well-visited Xiaomi Mi booth, to show that they are serious, Mr. Hugo Barra:

[Photo: Hugo Barra at the Xiaomi Mi booth]

Xiaomi was testing the waters for a US launch.  The prices were amazing.  I would not buy one of their phones, but they had plenty of other things I’d consider purchasing, especially this electric foldable bicycle – listed at the booth for $430!

[Photo: electric foldable bicycle]

Ah, and the LG 4K OLED monitors were excellent.  I’m not a monitor guy, but they were very, very impressive!

Bluetooth 5.0 is beginning to show up, and so is Thread.  Nordic’s nRF52840 is “ready” for both standards.  It’s hard for me to predict the traction for Thread, but I expect we will see Bluetooth 5.0 everywhere soon.

Worth The Trip

Totally worth the trip to Las Vegas.  As in previous trips, the best things are always the conversations with the booth folks.  Special thanks to the people from Qualcomm, Intel, Pikazo, Nordic, Beon, Nidec, Shenzhen Minew, Prosthesis, AppMyHome, Sensel, Orangie, RetailNext, Autel, ChargePoint and many more.

And check the Flickr album if you want to see more pics and additional commentary.

NativeScript and Modern Sensors and Instruments


Modern applications are increasingly leveraging a multiplicity of sensors to connect the physical realm to the traditional software realm to gain multiple benefits, from efficiencies of operation, to security and safety, to new functionality.

Here are some real-life examples:

  • Sensors like RFID tags allow Zara, the largest apparel retailer in the world, to track their inventory accurately and in real time, from the time it arrives in the trucks to the time it leaves with a customer, providing asset control, inventory availability, smart sales recommendations, connections to social media, automatic reorders, and more.
  • An industrial equipment vendor like TVH can collect data from GPS and OBDII sensors to track vehicles – their use, speed, acceleration and deceleration, gas consumption, and battery status – and reduce operating costs, from energy to maintenance to insurance.
  • Kingslake, a Progress customer in Sri Lanka, provides an application for managing the transportation of employees to factories using independent mini-bus operators. The employees use NFC tags to identify themselves to the operators, who read them using rugged Android devices that can then use GPS data and connect to the backend to validate routes, provide accountability, and even notify the factory if an employee won’t show up for their shift.
  • An event, like our own ProgressNEXT or the CSUMB Capstone Festival, can use a combination of RFID tags and readers and iBeacons, mediated through Mobile Applications, to provide personalized event information as well as tracking, logging and authorizing event data.

In the above examples, the server-side services can be created on top of platforms like Rollbase, Modulus and the Telerik Platform, using data services like Progress OpenEdge and many SaaS services, and typically will interact with the clients through internet standards like HTTPS and WebSockets.  The sensors themselves are part of, or interact with, mobile devices like smartphones and smartwatches, and fixed devices like readers and instruments – smart door locks, access control readers, etc.

There are a few key OS platforms for these devices.  The smallest sensors still run on “traditional” real-time operating systems.  A number of handhelds and instruments used to run on Windows CE, but Microsoft has ended its life and is transitioning to Windows 10 IoT.  Many mobile devices are increasingly based on Android, while fixed devices are moving towards Linux – and Android Things was just announced.

NativeScript is particularly well suited to this space.  NativeScript today runs well on iOS and Android (see the note below) and provides very efficient and timely access to any platform feature, including new libraries created to access the sensors.  The NativeScript metadata-driven machinery exposes these libraries at the JavaScript level, and they can then be wrapped through a plugin into a cleaner abstraction.  This means that JavaScript and CSS developers can write applications leveraging these sensors.
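The wrap-the-native-API-in-a-plugin pattern described above can be sketched in a few lines of TypeScript.  This is only an illustration: in a real NativeScript plugin the `nativeReader` object below would be a platform library surfaced automatically by the metadata machinery (for example, a Java class on Android), so here it is stubbed out, and all the names (`nativeReader`, `RfidReaderPlugin`, `TagEvent`) are hypothetical.

```typescript
// Stub for what a metadata-exposed native RFID API might look like.
// A real device would stream tag reads asynchronously; here we simulate
// two tag sightings so the shape of the abstraction is clear.
const nativeReader = {
  startInventory(callback: (epcHex: string) => void): void {
    ["3005FB63AC1F3681EC880468", "3005FB63AC1F3681EC880469"].forEach(callback);
  },
  stopInventory(): void {
    // In a real plugin this would stop the hardware scan.
  },
};

// The cleaner abstraction a plugin would offer to app developers.
interface TagEvent {
  epc: string;    // the tag's EPC, as a hex string
  readAt: Date;   // when the tag was last seen
}

class RfidReaderPlugin {
  private tags = new Map<string, TagEvent>();

  // Wrap the callback-style native API into a simple, deduplicated
  // collection of tag events that JavaScript/TypeScript apps can consume.
  scan(): TagEvent[] {
    nativeReader.startInventory((epcHex) => {
      this.tags.set(epcHex, { epc: epcHex, readAt: new Date() });
    });
    nativeReader.stopInventory();
    return Array.from(this.tags.values());
  }
}
```

An app would then simply call `new RfidReaderPlugin().scan()` and work with plain `TagEvent` objects, never touching the native API directly – which is the cleaner abstraction the plugin layer provides.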

Note – Beyond iOS and Android, Progress/Telerik has also been doing work on the NativeScript port for Windows Universal (for tablets and desktops), and there is a separate project porting it to Linux leveraging GitHub Electron Shell.

Here are some links to recent projects that are built on this setup:

  • The Invengo XC-1003 is an Android (KitKat)-based mobile RFID reader. The RFID reader is available to Android apps, but Mehfuz wrote a NativeScript plugin on top of it, and then a simple App using it.  Check out the blog post, the video and the GitHub repo.  The App mimics a retail situation and actually uses 4 plugins – an RFID reader, an iBeacon reader (also written by Mehfuz), Google Maps, and SQLite for local storage – as well as checking with a (node.js-based) service for content.
  • An App for fleet management that uses the Bluetooth APIs to talk to an OBDII BLE sensor (like this one), and which can then present the car data or push it to the cloud. Check out the blog post, video and GitHub repo.
  • We have also done several variations of an App to support events:
    • A simple Meetup App where RFID tags are attached to Badge Holders and then are used to track attendance and to run a raffle. Check the blog post.
    • Another, more complex version of the RFID setup was used at the ProgressNEXT 2016 event in Las Vegas. The ProgressNEXT application did more sophisticated server-side processing using the AMTech IoT Platform.  See the post.
    • The latest version used both RFID tags (via the AMTech Platform) and iBeacons (via a plugin) and was used at the CSUMB Capstone Festival.
  • Finally, the setup is identical to the one used by Kingslake in the employee transportation project mentioned at the beginning of this writeup. Kingslake currently uses a native App for the Android smartphone client, but it would be straightforward to replace it with a more capable NativeScript solution.
  • Our TODO pile includes two other projects:
    • A Plugin for the Keonn AdvanReader 10, which is a flexible USB-connected RFID reader that can be used from an Android device. This would be very suitable for retail.
    • Running a NativeScript application directly inside a ThingMagic Sargas (blog, product page). This would require the advanced NativeScript build that runs on top of GitHub Electron Shell.

The future belongs to this class of applications: fully connected in real time to our world, with substantial computational power – able to run the new AI algorithms, including machine learning – and connected to the internet and our enterprise assets.  NativeScript is an excellent tool to use in this environment.