Coding for robots: Need-to-know languages and skills | Code Skills

KODA advising CTO John Suit discusses the skills and languages that are important for developers who want to build software and systems for modern robots.

Robots come in all shapes and sizes, but typically they're autonomous devices that operate on their own to help us complete a task. If you're a developer looking to get into the growing field of robotics, what are the right skills to have? What languages should you know? In this episode of Dynamic Developer, we talk with John Suit, advising CTO of KODA, who can answer these questions and more. You can listen to the podcast in the player embedded in this article, watch the video above, or read the transcript of the interview below, edited for readability.

Bill Detwiler: So, let’s talk robotics. We’ve seen a lot of developments recently in 2020. We have the Boston Dynamics news around Hyundai, looking to acquire them. Then, on the positive side of things, a recognition that robotics is really taking off. On the flip side of that, you have some announcements in November 2020 around Walmart ending one of its partnerships with Bossa Nova Robotics. It seems like there’s a lot happening in the field of robotics right now. From your take, what can we expect to see in the field in 2021 and then beyond?


John Suit: I agree–there's a lot going on in robotics. My background for the last 20-25 years is actually in cybersecurity, but also in decentralized artificial intelligence and big information analytics systems, which become really, really important to robotics. When I was approached by KODA to take a look at their super cool robotic dog, it became pretty apparent that the things I was interested in were the things you'll need on a robotic platform, if that platform is going to interact with people and learn. A buddy of mine is one of KODA's co-founders, and he said, 'Hey, John, you've got to take a look at this.' He and I had worked together at several different companies, and he knows what I like and what I don't, or what I choose to work with and what I don't.

He said, 'You've got to see this dog.' Instantly, I thought of the Boston Dynamics robotic dog videos on YouTube. So I asked him right off the bat, 'Does it have a head?'

Bill Detwiler: It’s not that creepy looking, is it?

John Suit: I instantly had visions of this militarized robotic dog thing. And he said, ‘No, no, no, no, no, no.’ He sent me a picture of it, and it’s this metallic blue, very cute looking, but very capable, robotic dog. I said, ‘Well, what are you going to use these for? What’s the application?’ He told me the first application that they’re going to use them for are visually impaired kids, to help not just learn from real organic dogs, but to help learn the environment–these are true learning systems.

KODA robot dog

Image: KODA, Inc.

So I think what you're going to see in 2021 is not just advancements in the motors behind a walking robot's gait–which is one of the hardest things to accomplish for a robotic anything. You're going to see a variety of sensors coming into play that weren't available before. Think about it: the camera array on your Roomba vacuum cleaner does VSLAM. That's pretty advanced. Then you have LIDAR systems that are very good at depth and very good at building an array of images of something. All these sensors–audio sensors, directional sensors–come into play when you're talking about robots. To put all that together and actually do something with it, you're talking about decentralized artificial intelligence. You have to have a decentralized file system to deal with that, and blockchain has made great advancements in that area.

SEE: Robotics in the enterprise (free PDF) (TechRepublic)

I think a lot of people hear blockchain and think digital currency. That's true, but it's also a great way for a lot of things that are all learning something to report in to a general ledger, so they can share information in order to all learn how to do something. Let's say I had a KODA that was in New York City helping a visually impaired kid navigate the streets–stopping at intersections, understanding when a siren goes, understanding the direction of an ambulance that's coming and knowing to stop on the curb. That kind of stuff. When a KODA in Aspen learns how to walk in the snow, and the KODA that was in New York later goes on vacation somewhere snowy, it will have that knowledge because the other KODA learned it. That's the kind of thing that I think is particularly cool.
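The pattern Suit describes–each robot learning locally, then publishing what it learned to a shared, tamper-evident log that peers can pull from–can be sketched in a few lines. The classes and field names below (`SkillLedger`, `Robot`, `walk_in_snow`) are purely illustrative assumptions, not KODA's actual architecture:

```python
# Illustrative sketch: robots share learned skills via an append-only,
# hash-chained ledger. All names here are hypothetical, not KODA's API.

import hashlib
import json


class SkillLedger:
    """Append-only log of learned skills, shared by all robots."""

    def __init__(self):
        self.entries = []

    def append(self, robot_id, skill, model_params):
        record = {"robot": robot_id, "skill": skill, "params": model_params}
        # Chain each entry to the previous one's hash so peers can
        # verify the log has not been altered (the blockchain idea).
        prev = self.entries[-1]["hash"] if self.entries else ""
        record["hash"] = hashlib.sha256(
            (prev + json.dumps(record, sort_keys=True)).encode()
        ).hexdigest()
        self.entries.append(record)


class Robot:
    def __init__(self, robot_id, ledger):
        self.robot_id = robot_id
        self.ledger = ledger
        self.skills = {}

    def learn(self, skill, model_params):
        """Learn locally, then publish the result to the shared ledger."""
        self.skills[skill] = model_params
        self.ledger.append(self.robot_id, skill, model_params)

    def sync(self):
        """Pull in skills that other robots have published."""
        for entry in self.ledger.entries:
            self.skills.setdefault(entry["skill"], entry["params"])


ledger = SkillLedger()
nyc = Robot("koda-nyc", ledger)
aspen = Robot("koda-aspen", ledger)

aspen.learn("walk_in_snow", {"stride": 0.4, "lift": 0.12})
nyc.sync()
print("walk_in_snow" in nyc.skills)  # -> True
```

A real system would add the security layers Suit mentions later–permissions on what to share, protection for data in motion and at rest–but the core flow is just this: learn, publish, sync.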

Building a robot that shares information with other robots

Bill Detwiler: I’d love to drill down on that a little bit and talk about those technologies that are making robotics really possible in ways that we didn’t think about just even a decade ago, because I think most people think about how we’ve got automation, we’ve had that in factories for decades. We understand that. We understand how robotics maybe can work in warehouses and distribution. We understand how robotics works in drones. I think a lot of people also understand how there may be a client-server relationship between a robot and the cloud. But, there’s still this concept that you talked about as distributed computing, and being able to do processing on the devices. And then not just share that information up to a central location, but actually to have processing done on the devices that then is shared with other devices. That part of the overall system is kind of new, at least in the robotics field.

John Suit: It's very new. The whole concept of decentralized artificial intelligence gives you the benefit of learning something that the rest of the devices, or peer devices, may not have the opportunity to learn, simply because they may not have the terrain, or they may not have a particular circumstance–just like people. One of the things you have to do right off the bat is security. You have to give permission for what to share and what not to share. You have to have infrastructure–data in motion, data at rest–all of that needs to be taken care of. Once you have all that figured out, though, it really opens up the advantage: I can now not just share things that I learned, but I can learn things differently, even in similar environments, because of this concept called ephemeral memory.

SEE: Hiring Kit: Robotics Engineer (TechRepublic Premium)

If you can get any robot to the level where it learns something, and then I force it to forget and learn it again–if you keep doing that, you end up with many different ways to solve a problem. You find the optimal way to solve a problem for any given condition, which is very similar to how we work. That's the idea. In this case, I got interested in a robot dog: once it learns how to climb stairs, I want it to forget how to climb stairs, because I want it to reason another way. I want it to reason another way again, and then again. What you get is a better and better robot dog over time.
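This forget-and-relearn loop amounts to repeated learning runs from scratch, keeping only a record of each outcome and selecting the best solution across runs. Here's a minimal toy sketch of that idea; the task, cost function, and "optimal" value are made up for illustration:

```python
# Toy sketch of "ephemeral memory": solve the same task repeatedly from
# scratch, remember each outcome, then keep the best solution found.
# learn_to_climb is a stand-in for a real learning run, not KODA's system.

import random


def learn_to_climb(seed):
    """Stand-in for one learning run: returns a policy and its cost."""
    rng = random.Random(seed)
    policy = {"step_height": rng.uniform(0.10, 0.30)}
    # Pretend a 0.18 m step height is optimal for these stairs.
    cost = abs(policy["step_height"] - 0.18)
    return policy, cost


solutions = []
for attempt in range(10):
    policy, cost = learn_to_climb(seed=attempt)  # learn from scratch
    solutions.append((cost, policy))             # remember the outcome...
    # ...then "forget" the policy itself and start over next iteration

# Across all the forgotten attempts, keep the best solution found.
best_cost, best_policy = min(solutions, key=lambda s: s[0])
```

Each run explores a different way to solve the same problem; the aggregate record is what gets refined into the shareable knowledge Suit describes next.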

When the dog gets to the point where it can share all this aggregate learning with other dogs, you have refined knowledge. Maybe not wisdom yet, but you definitely have knowledge it can share. That happens because of the ability to reason on the device. Everybody thinks of rules engines and inference engines, but there's true reasoning that can take place–it happens because of the variety of sensors. Obviously, there are many GPUs, there's lots of memory and a lot of processing power on these robots, but what matters is the availability of what's happening in the real world, brought in by the sensors–in some cases, more than we have. We hear and see a very narrow band of the energy around us; they have a somewhat wider aperture. In some cases they have more information, in some cases less. But it's their ability to take that information, mesh it and reason over it that ultimately gets you something that can work well in many different environments.

KODA’s robotic dog


Image: KODA

Bill Detwiler: How critical is that to the development of robotics in general? I mean, robots need to be able to serve a specific purpose. I think when you say 'robot' to a person, they think of the sci-fi version we've seen for generations, going back to automatons–a general-purpose robot that looks humanoid in shape and can do everything. And that doesn't seem to be the way it's going. Although there have been some advances in robots that do look that way, it seems that we're having success with robots that are more specialized in how they look and the functions they perform. Where do you see that going? How specialized do you see robots being? KODA is a dog. It looks like a dog. It functions like a dog. But you're talking about a purpose that isn't just for a pet, right?

John Suit: That’s right.

IoT can be a roadmap for purpose-built robots

Bill Detwiler: So we’ve all seen the toys that you would call toys that are dogs, that are designed to be for entertainment. But what you’re talking about is a much higher level use and role for this device to play.

John Suit: Yeah. You're right. It is a very high-end computing platform. And it's not just a computing platform, it's a very high-end reasoning engine. Let's say you have a KODA and it needs to recharge, and you want it to mine digital currency–that's the kind of processing power we're talking about. So it literally makes money for you while it recharges and sleeps. It's that level we're talking about. And it's a good point. Purpose-built robotics, or even purpose-built IoT devices. I mean, there have been huge advancements in IoT just recently. Look at what happened at MIT with MCUNet.

Because you have such limited resources on IoT devices, they figured out a way to optimize the engine–basically, the search space–so that you're not hindered by that limited availability, because algorithmically it can adjust to whatever it has available at that moment in time. And I think what they will figure out, or maybe they already have, is that they'll ultimately be able to do that dynamically. What they've shown so far is that they don't have to tailor it for every platform, which is amazing.
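The core idea–adjusting what runs based on the resources currently available, rather than hand-tailoring for each platform–can be illustrated with a simple selection policy. This is a sketch in the spirit of that approach, not MCUNet's actual algorithm; the variant names, memory figures, and accuracies below are invented:

```python
# Sketch of resource-adaptive model selection: pick the most accurate
# model variant that fits the memory currently available on the device.
# All variants, sizes, and accuracies here are hypothetical.

MODEL_VARIANTS = [
    # (name, peak memory needed in KB, relative accuracy)
    ("net-large", 512, 0.92),
    ("net-medium", 256, 0.89),
    ("net-small", 128, 0.84),
]


def select_model(available_kb):
    """Return the most accurate variant that fits the current budget."""
    for name, mem_kb, acc in MODEL_VARIANTS:  # ordered best-first
        if mem_kb <= available_kb:
            return name
    raise RuntimeError("no variant fits the available memory")


print(select_model(300))  # -> net-medium
print(select_model(600))  # -> net-large
```

Re-running the selection as memory is freed or consumed is, in miniature, the on-the-fly tuning Suit speculates about next.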

SEE: 5 Internet of Things (IoT) innovations (free PDF) (TechRepublic)

But I think they're also going to very quickly figure out how to adjust their algorithms dynamically as resources become available or are no longer available, so it can tune itself on the fly. And that's when you can start to mesh IoT devices together. So to your point, yeah, I think you'll see some robots that are specialized, but I also think you're going to see robots that can do a lot of different things, and they can work together.

Bill Detwiler: Is that only possible through some type of distributed computing?

John Suit: Yeah.

Bill Detwiler: An AI driven system that just wasn’t possible in the past?

John Suit: Yeah.

AI is crucial for next-generation robots

Bill Detwiler: I think back to the first
