NASA Edge | Technology Demonstration Missions Part 3
ANNOUNCER: The future of space tele-robotic communication and operations begins with the K-10 Technology Demonstration Mission. How are astronauts in space trained to control the K-10 robot on the ground at NASA Ames? What are the future uses of tele-robotics in space exploration? Grab your remote and find out next on NASA EDGE.
BLAIR: Okay guys, for this episode I’m reprising my role as principal investigator for Technology Demonstration Missions. That explains my seeming lack of knowledge on the topic we’re going to discuss today.
CHRIS: Did NASA approve this?
BLAIR: Um, pending. It’s pending but I did so well last time. I think it will be no problem. Just like MEDLI, just like my success story, and case study, everything. I’ll be set. Just go about your TDM business and I’m going to be evaluating and asking key questions.
FRANKLIN: Well, Chris & I had a good time out at the Ames Research Center covering the K10 TDM test. It was an early call time. We got there around three o’clock in the morning. Astronauts were awake. Test team was awake. We were wide awake.
BLAIR: Were you really wide awake at four in the morning?
CHRIS: That’s debatable. It was actually the third test in a series of three tests where Karen Nyberg, who was an astronaut on Station at the time, remotely operated the K10 Rover on the Roverscape.
BLAIR: Had she been the operator on the previous mission or did they pass the baton amongst the astronauts?
CHRIS: They had three different astronauts. Karen was the third in the tests. We had a chance to go out there, luckily, to cover her run.
FRANKLIN: Yes. Chris talked to some of the engineers and software people who work with the user interface that Karen was using on the ISS. I was actually in the control room at Ames with some researchers as they went through the test. It was a pretty interesting morning.
CHRIS: But first, I had a chance to talk with Terry Fong, who is the project manager for Human Exploration Telerobotics. He gave us the top-level approach of the whole surface telerobotics test conducted at Ames.
CHRIS: Terry, tell us about the surface telerobotics test that we’re going to be watching today.
TERRY: Today, we’re having Karen Nyberg on the International Space Station remotely operate the K10 robot here in the Roverscape at NASA Ames.
CHRIS: You’ve been testing all summer. Have you been doing the same tests throughout the summer or different set ups?
TERRY: Well, the overall goal for surface telerobotics is to look at how astronauts in space, on a spacecraft like the Space Station, could operate robots on the surface of other planets. We’ve been simulating a possible future lunar mission where you would have an astronaut in orbit above the moon control a robot to do work on the lunar far side. We conducted two test sessions before today. Today, Karen Nyberg is going to help us round out the overall testing.
CHRIS: What are the objectives we are looking at today?
TERRY: Today, we are trying to look at the deployment of a simulated radio telescope. Karen is going to be in charge of laying out a telescope with the K10 rover, then after that, she is going to use the rover to inspect it and document the actual deployed location of the telescope. I think for us the whole technology of telerobotics or remotely operating a robot is something that’s very important for NASA in the future. On Space Station today, we operate robot arms just on the other side of the bulkhead. It’s not that far away. For the future, we’d like to be able to extend the astronaut’s reach all the way down from orbit to the surface.
CHRIS: What does it take to pull a test like this off?
TERRY: This test really involves collaboration between engineers, roboticists, scientists, flight controllers, mission operators, really a large team. There’s work being done here at NASA Ames, Jet Propulsion Laboratory, and also, some colleagues at the University of Colorado Boulder. It’s a large team. We’re all working together to understand how we can remotely operate robots and really improve the way astronauts can explore other places.
CHRIS: Tell us the difference between the three different tests that have been conducted here this summer.
TERRY: Surface telerobotics, as a whole, is looking to simulate a future possible mission. This mission would involve deploying a radio telescope on the lunar far side. What we’ve done through our three test sessions is to break that mission apart and test different phases throughout the three sessions. In the first session, Chris Cassidy used the robot to survey the terrain behind me. It’s a simulated lunar environment. During the second session, Luca Parmitano deployed a telescope using a film-based telescope deployed off the back of the robot. Today, during the third session, Karen Nyberg is going to go out and inspect that deployed telescope.
CHRIS: What do you see as the next steps for the surface telerobotics testing?
TERRY: Our testing this summer has really been about collecting basic information to understand how to build this kind of system for the future. Once we finish the testing today, we’re going to spend some time looking at all the data collected over the past three sessions to figure out how we actually design and build a remotely operated robot.
CHRIS: It’s fair to say at some point in the future, maybe in your lifetime, that instead of controlling Curiosity from the planet and sending commands, we could actually have astronauts in orbit around Mars operating a robot.
TERRY: Yeah, that’s exactly right. It’s our goal to enable astronauts to make use of these robots from any place in the solar system, whether they’re controlled from Earth, from orbit or maybe from the surface or habitats having robots work out in the field. It’s really all the same thing; remotely operated robots improve the way we do exploration.
CHRIS: All this technology you are using here today, how does that benefit us in the public sector?
TERRY: I think what really excites me about this technology is really improving the way we can use robots to do work remotely. We know already that on Earth you can use robots to explore the depths of the ocean from the surface. In space, we’re trying to use robots like K10 to allow astronauts to remotely explore other planets, but, here on Earth, you can use the same approach. You or I could actually work at an office in Houston while living in Hawaii. The whole idea of being able to live in one place and work remotely is what we’re trying to enable using robotics technology.
CHRIS: Is K10 an actual concept that would be used on Mars down the road?
TERRY: K10, for us, is a research robot. It’s got a lot of interesting technology that we hope to one day see in flight. The software that’s onboard, the software that is off board, descendants of those pieces of software may one day end up in an actual flight rover.
CHRIS: Is this really the prequel to R2-D2?
TERRY: You know, I think we’re on a path to eventually have robots like the robots we see in the movies. One of the things that is so much fun about my job is trying to make science fiction into reality.
CHRIS: You’re laying the foundation for the future?
TERRY: We hope so. Maybe not just the pavement, maybe we’re moving along that road together.
BLAIR: I don’t want to immediately bring up questions, but since I’m doing an investigation…
CHRIS: Is this really a pre-investigation or…?
BLAIR: It’s a pre- pre-investigation. It’s the very early stages. My first question is: you guys claimed you were out at four in the morning.
BLAIR: All I’m seeing is pure daylight in this interview. You’ve got to explain to me what’s going on. Were you really up early or are you trying to pull the wool over my eyes? I’ve got to know. I’ve got to get the details correct.
FRANKLIN: We were sitting in the morning sun.
BLAIR: Morning sun.
FRANKLIN: Yes, it was just coming up over the horizon. We had to get there early to set up. The team was in place and it was done under the cover of darkness.
CHRIS: We actually started taking video when it was pitch dark outside. The cool thing you need to put in your investigation or maybe ask if you are going to call Terry personally and ask him some questions is the fact that one day astronauts will be operating, maybe not K-10, but other robots whether they’re orbiting Mars, an asteroid, the moon or other planetary bodies. This is paving the groundwork for that.
BLAIR: Yeah. I like the way you phrase that because you’re really getting to the point where you think differently. Usually we think of landers going to a surface and deploying a rover. Now it’s a rover, it’s other robots. It’s perhaps some other aerial craft that you may be controlling, a lot of telerobotics in the future. My question is then, how are these astronauts, particularly the ones who participated in this test, even prepared to do this? How do you train to run a telerobotic situation while you’re up on station?
CHRIS: He’s the man.
FRANKLIN: That brings us to our second interview with Maria Bualat. She talked to me about this in the Multi-Mission Operations Center.
MARIA: They had no training on the ground before they went up to Station. We’re doing onboard training or just-in-time training.
FRANKLIN: Correspondence course?
MARIA: Sort of, actually. They read the manual for the GUI, the crew GUI.
FRANKLIN: So, is this a video game they’re playing?
MARIA: It is in a way.
FRANKLIN: Did they really read the manual?
MARIA: Some more than others.
FRANKLIN: You can tell?
MARIA: Yeah, you can tell to some extent. We’ve found that the GUI is very easy to use. We’ve gotten positive feedback.
FRANKLIN: Before we go any further, what’s GUI?
MARIA: Graphical User Interface.
MARIA: It’s the interface, the screen that you work with, buttons and images.
MARIA: They’re finding it very intuitive. They’re finding using the GUI very easy, where the manual helps more with the operational issues like what do I do when I encounter this. It’s not so much that they don’t know how to make the robot do things, it’s more of what do I do in this situation?
FRANKLIN: If I were to sit down at a console and you put these controls in front of me and said operate the robot, would I be able to do it just by looking at the screen? Are there any kind of onscreen directions?
MARIA: Pretty close. Do you have a DVR?
FRANKLIN: Yes, actually I do.
MARIA: Well, the controls for running plans on the robots, the task sequences are basically like DVR controls. There’s a play button. There’s a pause button.
FRANKLIN: Idiot proof?
MARIA: Pretty close.
FRANKLIN: Okay. Alright.
MARIA: The robot actually saves itself. Even if you tell it to run into a rock, it will refuse to do so. The robot says, uh-uh, there’s an obstacle there. I can’t go that way.
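The "robot saves itself" behavior Maria describes can be sketched as a simple guard on drive commands: before moving, check the target cell against the terrain map and refuse to enter an obstacle. This is a minimal illustration only; the names, map format, and logic are invented for this sketch and are not the actual K10 flight software.

```python
# Hypothetical sketch of the self-protecting drive behavior: the rover
# refuses any command that would move it into a cell marked as an obstacle.

OBSTACLE = "red"  # cells the rover will not enter

def try_drive(terrain_map, position, move):
    """Return the new position, or stay put if the target cell is an obstacle."""
    target = (position[0] + move[0], position[1] + move[1])
    # Unknown cells are treated as obstacles, erring on the side of safety.
    if terrain_map.get(target, OBSTACLE) == OBSTACLE:
        print("Uh-uh, there's an obstacle there. I can't go that way.")
        return position
    return target

terrain = {(0, 0): "green", (0, 1): "green", (0, 2): "red"}
pos = (0, 0)
pos = try_drive(terrain, pos, (0, 1))  # allowed: moves to (0, 1)
pos = try_drive(terrain, pos, (0, 1))  # refused: (0, 2) is an obstacle
```

Even a bad command from the operator leaves the rover parked safely in place, which is the point of the design: the safety check lives on the robot, not in the operator's hands.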
FRANKLIN: Speaking of running into a rock, while we were at Roverscape, we found out about the 3D laser on top. Tell me a little bit about that.
MARIA: We use both the laser and stereo cameras the way we as humans see 3D with stereo eyes. That gives information about the shape of the terrain. Where are the lumps, bumps and slopes? We depict that in the virtual 3D view that she has in the GUI. We show the shape but we also color code it. If it’s nice and flat or a very small slope, it will be green. If it’s a little bit tricky, it might be yellow or orange. If the rover will not drive over it or it’s dangerous, it will be red. That’s how when she’s driving the robot around she can tell whether it’s safe to drive into a certain area or not.
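The green/yellow/orange/red coding Maria describes maps terrain slope to drivability. A minimal sketch, assuming invented slope thresholds (the real system derives terrain shape from the laser and stereo cameras, and its actual limits are not stated here):

```python
# Illustrative slope-to-color coding for the GUI's 3D terrain view.
# The degree thresholds below are assumptions for this sketch only.

def terrain_color(slope_degrees):
    """Color-code a terrain cell by how safely the rover can drive over it."""
    if slope_degrees < 5:
        return "green"   # nice and flat, or a very small slope
    elif slope_degrees < 12:
        return "yellow"  # a little bit tricky
    elif slope_degrees < 20:
        return "orange"  # trickier still
    else:
        return "red"     # dangerous; the rover will not drive over it

print(terrain_color(2))   # green
print(terrain_color(25))  # red
```

The operator never has to reason about raw slope numbers; she just steers away from red.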
FRANKLIN: Your Technology Demonstration is complete. Where to next?
MARIA: That’s actually a good question. For the next few months, we’re going to be analyzing our data. We asked the crew a lot of questions. We want to look through all of that and see the feedback they’ve given us. We will also want to understand some of the more underlying technologies. We’ve collected a lot of communication data between the robot and the Station. We’ve looked at button presses in the GUI; rover telemetry, as far as the rover’s health and how it performed. We’re going to be analyzing a lot of data over the next couple of months.
FRANKLIN: Maria, you’ve been up since 2 or 3 o’clock this morning.
FRANKLIN: The test is over. I think you deserve a vacation.
MARIA: I’m finally going to take one. One of the things about working with the ISS scheduling is you’re not absolutely sure when your sessions are going to be scheduled. I didn’t dare take vacation this summer because I didn’t know when we’d end up having the session. I’m definitely going to start taking vacation now.
FRANKLIN: So Blair, you’re a gamer, right?
BLAIR: Absolutely, without question.
FRANKLIN: You know what, Maria made this sound so easy and the user interface so intuitive that even you could pick it up. Because I know you never look at a manual when you open up a new game.
BLAIR: Yeah, but you see that argument is a flawed argument. I don’t have to read a manual because I’m so good. It’s not based on the simplicity of the game. It’s because I’m a natural gamer and I’m talented in that realm.
FRANKLIN: There’s a lot of swag coming from you right now.
CHRIS: You’ve got to realize there’s a big difference between controlling a K10 rover from ISS or a planetary system as opposed to playing the Atari 2600 eight hours a day.
FRANKLIN: You know what? I did see an old unit the last time I was over at your house.
BLAIR: It’s true but seriously I think it’s a good point. Part of the reason it has to be simple is, like we talked about earlier, they don’t have a lot of training or access. You don’t spend hours preparing to drive this rover. They want something they can plug and play.
FRANKLIN: The difference between operating a rover on the surface of Mars and working on the ISS is that the astronauts on the ISS are continually working on different projects. They just got handed this one the morning of. Karen opened it up and made it happen. If you’re going to train for a mission to Mars, you will definitely put some more time into it.
BLAIR: That’s a good point. Obviously, NASA does a lot of complex things. Not everything they do is simple. Does the reason for simplicity now have to do with the fact you’re developing a platform and the stakes are so high in terms of using ISS resources and putting these ground resources into play at a particular time?
FRANKLIN: I think it’s just keeping it simple for the sake of keeping it simple. Nobody wants to decipher the Da Vinci code while they’re orbiting Mars. Really. If you want to pick up and operate a rover that’s going to be on the surface of Mars make it work like a video game. Keep it simple.
CHRIS: If they can test the software and make it work successfully, you can apply it to a wide range of robotic vehicles, not just rovers.
BLAIR: Do you think for my investigation, I should be able to download this software or at the least put it through its paces on my Atari 2600, if you will?
CHRIS: Yeah, if it’s that basic, sure. We also had a chance, Franklin, to meet with the Ops side. We met with some engineers and researchers who were up with us at 4 o’clock in the morning.
CHRIS: Susan, are you tired?
SUSAN: Yeah, very tired.
CHRIS: You’re the engineering lead for the test. Tell us what happened early this morning at 4:00 a.m. when you started working through the test itself.
SUSAN: At 4:00 a.m., we start one rover at a time. We start up the controller, which is the software that runs the robot.
SUSAN: It’s dark outside so there are some things we can’t test. We can’t test the cameras and navigation but there are certain parts we can test, like making sure it can drive. We get the first rover all set up and taken outside, then we move to the second robot and run a couple of simple tests to make sure we’re connected to the rover and getting the data. Then, it’s listening to the loops, the space-to-ground loop, which is when the astronaut talks to mission control, and trying to anticipate or see any problems before maybe they even notice them, so we can have a fix for them as soon as possible.
CHRIS: How do you even get the signal from the International Space Station down to the K10 rover?
JAY: It goes to a lot of different places. From the ISS, it went to JSC, which is Mission Control. From there, we had a router that actually routed the message from the ISS to here, at Ames, to the K10. The message layer that it was using was something that we built here called RAPID, which allowed us to write our software, communicate, and let Karen control the robot here at Ames.
CHRIS: With all that, what kind of delays are we talking about in terms of Karen operating the rover to actually seeing the rover move?
JAY: Surprisingly, we were expecting quite a bit of a delay but it ranged between a second to maybe even less than that. It wasn’t that long. Although, we did have a video feed and we could see the delay, where we’d see the messages here and then they would appear on her workstation.
CHRIS: Basically, what you’re saying is it’s faster to send a signal down from the ISS than it is for me to get on the Internet to go to some of these websites?
JAY: Yes, definitely.
CHRIS: During the actual tests, there would be times where it would go maybe 4 or 5 feet and stop. Then you’d have a one to two minute period where it was stationary and then it would go again. What’s going on during that lag time between runs?
SUSAN: Usually, it’s taking a camera image. We have a downward-facing camera called an inspection camera. When we’re deploying the film, for example, we like to take pictures every meter or two, so the astronaut could tell us there’s a tear in the film or this doesn’t look right, it didn’t deploy properly. Other times, it’s taking a panoramic. It’s trying to get a bigger-picture view of did it lay out the film correctly? Are there interesting features in the view?
CHRIS: Besides the pictures, Karen can see that in real time with the cameras?
SUSAN: The hazard cameras or navigation cameras which help the robot see obstacles…
CHRIS: The two cameras at the very top?
SUSAN: Yeah, the eyes. So, she sees that feed in real time. There are also other cameras that are higher resolution and in color. Those images are only taken when we ask.
CHRIS: Is she taking those stills herself?
SUSAN: Yes. In the plans there are certain points called waypoints. At each waypoint, you might have a command to take an inspection image or a panoramic image. If all things go as planned, it will go to those stations and take those images. Sometimes it will take an image and you’ll realize this isn’t the image you wanted; it’s not pointing in the right direction. Then it’s Karen’s responsibility, as the astronaut, to maybe rotate the robot a little bit to take the image the ground requested.
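The plan structure Susan describes, waypoints that each carry optional imaging commands, can be sketched as a simple task sequence. The data layout and command names below are illustrative assumptions, not the actual uplinked plan format:

```python
# Hypothetical waypoint plan: drive to each station in order, then run any
# imaging commands attached to that waypoint. All names are invented here.

plan = [
    {"waypoint": (10.0, 4.0), "commands": ["inspection_image"]},
    {"waypoint": (12.5, 4.0), "commands": []},
    {"waypoint": (15.0, 6.0), "commands": ["panoramic_image"]},
]

def run_plan(plan):
    """Execute a plan as a sequence of drive and imaging steps; return the log."""
    log = []
    for step in plan:
        log.append(f"drive to {step['waypoint']}")
        for cmd in step["commands"]:
            log.append(f"execute {cmd}")
    return log

for entry in run_plan(plan):
    print(entry)
```

When a shot comes out wrong, the astronaut does not edit the plan; she nudges the rover's heading and re-triggers the image, which matches the DVR-style play/pause model Maria described earlier.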
CHRIS: What do you see as the next steps for this?
JAY: I’m hoping in future missions that we can not only control K10 but other robots that are developed by NASA or anyone else. Any robot that uses this framework will be able to be controlled with unified software. Hopefully we can get some future mission so that the astronaut can not only control K10 but also ATHLETE from JPL, maybe the MMSEV from JSC and some of the other robots from different centers. Not only can it be used in various ways for space exploration but maybe in the medical field for remote controlling of a robot. The doctor is in the United States and he has patients in other places; maybe even deep-sea diving. There are a lot of applications we can see for this in the future.
MAN: Take the turn, turn, turn, turn.
WOMAN: Go for the rock.
CHRIS: We thought for a second that Karen was doing some 4-wheeling.
SUSAN: Yes, the K10 can go over small rocks about half the size of the wheel. It can actually go over bigger but we don’t let it.
CHRIS: How did it go today from your perspective?
SUSAN: I think it went really well. We always expect there to be small problems but we developed a bunch of contingencies. If this happens, this is what we’re going to do. We’ve been preparing for a really long time and I think it paid off today.
CHRIS: All the engineers and technicians that worked that mission were pretty young.
FRANKLIN: Yeah, they’re of the gamer generation. They know how and what it is, what it feels like to work with software that is easy to use. I think that was one of the things they were trying to do with the astronauts in this TDM.
CHRIS: I think you need to keep in mind with that report is to look at the age of these engineers. They’re not 50- or 60-year-old folks. These are young people, just fresh out of college or 5 or 6 years out of college.
BLAIR: Chris does not like older…
BLAIR: No, it is an interesting point. I think part of that is just the emergence of the technology and the lifespan that it has. It seems like even though this demonstration is going to help NASA in the short run, you see tons of long-range applications for it.
BLAIR: I’m just wondering, one day am I going to be sitting there gaming, playing my particular console, and decide I want to take a flight around Mars, and I’ll control a drone or some kind of rover and do my own work as an average person?
CHRIS: Or better yet, you’re a student in a 5th grade class and today’s topic is robotics. Every kid is operating their own rover on another planet.
FRANKLIN: From their iPad.
CHRIS: From their iPad.
BLAIR: iPad? Are you kidding me? Those will be passé.
CHRIS: You’re watching NASA EDGE.
FRANKLIN: An inside and outside look…
BLAIR: At all things NASA, if they pass my investigation, that is. I don’t want you to lose the impact of this significant moment for me. Bear that in mind.
MR. T: Quit your jibber jabber.
CHRIS: What was that?
BLAIR: What was that?
FRANKLIN: Hold up. What was that?
CHRIS: It was my Mr. T in the pocket.
(c)2014 NASA | SCVTV