Game Changing Development Program Manager Steve Gaddis joins NASA EDGE in studio to discuss robotics, rovers, and perhaps an upgrade at co-host.
Game Changing Robotics
CHRIS: Welcome to NASA EDGE.
FRANKLIN: An inside and outside look at all things NASA.
CHRIS: We’re joined by Steve Gaddis, who’s the program manager for the Game Changing Development Program Office and our special co-host. How are you doing Steve?
STEVE: Doing great. Glad to be here.
CHRIS: Now, last time we had Steve on the show we were talking about nanotechnology, which was pretty cool.
FRANKLIN: Very cool.
CHRIS: What are we going to talk about today?
CHRIS: Oh, awesome.
STEVE: We love them, don’t we?
FRANKLIN: We really do love them. Tell us exactly what robot you have in the Game Changing portfolio.
STEVE: You’re probably most familiar with R2, the humanoid robot that we’ve got on Station. R5 is the next generation. Have you seen R5?
CHRIS & FRANKLIN: Yes.
CHRIS: It looks like Iron Man to me.
FRANKLIN: It really does.
STEVE: Iron Man, you better believe it. That’s what we were hoping for. We’ve also got a robotic arm and it’s being developed at Langley. It’s one of those long reach arms, so if you get close to an asteroid it can reach out and take a piece of it off, or attach and bring the vehicle to it. A lot of this work is being done at Johnson; some is being done at Ames. Speaking of Ames, have you heard of Astrobee?
CHRIS & FRANKLIN: No.
STEVE: What about SPHERES?
CHRIS: Oh yeah, we did a segment on SPHERES a couple of years ago.
STEVE: That’s right, you did. And so, Astrobee is the next generation of SPHERES. It’s like SPHERES on steroids. It’s going to do a lot of cool things to help the astronauts on the inside of the station and we’re hoping it can do some observation on the outside of the station. Oh hey, one moment please.
CHRIS: Sure. It sounds like you have a lot of different robotic vehicles within your portfolio, from humanoids to regular robots.
STEVE: Absolutely. We’re covering robots that are humanoid in fashion. We’re covering robots that operate spacecraft. We’re covering robotic rovers, even something like a vehicle, if you will. Thank you, B.
CHRIS: And I understand that you have another robotic vehicle called Resource Prospector?
STEVE: Absolutely. RP is going to be going to the moon and we’re developing the rover; we’re partnered with HEO for that activity. It’s going to take samples; it’s going to gather data. It’s been a long time since we’ve been on the surface of the moon. It’s very exciting.
CHRIS: Thank you for giving us the opportunity. Franklin and I had the chance to go down to Johnson Space Center and actually see those robotic vehicles in action.
STEVE: Yes sir.
CHRIS: We’re going to start out with the first one which is the Resource Prospector. I had a chance to talk to Bill Bluethman who’s the project manager for RP.
CHRIS: Hey Bill, we’re inside Building 9 at Johnson Space Center. There’s a lot of activity going on this morning. I see a lot of different robotic parts, which leads me to Human Robotics Systems. What’s that all about?
BILL: Well, so Human Robotics Systems, our goal is to build robots that help humans explore, and that can mean a lot of different things. It can be the kind of robots that do missions before astronauts that enable future exploration. It can be working shoulder to shoulder with astronauts during a mission, and it can be cleaning up after. It has a very broad set of areas where we can apply this work, but the ultimate goal is to make human exploration more effective.
CHRIS: I understand you have a number of activities within HRS. What do you have?
BILL: Right now we’ve got Rover technologies where we’re building a small rover to explore the poles of the moon.
CHRIS: I think that’s it right there, isn’t it?
BILL: Yeah, we’re testing it about 20 feet from us. We’re testing the suspension. The ultimate goal of that machine, of that mission, is to look for water. We’ve had recent missions, orbital missions and impact missions, that have shown that there is in fact water at the poles of the moon. The goal of this mission is really to touch it, process it, and understand just how much of it there is horizontally across the surface as well as what it looks like sub-surface.
CHRIS: So basically, for Resource Prospector, you’re taking the data from LCROSS and LADEE and those previous science missions, and now you’re going to apply it.
BILL: Yeah, it’s kind of the next step in really getting at the water that’s at the poles. This is an interesting project where, within Game Changing, we’re partnered with AES, the Advanced Exploration Systems program, and the rover team is really developing the technology. We’ve built a prototype this year. And once we’ve developed the technology, we’ll hand that work over to AES to really do the flight work.
CHRIS: Is it going to be that loud once it’s on the lunar surface?
BILL: No, on the moon it’ll be very quiet.
CHRIS: That’s true.
BILL: Without an atmosphere, those sounds will be vibrations in the Rover but you won’t be able to hear it audibly.
CHRIS: After we spoke with Bill, we had a chance to take the RP out to the rock yard and we talked to the design engineer, Mason Markee.
CHRIS: Mason, it looks like we have a new rover in the works, the Resource Prospector.
MASON: Yeah, that’s right.
CHRIS: What’s it all about?
MASON: Yeah, we call this the RP rover. This is basically a prototype of a robot we want to send up to the moon, to the North or South Pole. We want it to drive around on the moon’s surface and then go down into the craters that haven’t seen light in a billion years. And once we’re in those craters we want to look around, find an interesting spot where we think there might be some organic compounds, drill down, take a soil sample, bring that up into the robot, process that soil sample, and then see if we can make a few drops of water from all that. If you can make water when you’re on the moon, that proves a lot to us. That means you don’t have to bring everything next time you go to the moon. You don’t have to bring your water, and if you have hydrogen, you can make fuel there as well, which would be incredible.
CHRIS: What is your role with the Resource Prospector?
MASON: So, I was one of the engineers working on the rover side of it. This is a multifaceted project where we’ve got engineers at Ames, we’ve got engineers at JSC, KSC, and I worked on the rover side of it. I was doing the chassis design, basically making sure that everything is solid on there, holding all the wheel modules together, integrating all the payload components onto heat spreaders. We’ve got a radiator on top and a solar panel on the back, which are mock-ups right now but that was all part of the chassis structure that we put together.
CHRIS: Now, NASA’s built a lot of rovers in its days, and looking at the size of it, what’s the size comparison to the other rovers in the past?
MASON: This falls right in between the Spirit and Opportunity rovers and Curiosity, so it’s a blend right between them. It’s not the biggest rover we’ve ever made, but it’s definitely not the smallest.
CHRIS: What are some of the differences from the engineering side, looking at the chassis?
MASON: A lot of the rovers you see out there have six wheels. They’re a different setup because those front and back wheels can move in any direction. They’ve got four-wheel steering, but the middle wheels stay fixed; they don’t do any steering. On this vehicle we have four wheels, but all four of them can rotate 360 degrees. We can make this vehicle drive completely sideways. We took that from a lesson learned with the Spirit and Opportunity rovers, where they were getting stuck in sand and some tight spots. They were always able to get out; they’re really capable robots. But we thought we could do a little bit more by having each one of the tires be able to spin.
CHRIS: You’ve been around for about a year now with this project?
MASON: Yeah, that’s right.
CHRIS: What are some of the challenges that you’ve faced so far?
MASON: One of the big ones was trying to build a robot that worked here on Earth, so that we could do our performance testing here, but that fit the size envelope we wanted in order to go on a rocket and get to the moon. This robot represents the size of the real rover that we want to send to the moon. Of course, the weight of it isn’t quite accurate. We couldn’t build it six times lighter so that the forces on the wheels were the same, so we had to make this strong enough to stand up in Earth gravity but still fit in the size to go to the moon. We also wanted to tackle a lot of the aspects that you need to go to the moon, like working in a vacuum and working in extreme temperature differentials: really hot in the sun, really cold in the shadows. We started tackling some of those technologies and laying this out so that the thermal properties of this robot are very similar to the one that we want to send to the moon eventually.
CHRIS: Are you going to actually be able to test this rover in a 1/6 G environment like the moon?
MASON: Ultimately we want to put this on ARGOS, which is our gravity offload machine. Right now we have the parts coming out of the shop to mount this up to it. We’re going to see how this rover performs when it doesn’t have all the gravity on it. For us, with terramechanics, how it moves through soil is really important, because out here we’re not quite getting accurate testing; the wheel is sinking down into the ground more than it ever would on the moon because of that weight difference.
CHRIS: I noticed on the front half of the rover, we have the science instruments sticking up from the top there?
MASON: Yeah, we actually have two structures popping up there. We have the drill, which is made by Honeybee Robotics, that’s what we actually go down and take that soil sample with. And then we have a mast, and that mast can retract. It stows in the down position, which is how it would be for launch. It releases, goes to an upright position, locks in. On top of that mast we have cameras and lighting and there’s going to be an associated kind of 3-D mapping structured lighting on that. So that’s on the pan/tilt unit, and then on top of that we have the comm dish, which is on a pan/tilt unit, which we use to talk back to Earth.
CHRIS: Is the idea behind this rover very similar to say Curiosity, Spirit and Opportunity where you’re going to be sending it a set of commands for it to perform each day on the moon?
MASON: Yeah, absolutely. What’s different about this though is it’s only going to be a six-day mission on the moon.
CHRIS: Oh wow.
MASON: Yeah, we’re not going to be able to have the energy sources. It’s going to get too cold; we’re going to lose the sun so there’s a very small window of time that we can go up to the North and South pole. We’re going to be power packing as much into those six days as possible.
CHRIS: Now, I do have one question. I didn’t see a pickaxe on the rover. Did you forget that on purpose or do you think you’re going to put a pickaxe on there?
MASON: Ah, no. We don’t need a pickaxe.
CHRIS: The only reason why I’m asking that is Yukon Cornelius, who’s the greatest prospector in all the land…
MASON: Yeah, I know him well.
CHRIS: … that’s how he got his gold. I’m wondering, it looks like it has some gold color to it. It’s a prospector mission. You’d think it would have a pickaxe on it.
MASON: Maybe I’ll take that back to the board, you know, as a lesson learned or something. Yeah, need more weaponry.
CHRIS: That’s a good call.
CHRIS: You know, one of the great things about covering these missions is looking at the size of these rovers and the technology. This rover, as Mason was saying, is in between Spirit and Opportunity and Curiosity.
STEVE: Yeah, it’s a little bigger than the size of this table.
FRANKLIN: I think one of the interesting things about the mission is that it’s going to be mining for water in the polar regions.
STEVE: That’s what’s game changing about the mission.
CHRIS: High risk, high reward.
FRANKLIN: You talk about risk and reward; this has to be the fastest mission I think I’ve ever seen out of NASA, and it’s not going to actually operate that long on the lunar surface.
STEVE: Absolutely, Franklin. You hit a hot topic. We’re trying to do it as cheaply, and as fast, and as safely as we can.
CHRIS: And get all the data that you need.
STEVE: And get the data that we need. Speaking of high risk, high reward, remember we were talking about R2?
CHRIS: That’s right.
STEVE: Well B recently had an opportunity to go learn more about R2.
BLAIR: Vienny, the last time I saw R2, it was only a torso and it was in a lab in another building here at Johnson Space Center. Can you tell us where R2 is now? I mean, obviously you’ve made some improvements.
VIENNY: Since the last time you were here, we’ve sent an upper torso up to Space Station in order to start learning how to help astronauts perform tasks side by side. That was just the upper torso, and it was fixed to a stanchion; he could not move, was not mobile. We sent up a pair of legs a couple years ago, had them installed, and now he’s mobile.
BLAIR: And literally walking around station or grabbing?
VIENNY: Not yet. We are currently working on teaching him how to look for handrails and avoid obstacles, and that’s exactly what we’re doing here today in this mockup. We have some of the walls modeled from Space Station, as you can see here, and then handrails littered across this mockup, because that’s exactly how it looks on Station.
BLAIR: That brings up an interesting challenge. You’re down here on the ground, developing things for R2, and then you actually have to figure out how to teach R2 up on station to do that. How does that happen?
VIENNY: What’s great is that we have an Internet connection to Station, albeit a very slow internet connection, but we can push up software updates to the robot. As we do development here on the ground, and we polish off a new feature or tool, we can just send that up to the robot after it’s been safety tested and Robonaut on Station has new capability.
BLAIR: Do you have to work with the astronauts on something like the legs? It seems to me he can’t just put them on himself. Do you work in conjunction with the astronauts so they can help Robonaut with some of the new hardware additions?
VIENNY: A lot of tasks on station, we have to go through training with astronauts and basically teach them how to perform surgery on a robot.
BLAIR: Oh, exciting.
VIENNY: Yeah, it is. You get to play doctor for a day because you get to open up Robonaut, open up all of his covers and literally dive into his innards in order to make some of those hardware connections and add or replace hardware.
BLAIR: Vienny, looking at the setup of the ISS here, you’re obviously trying to simulate that environment. How do you simulate reduced gravity?
VIENNY: That’s actually a great question. This mockup is the ISS, but if you look at the gimbal that R2’s attached to, it’s actually attached to another robot. So we have a robot inside of a robot, and this robot is called ARGOS. It’s a gravity offloader system. What it does is account for gravity in the Z direction. We can at least remove that part of gravity and allow him to operate in a higher fidelity environment, close to what you would see on ISS.
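The gravity-offload idea Vienny describes boils down to a simple force calculation: support enough of the robot’s weight, along the vertical axis, that it "feels" only the target gravity. A quick illustrative sketch (not the actual ARGOS control code; the numbers are made up):

```python
# Illustrative sketch (not ARGOS flight code): a gravity offloader
# counteracts part of a robot's weight along the vertical (Z) axis
# so the robot behaves as if it were in a weaker gravity field.

G_EARTH = 9.81  # m/s^2

def offload_force(mass_kg, target_g=0.0):
    """Upward force (N) the offloader must apply so the robot
    'feels' only target_g of gravity. target_g=0 simulates the ISS."""
    return mass_kg * (G_EARTH - target_g)

# A hypothetical 150 kg robot simulated in zero g needs its full
# weight supported:
print(offload_force(150.0))            # full weight, ~1471.5 N
# Simulating lunar gravity (about 1/6 g) offloads 5/6 of the weight:
print(offload_force(150.0, G_EARTH / 6))
```

The same calculation applies whether the payload is Robonaut on the ISS mockup or the RP rover being tested in simulated lunar gravity.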
BLAIR: I see you have a demo set up here. What kind of things would R2 be doing in a demo here in the lab?
VIENNY: The demos that we’re trying to recreate are as realistic as possible, so what we envision for Robonaut is for him to be an astronaut assistant. One possible scenario is that an astronaut has spent all day performing tasks, performing experiments, making repairs on Station and he just could not get to something at the end of the day.
BLAIR: They’re great people but they do need sleep.
VIENNY: They do need sleep, and they have a really hard job up there because their day is scheduled very strictly and sometimes things don’t always go right. Something that R2 can do is at the end of the day while astronauts are sleeping, he can un-stow out of his little cubbyhole and go around and perhaps collect all the tools that the astronaut needs to perform tasks for the next day.
BLAIR: I want an R2 for my house. That’d be great for me.
VIENNY: Wouldn’t that be awesome?
BLAIR: That’d be great. That’s actually awesome for the astronauts; not only do they get the rest but it’s more efficient for the tasks that they’re doing.
BLAIR: Which is great. I see you have the classic phaser up there. Is it on stun? Is it on safe mode? How does that work?
VIENNY: That’s actually an RFID reader. We have a new system on Station where we tag everything with an RFID tag, and that helps us keep track of things. I don’t know if you’ve seen pictures of how much stuff is strapped down and flying around on Station, but this really helps the astronauts stay organized and locate things more quickly. That’s what that tool is used for: inventorying and finding out what is in a cubbyhole before they actually open it up.
FRANKLIN: Steve, we’ve really made some technological advances with R2 over the years; now we’ve added these um…
STEVE: Crazy legs.
FRANKLIN: Crazy legs.
STEVE: That’s what we like to call them. A technical term.
FRANKLIN: And now it’s operating on ARGOS to simulate 0 G. That stuff is mind-blowing.
STEVE: It is.
CHRIS: The cool thing is that you have this humanoid robot, R2, working with another robot, ARGOS. It’s sort of like a robot working with a robot.
STEVE: You know what’s even cooler? R5. B did such a good job with interviewing R2, I sent him over there to talk with R5.
CHRIS: What’s up with B?
BLAIR: Kris, I noticed that R5 looks very similar to R2, but is it more of a replacement program or more of a complementary robot?
KRIS: Actually, we call R5 Valkyrie. The robot’s name is Valkyrie. Valkyrie is specifically intended to be a robot that works on a planetary surface, either here on Earth or, in the future, on Mars or the moon or some other surface. The zero-G legs that you see on R2 are perfectly adapted for a spaceflight environment. The legs on Valkyrie are much better adapted for walking in a gravity field.
BLAIR: What are the challenges you have developing a robot for a gravity environment, especially since they’re more human-like?
KRIS: The biggest challenge going from R2 was we had to make everything lighter. The body of R2 with the legs is in the neighborhood of 500 or 600 pounds. We were trying to get Valkyrie down around 100 kg. We didn’t quite make it that low; right now the robot weighs in at about 300 lbs. We’re looking for ways to reduce that even further. That’s primarily because we’re running off of a battery. To run untethered off a battery, every pound you’re carrying means shorter battery life. It’s like a motorcycle or anything that needs gas mileage; you want it to be as light as possible.
BLAIR: How are the legs being developed? I am assuming they have to be pretty light as well.
KRIS: You notice the legs on Valkyrie don’t look a lot like the arms and legs on Robonaut. The Valkyrie legs are specifically adapted for walking. They’re also made to be humanoid. We make humanoid robots to work in a human environment. Most of the world is built so humans can use it. Instead of going back and rebuilding everything so robots can use it too, why don’t we build a robot that can use the same kind of interfaces a human can use? You end up with a humanoid robot pretty quickly. For the legs on Valkyrie, we pulled from a number of different projects, including our exoskeleton project, to reduce the form factor and reduce them in size so they would basically fit into the same kind of environments that a human would use.
BLAIR: How would creating human-like robots help us for space exploration?
KRIS: We’re going to build infrastructure on Mars, and we build that infrastructure so the astronauts can use it. My boss says we build robots for space, but you don’t send robots to a place you don’t want to go, and we all want to go to Mars. We’re going to send a robot there to make things ready for people, to help the people out and maybe participate in a caretaker role. If we have an asset or base on Mars, maybe it’s not staffed all the time. Maybe we need somebody to take care of it while people are flying back and forth. I think a robot like Valkyrie would be great for that, or Robonaut.
FRANKLIN: Steve, the light on R5’s chest, was that done to rope in kids to get them interested in the technology we have here at NASA?
STEVE: I can neither confirm nor deny that comment, but look at my eye. Am I winking? Yes. We were trying to reach that younger generation for all the right reasons. It’s like the Avengers movie, a big hit all over the world. Even my little three-year-old grandson is all about those characters.
CHRIS: Speaking of the younger generation, you mentioned earlier in the show that you have these Centennial Challenges where university students are going to be working with R5.
STEVE: Right. That is really cool. We’re developing two R5 units. We’re going to give those to universities to do upgrades, next-generation software for walking and manipulation. Through a solicitation, somebody is going to get one of those, and there will be a competition called the Space Robotics Challenge.
CHRIS: The university students potentially could improve R5.
STEVE: That’s correct.
CHRIS: And take that technology back to NASA.
STEVE: That’s what we’re hoping.
FRANKLIN: Steve, you’re handing over R5 to colleges and universities across the country. How about handing over the Modular Robotic Vehicle to me?
STEVE: Well Franklin, I’m afraid that might be a long line that you’ll have to get in, a lot of interested people in the MRV.
FRANKLIN: When we were at the Johnson Space Center, I had the chance to talk with Justin Ridley about the MRV. He actually took me out on a ride.
CHRIS: You lost your stomach, didn’t you?
FRANKLIN: Justin, we’re here in the Modular Robotic Vehicle or MRV. What makes this a game changing vehicle?
JUSTIN: This is an all-electric vehicle. It’s based on drive-by-wire technology. Where a typical vehicle would have mechanical linkages, this has an electrical system. For instance, with this steering wheel in a normal car, there would be a rod going down to a rack-and-pinion system, and that steering wheel would control the wheels of the car. Here, I’ve got sensors that are detecting the steering wheel angle, and there’s a computer in the back that’s telling these wheels which direction to turn. This is technology that has been used in aircraft for a number of years that we’re implementing in a terrestrial vehicle.
FRANKLIN: You talk about computers. I’m looking at the steering wheel here; there’s a little joystick. It feels like this is almost like an arcade game.
JUSTIN: Yeah. The steering wheel here was modeled after some high-end video game controller systems. Just like a video game player driving a racecar around would want to be able to feel the road. Right now, without a force feedback system, I wouldn’t feel the road; there’s no linkage between this and my wheels. We’ve got a system here that adds some resistance to the wheel as I go around corners. It centers the wheel as I come out of a corner. It gives me some feel of the road that I wouldn’t otherwise have. It is very video-game-like.
FRANKLIN: What about this joystick in the middle? What does this do?
JUSTIN: Each of the wheels is completely independent in steering. They can rotate on their own about +/- 180 degrees. That gives us some control that a typical car or truck wouldn’t have. This joystick allows me to yaw the vehicle while steering it around. It’s hard to describe, but it allows me to basically drift around corners and crab sideways or diagonally. I can drive in a bunch of different directions, and I can use this joystick to help me do that.
FRANKLIN: This car is awesome. Can we take it out for a spin?
JUSTIN: Absolutely. It’s a lot of fun to drive. Let’s do it.
[Sound of wheels moving]
FRANKLIN: This car can actually continually go in a circle in place?
JUSTIN: I’ve got about 180 degrees. I can’t continuously spin in place, but I can drive forward and get basically all the way around as I’m driving forward. We call that parade mode. When I’m in a parade and there are people over there, I can angle the vehicle toward them and wave, then angle it back and drive away.
FRANKLIN: What’s the top speed in the MRV?
JUSTIN: The vehicle is designed to go about 40 mph. It can probably go a bit faster than that but right now we’re still learning some of the systems so we limit the speed to 15 mph.
FRANKLIN: When I look at some of those Smart cars, they’re really small, like this size, maybe even a little smaller. Those cars are made to get into those parking spots and parallel park. You can actually pull up to a spot, turn your wheels, and just go on into it.
JUSTIN: That’s right. We tried to make this about Smart car size. It’s just a little bit longer than a Smart car, but if I go to a parking spot, I can go to a specific mode in the car, take my steering wheel to about 90 degrees, and slide parallel right into the spot.
FRANKLIN: You just gestured to the display on the dashboard. What is that used for?
JUSTIN: The display here gives me a bunch of information about the vehicle. A lot of it is what you would see in a typical vehicle, such as your speedometer and odometer giving your mileage. We’ve got some additional information, such as temperature in each of those wheel modules I mentioned before, only because the wheel modules rotate so much. I’ve got a data display that shows me the direction of each of those wheels. If I’m at my full 180-degree rotation, I can see where everything is pointed and what direction I’m going to go. I have a few different driving modes I can select from the panel here. I’ve got a regular two-wheel steer mode. We tried to develop the vehicle to feel as much like a traditional automobile as possible. The steering wheel, brake, and gas pedal all feel like a car. When you’re in two-wheel steer mode, hopefully you don’t notice too much of a difference from a real car. I’ve also got four-wheel steer mode. That allows me to control all four of the wheels together. In that mode, I can get full 360-degree turning, right on the center axis of the vehicle. Think really maneuverable steering. I also have omnidirectional mode. That’s where the joystick comes in handy. When I’m in omnidirectional mode, the steering wheel, when I rotate it, controls all four wheels together. If I rotate it to the left, all four wheels will turn to the left. The joystick here controls the yaw, or the direction, of the vehicle. If I rotate the joystick to the left, it’s going to yaw the vehicle to the left. I can combine these two inputs to get some of the crazier drifting maneuvers that you see the vehicle doing.
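The independent four-wheel steering Justin describes is what robotics often calls a swerve or crab drive: each wheel module gets its own steering angle and speed by combining the driver’s translation input with the joystick’s yaw input. A minimal kinematics sketch follows; the wheel positions, names, and numbers here are invented for illustration and are not the MRV’s actual geometry or software:

```python
# Illustrative swerve-drive kinematics: combine a desired body
# translation (vx, vy) with a yaw rate to get a steering angle and
# speed for each independently steered wheel module.
import math

# Hypothetical wheel positions (x, y) in meters from the vehicle center.
WHEELS = {
    "front_left":  ( 1.0,  0.6),
    "front_right": ( 1.0, -0.6),
    "rear_left":   (-1.0,  0.6),
    "rear_right":  (-1.0, -0.6),
}

def wheel_commands(vx, vy, yaw_rate):
    """vx, vy: desired body velocity (m/s); yaw_rate: rad/s from the
    joystick. Returns {wheel: (steer_angle_deg, speed_m_s)}."""
    cmds = {}
    for name, (x, y) in WHEELS.items():
        # Wheel velocity = body translation + rotation contribution
        # (angular velocity about Z crossed with the wheel position).
        wvx = vx - yaw_rate * y
        wvy = vy + yaw_rate * x
        cmds[name] = (math.degrees(math.atan2(wvy, wvx)),
                      math.hypot(wvx, wvy))
    return cmds

# Pure crab to the left: every wheel points 90 degrees at the same speed.
print(wheel_commands(0.0, 1.0, 0.0))
```

With yaw_rate at zero this gives the crabbing Justin demonstrates; adding a nonzero yaw_rate on top of a translation produces the drifting combinations he describes.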
FRANKLIN: Wow! Whoa, ho, ho!
JUSTIN: Oh yeah, you might want to keep that stuff.
FRANKLIN: How many hours do I have to spend in the simulator to drive this car before I can actually take it on the road?
JUSTIN: We did have a simulator to develop some of this advanced maneuvering capability. I didn’t spend any time in that simulator, so I just ran off the road quite a bit while I was learning how to drive it. What I described, when you’re controlling both the steering wheel and the joystick, that’s not exactly intuitive. It takes a little while to learn how to do that. You want to have some area where you can roll into the grass a little bit off the blacktop. After driving around there for about an hour or so, you can start to get the hang of it.
FRANKLIN: All around us there are rovers that are designed for the journey to Mars but this vehicle doesn’t look like any of those other rovers. How is this supposed to be used for future space travel and maneuverability on Mars?
JUSTIN: This vehicle wouldn’t be used for space travel. The vehicle was designed as an urban vehicle; New York City, getting in and out of parking spaces and traffic. Why is NASA building a car like that? A lot of those technologies I’ve talked about, this drive-by-wire technology, those are things we want to have in our vehicles for the moon, for Mars, for things we’re going to use to explore space. We can learn a lot about those technologies by developing a car like this. For instance, all that by-wire technology I mentioned could create some unsafe situations when you’re driving down the road. When I’m driving down the road in a typical car, the steering wheel is never going to become disconnected from the wheels of the car. That’s not going to happen. But here, there are a number of things that could happen. A wire could get cut. That computer could fail, a motor could die, a sensor could go bad. If any of those things happen, we have to have redundancies in place so a backup sensor can take over, another computer can reboot and basically control the vehicle instantaneously when any of those failures happen. You want the same thing on Mars. When you’re driving on Mars, you have to have redundant systems as well. That, in addition to some other technologies that we put in here, is something we can draw from the MRV to use for our next rovers, our robots, things like that.
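The redundancy pattern Justin describes, where a backup channel takes over the instant a primary fails, can be sketched in a few lines. All names here are hypothetical; this illustrates the general failover idea, not NASA’s actual vehicle software:

```python
# Illustrative redundancy sketch: if the primary steering-angle sensor
# fails, fall through to a backup channel immediately rather than
# leaving the wheels uncommanded.

class SensorFailure(Exception):
    """Raised when a sensor channel cannot produce a reading."""

def read_steering_angle(sensors):
    """Try each redundant sensor in priority order; return the first
    good reading, or raise if every channel has failed."""
    for sensor in sensors:
        try:
            return sensor()
        except SensorFailure:
            continue  # fall through to the next redundant channel
    raise SensorFailure("all steering sensors failed")

# Simulated channels: the primary is dead, the backup reads 12.5 degrees.
def primary():
    raise SensorFailure("wire cut")

def backup():
    return 12.5

print(read_steering_angle([primary, backup]))  # 12.5
```

Real by-wire systems layer this idea across sensors, computers, and actuators, often with voting among channels rather than simple priority order, but the failover principle is the same.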
CHRIS: That Modular Robotic Vehicle, that is awesome.
CHRIS: What was the riding experience like?
FRANKLIN: It was great. I actually felt like I was at an amusement park on a new ride going backwards and forwards, but I was actually riding in a car. It’s the type of experience you’ll never forget.
CHRIS: Would you drive it to work everyday?
FRANKLIN: Absolutely. I would park in the Smart car, the energy efficient spots.
CHRIS: You could probably put that in a motorcycle spot.
CHRIS: Correct me if I’m wrong. Justin said that vehicle is not designed for a Martian environment or a lunar environment. It’s more designed for an urban environment, but you could use some of the technologies in that vehicle in the future.
STEVE: Absolutely. Our plans are to use it as a test bed. We’ll put a high-energy battery in there. We’ll put a regenerative fuel cell in there. We’ll put some new steering apparatus in there, maybe some compact tight linkages and some communication and navigation.
FRANKLIN: How about a door and some air bags?
STEVE: If you get to drive it, we might consider that.
FRANKLIN: Speaking of driving it, I want to drive it. The next time we go to JSC, can you make sure I get an opportunity?
STEVE: We can make sure.
FRANKLIN: I appreciate it.
STEVE: I’ve got connections.
CHRIS: What he really wants to do is put that MRV in the next Fast and Furious movie.
FRANKLIN: That would be…
STEVE: Already in the works.
FRANKLIN: Ah, yeah.
STEVE: You got that right.
STEVE: And I’m going to be Dom’s sidekick.
CHRIS: That would be game changing.
STEVE: It would be.
CHRIS: We’ve got one beef with you before we wrap up the show. Why are you calling Blair “B”?
FRANKLIN: What’s that all about?
STEVE: Blair? It’s B1138. It’s our newest robotic prototype.
CHRIS: Get out of here.
CHRIS: I mean, anybody who uses Blair as a robot, that’s high risk, no reward.
STEVE: Well, you got me on that one. Yeah, yeah. Let me show you.
FRANKLIN: What in the world is going on here?
CHRIS: Wow. Seriously? If that’s a robotic Blair, then where’s the real Blair?
ENGINEER: No cognitive response at all. Really bad AI.
ENGINEER: Recycle what you can and discard the rest.
BLAIR: Recycle? This is all 100% usable. What do you… I’m not a robot. I’m a human being. I’m fully cognitive. I got an 800 on my SAT… at least one of them. Um, I can take it again. I’m human! I’m a human being!
(c)2015 NASA | SCVTV