
Inside the Human-Robotics Systems Project 

Terry Fong, director of NASA's Intelligent Robotics Group and project manager of NASA Human Exploration Telerobotics, explains NASA's Human-Robotic Systems project, which includes a series of four planetary K10 rovers, each with a different task.

When you ask an ordinary person about the role robotics will play in the 21st century, they will most likely mention robots that drive cars, mow lawns, or perhaps even conduct military operations. When it comes down to it, the point of a robot in any role is to take over or assist with tasks that humans would rather not, or cannot, do alone.

And when you talk about space, using robots makes a lot of sense: they don't have to go through extensive training, don't need an atmosphere to breathe, and don't have to adjust to microgravity.

NASA's Human-Robotic Systems project is designed with this goal in mind, although there is still quite a bit of work to be done to help human operators control the four planetary K10 rovers, each of which will be given unique tasks.

Terry Fong, director of the Intelligent Robotics Group and project manager of NASA Human Exploration Telerobotics, noted that in this project, collaboration between humans and robots will need to be handled carefully by a team that monitors physical interactions between the two with collision-detection algorithms and sensors. Because rover activities and the transfer of information are critical to each K10's mission, Fong said that human-robot interaction (HRI) methods would need to be employed.
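The article doesn't detail the monitoring algorithms, but a minimal sketch of proximity-based collision checking, of the kind such a safety monitor might run, could look like the following. The thresholds, function names, and sensor layout here are illustrative assumptions, not K10 internals.

```python
# Hypothetical sketch of a proximity-based safety check for an HRI monitor.
# Thresholds and names are illustrative, not taken from the K10's software.

STOP_DISTANCE_M = 1.0   # halt if a person or object is closer than this
SLOW_DISTANCE_M = 3.0   # reduce speed inside this radius

def safety_command(range_readings_m: list[float]) -> str:
    """Map the closest obstacle range to a motion command."""
    closest = min(range_readings_m)
    if closest < STOP_DISTANCE_M:
        return "STOP"
    if closest < SLOW_DISTANCE_M:
        return "SLOW"
    return "PROCEED"

# Example: readings from a ring of proximity sensors, in meters.
print(safety_command([5.2, 2.7, 8.0, 4.1]))  # -> "SLOW"
```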

One of the K10 rovers is fitted with 802.11 Wi-Fi communications, which Fong said is best suited for Earth-based tests. When a larger group of robots is being used in space, however, higher-level communication middleware (such as the Data Distribution Service) becomes necessary.

Fong stated, “Getting four complex robots with very different designs to use a common data system was challenging. The Data Distribution Service for Real-Time Systems [DDS] standard supports very flexible service parameters. We found that we could adapt the middleware to the unique needs of each robotic system.”
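DDS is a standard with many implementations, so rather than reproduce a particular vendor's API, here is a hypothetical toy sketch of the idea Fong describes: one common data system whose per-topic quality-of-service (QoS) parameters are tuned to each robot. The profile fields and the publish function are illustrative inventions, not the actual DDS interface used on the K10s.

```python
# Conceptual sketch only: DDS lets each participant tune quality-of-service
# (QoS) settings per topic. This toy profile and publisher illustrate how
# one data system can be adapted to each robot; none of it is real DDS API.

from dataclasses import dataclass

@dataclass
class QosProfile:
    reliable: bool        # guaranteed vs. best-effort delivery
    history_depth: int    # how many samples to keep for late joiners
    deadline_ms: int      # maximum expected gap between samples

# Each rover publishes telemetry with QoS matched to its link and task.
ROVER_QOS = {
    "k10_black": QosProfile(reliable=True,  history_depth=10, deadline_ms=500),
    "k10_red":   QosProfile(reliable=False, history_depth=1,  deadline_ms=100),
}

def publish_pose(rover: str, pose: tuple[float, float, float]) -> None:
    qos = ROVER_QOS[rover]
    mode = "reliable" if qos.reliable else "best-effort"
    print(f"[{rover}] pose={pose} sent {mode}, deadline={qos.deadline_ms} ms")

publish_pose("k10_black", (12.4, -3.1, 0.7))
```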

In the interview, the question came up of whether the K10 could “right” itself if it happened to fall over. Fong clarified that the rover would be kept away from steep areas and would not need to “right” itself. He went on to say that the K10 has all-wheel drive as well as all-wheel steering, with the ability to even out ground pressure.
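The article doesn't give the steering geometry, but the benefit of all-wheel steering can be sketched with basic kinematics: point every wheel's axle at a shared turning center so the chassis turns without scrubbing its wheels. The wheelbase and track values below are made-up placeholders, not K10 dimensions.

```python
# Hedged sketch of all-wheel-steering geometry: each wheel is steered so
# its axle points at a common turning center. Dimensions are illustrative.

import math

WHEELBASE_M = 0.85   # front-to-rear axle distance (placeholder)
TRACK_M = 0.70       # left-to-right wheel spacing (placeholder)

def wheel_angles(turn_radius_m: float) -> dict[str, float]:
    """Steering angle in degrees per wheel (positive = counterclockwise)."""
    half_l, half_t = WHEELBASE_M / 2, TRACK_M / 2
    angles = {}
    for name, (x, y) in {
        "front_left":  ( half_l,  half_t),
        "front_right": ( half_l, -half_t),
        "rear_left":   (-half_l,  half_t),
        "rear_right":  (-half_l, -half_t),
    }.items():
        # Turning center sits at (0, turn_radius) in the body frame;
        # the wheel rolls perpendicular to its radius line.
        angles[name] = math.degrees(math.atan2(x, turn_radius_m - y))
    return angles

for wheel, deg in wheel_angles(3.0).items():
    print(f"{wheel}: {deg:+.1f} deg")   # rear wheels steer opposite the front
```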

Since the K10 will need to sense the world in which it travels, its software includes maps to assist with navigation. It also uses sensors such as LIDAR, a remote-sensing method that fires pulsed laser light and measures the reflections to determine range to the surrounding terrain. The light pulses, combined with other data, generate precise 3D information about the surrounding environment. On top of this, the rover carries a GPS receiver, a digital compass, a sun tracker, and an inertial measurement unit, all used for navigation.
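The core LIDAR calculation is simple enough to show directly: range is half the pulse's round-trip time multiplied by the speed of light, and the beam's firing angles turn that range into a 3D point. This is a generic time-of-flight sketch, not the K10's actual processing pipeline.

```python
# Minimal time-of-flight LIDAR sketch: one pulse return becomes one
# 3D point in the sensor frame. Values are illustrative.

import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def pulse_to_point(round_trip_s: float, azimuth_deg: float,
                   elevation_deg: float) -> tuple[float, float, float]:
    """Convert one pulse return into an (x, y, z) point."""
    r = SPEED_OF_LIGHT * round_trip_s / 2.0   # range is half the round trip
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A pulse returning after ~66.7 ns corresponds to a target ~10 m away.
print(pulse_to_point(66.7e-9, azimuth_deg=30.0, elevation_deg=-5.0))
```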

The K10 runs embedded software on a dual-core Linux laptop and uses 20 laptop lithium-ion batteries to power its four-wheel drive, all-wheel steering, and sensors.
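The article gives the battery count but not capacities, so any energy figure is a guess; still, a back-of-the-envelope calculation shows the scale involved. Both the per-battery capacity and the average power draw below are assumptions, not K10 specifications.

```python
# Back-of-the-envelope energy math for a 20-battery pack. Typical laptop
# packs run roughly 50-90 Wh; both figures below are assumptions.

NUM_BATTERIES = 20
ASSUMED_WH_PER_BATTERY = 60.0   # assumption, not a K10 spec

total_wh = NUM_BATTERIES * ASSUMED_WH_PER_BATTERY
avg_draw_w = 200.0              # assumed draw: drive motors + sensors + laptop

print(f"Pack energy: ~{total_wh:.0f} Wh")
print(f"Runtime at {avg_draw_w:.0f} W: ~{total_wh / avg_draw_w:.1f} h")
```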

NASA has run several field tests on the K10, including the Haughton Crater field test on Devon Island, Canada.

Full story at EDN Network.
