
Deep Learning that Keeps Learning – Near-Time Training at the Edge 

Deep learning is still a stick-figure emulation of human learning. DL can be trained up to have a supercharged IQ based on an astounding amount of information, enabling it to execute some tasks beyond human capability. But deep learning has a learning disability.

People learn continually (it’s hoped) – every day, in real time. But deep learning typically relies on backpropagation training, which has a major flaw. Think of a college graduate who enters the workforce with a fount of knowledge, but whose knowledge bank is static. The only way the graduate can learn something new is to start college over from the beginning, re-learning everything previously known while incorporating the new knowledge.

In AI circles, this is called “catastrophic forgetting.” Here’s how researchers at Cornell University explain it: “Once a network is trained to do a specific task, e.g., bird classification, it cannot easily be trained to do new tasks, e.g., incrementally learning to recognize additional bird species... When new tasks are added, typical deep neural networks are prone to catastrophically forgetting previous tasks. Networks that are capable of assimilating new information incrementally, much like how humans form new memories over time, will be more efficient than re-training the model from scratch each time a new task needs to be learned.”
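To make the failure mode concrete, here is a minimal, hypothetical sketch in PyTorch (not drawn from Neurala or the Cornell work): a small classifier is trained on one synthetic task, then fine-tuned on a second task with ordinary backpropagation, and its accuracy on the first task typically falls toward chance because the weights that encoded it are overwritten. The tasks, network and numbers are illustrative assumptions.

```python
# Minimal sketch of catastrophic forgetting on synthetic data (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(offset):
    # Each task is a 2D binary classification problem; the offset shifts the
    # input distribution so task A and task B need different decision boundaries.
    x = torch.randn(1000, 2) + offset
    y = (x[:, 0] + x[:, 1] > 2 * offset).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))

xa, ya = make_task(offset=0.0)   # task A
xb, yb = make_task(offset=4.0)   # task B

train(model, xa, ya)
print("task A accuracy after training A:", accuracy(model, xa, ya))

train(model, xb, yb)             # sequential training on B, no replay of task A data
print("task A accuracy after training B:", accuracy(model, xa, ya))  # typically collapses
print("task B accuracy after training B:", accuracy(model, xb, yb))
```

Continual-learning approaches, of which the lifelong DNN software discussed below is one commercial example, aim to avoid exactly this collapse without replaying the original training set.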

With grants from NASA and DARPA, deep learning specialist Neurala began working on catastrophic forgetting earlier this decade. Two years ago, Neurala announced a $17M round of Series A venture funding to take its software into the commercial sphere, and now the Boston-based company may be emerging as one of the more interesting start-ups on the AI landscape. Last year, the company won an Edison Award and was named a top 100 AI company by Fortune Magazine.

Founder and CEO Max Versace told us that the company’s Brain Builder SaaS platform, designed to enable systems such as robots and drones to learn in near-time at the edge, uses a bio-inspired approach to mimic the way the human brain learns and analyzes its environment. In a nutshell, the idea is to enable not just inference but also training at the edge. At the heart of the company’s Lifelong Deep Learning Neural Network software is a proprietary algorithm that, according to Versace, allows devices to learn in 20 seconds what would take a traditional DNN 15 hours of training to converge on.

In an interview with SearchCIO, Neurala COO Heather Ames explained the algorithm this way:

Badger robot with Neurala deep learning (source: Neurala)

“Let's say you trained up a DNN on people and dogs,” she said. “You port it onto your cellphone and you're walking around with your cellphone looking at people and dogs and taking video around you. It does a pretty good job of catching most people, but when it comes to dogs, it misses black labs because, perhaps, the data set you used to train it didn't include any black labs.

“But there are black labs everywhere where you are. So what a lifelong DNN allows you to do is to train on that black lab right then and there. You aim the device at the dog, put that into your video feed and tell the system, ‘Hey this is also a dog.’ You have a [user interface] to outline the dog, label it, and now your system can automatically update that new information and add it to its understanding of dogs.”
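Neurala’s L-DNN algorithm itself is proprietary, but the scenario Ames describes, adding one labeled example of “dog” on the device without retraining from scratch, can be sketched with a generic incremental-learning pattern that also illustrates the “training at the edge” idea: a frozen feature extractor feeding a lightweight prototype (nearest-class-mean) classifier whose class representations update instantly from a single labeled frame. Everything below (the toy backbone, the random stand-in images, the class names) is an illustrative assumption, not Neurala’s implementation.

```python
# Illustrative sketch: a frozen feature extractor plus a prototype classifier that
# absorbs a new labeled example (e.g., the black lab frame) on the device, with no
# backpropagation or retraining. Generic pattern, not Neurala's proprietary L-DNN.
import torch
import torch.nn as nn

class PrototypeClassifier:
    def __init__(self):
        self.sums = {}    # class name -> running sum of feature vectors
        self.counts = {}  # class name -> number of examples seen

    def add_example(self, label, features):
        # Incremental update: fold one labeled feature vector into the class mean.
        if label not in self.sums:
            self.sums[label] = torch.zeros_like(features)
            self.counts[label] = 0
        self.sums[label] += features
        self.counts[label] += 1

    def predict(self, features):
        # Nearest class mean in feature space.
        best, best_dist = None, float("inf")
        for label, total in self.sums.items():
            mean = total / self.counts[label]
            dist = torch.dist(features, mean).item()
            if dist < best_dist:
                best, best_dist = label, dist
        return best

# Stand-in for a pretrained, frozen feature extractor (weights untrained here).
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
backbone.eval()

def embed(image):
    with torch.no_grad():
        return backbone(image.unsqueeze(0)).squeeze(0)

head = PrototypeClassifier()

# Initial knowledge: a few labeled frames of people and dogs (random tensors here).
for _ in range(5):
    head.add_example("person", embed(torch.rand(3, 64, 64)))
    head.add_example("dog", embed(torch.rand(3, 64, 64)))

# In the field: the user outlines the black lab in the video feed and labels it.
black_lab_frame = torch.rand(3, 64, 64)   # placeholder for the cropped frame
head.add_example("dog", embed(black_lab_frame))

# With a real pretrained backbone and real images, this now resolves to "dog".
print(head.predict(embed(black_lab_frame)))
```

Because only per-class statistics change, the update happens in moments on the device itself, which is the property that makes training at the edge practical in a sketch like this one.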

The algorithm, Versace said, overcomes the restrictions of backpropagation. “When you transport this to enterprise AI applications, you find that the paradox is really damaging,” he said. “If you can only train before deployment, what about all the other things the system could learn after you deploy? What if something changes in the environment, what if you introduce a new product, what if the lighting conditions in your factory change...? When you look at this, it’s amazing we’re still stuck in this old era.”

Badger Technologies, acquired in 2017 and operating as a product division of $22B manufacturing services company Jabil, has incorporated Neurala technology into autonomous robots for the retail industry. Earlier this year, CEO Tim Rowland told us, Badger rolled out nearly 500 of its robots to Giant/Martin’s and Stop & Shop grocery stores on the East Coast. Rowland said it is one of the largest autonomous robot implementations in the grocery retail industry, and that the robots will be used to “automate hazard detection and improve in-store operations while elevating audits and compliance reporting.”

This means the Badger robot, in its initial version, will move through aisles collecting more than 1 million images per day, looking for things like broken glass bottles or spilled containers that could pose safety hazards to customers. Upcoming Badger robots will also handle tasks like store shelf inventory inspection, price checking and identifying products whose sell-by dates have expired – in short, helping bring order to what Rowland called “the chaos of retail.”

“Neurala allows us to learn from those pictures: a clean floor from a dirty floor (or a suspect dirty floor), evaluated by their Brain,” Rowland said. “It’s been trained with many images to evaluate an aisle that is free and clear, or if something’s there that needs to be routed to someone for secondary assessment – something fallen off a shelf, something spilled, leaked from a cooler, or whatever.

“The beauty of it is not only the fast initial training and getting that implemented, which allows us to be much faster and more accurate and consistent,” he said, “but over time the training, as we get more and more images, it gets better and better at identifying, the accuracy continues to rise.” Ultimately, the retailer will rely on the robot to identify problems that need to be addressed, rather than on secondary auditing by a human.

He added that the robot runs on its own: at the time it’s programmed to make its rounds, it unplugs itself from its charger, inspects the store floor, returns to its charging station and plugs itself back in.

Rowland said Neurala’s “adaptive learning” capabilities made the company stand out as Badger evaluated deep learning software for its robots.

“As you introduce a new environment to the robot, say you move it to a different grocer that has a different color scheme on the floor, different gloss or different lighting, you have to re-train that machine,” Rowland said. That means exposing it to 4 to 5 million sets of images, a slow and compute-intensive process. “You go through all the same steps, that’s where the common group of Neurala’s competitors are. Everything is brute force and done from the start… But Neurala is on the path to doing that dynamically.”

As for inspection accuracy, Rowland said the robots have proven to be “at least as good” as humans and are improving with time. “At this point I’m believing the machine is starting to surpass the human capability. Our auditing is about 95 to 98 percent accurate with a human; we’ve been able to match that and testing shows exceeding that.” Furthermore, Rowland said, it will remove a repetitious task from store employees, freeing them up for more productive work. “It’s very tedious to constantly be viewing these images.”
