Rolling Out the Advanced Scale Computing Crystal Ball for 2018 

We all know where we are and where we’re headed. We’re early into the AI Era, and it has an end-game aura about it. Not in an apocalyptic, “end-of-days” way, but in the sense that AI (broadly defined) is the natural culmination of everything we’ve always wanted computers to do: analyze massive data volumes in virtually real time, producing richer, predictive results from which our systems learn and become increasingly insightful. And then there’s AI-guided robotics for performing tasks that waste our time (driving) or make work an exercise in drudgery (checking forms, doing inventory… we could go on).

Who wouldn’t want machines to do all that? What else could we possibly ask of them?

Like the gods of Greek mythology, the AI Era is characterized by virtually limitless impacts and massive contradictions: the power to create and to destroy (jobs), to do good and bad (cure and kill), to enlighten (find new insight) and confuse (opaque decision making).

An interesting contradiction: AI of the future challenges our credulity, yet we quickly become blasé about what it can do now. My phone, prompted by a voice command, can tell me last night’s Red Sox score? My car can tell me I’m about to back into a fire hydrant? Please, that is so 2012. What else you got? As Australian roboticist Rodney Brooks has said, "Every time we figure out a piece of it, it stops being magical."

For all its prominence, AI is easily overlooked. Stuart Frankel, CEO and co-founder of Narrative Science, sees this as a permanent quality of AI. “The term ‘Artificial Intelligence’ will become obsolete as AI becomes invisible,” he said. Yet this invisibility will need to be counteracted as our reliance on AI in critical roles grows. Increasingly, he said, “Regulations around AI will be instituted, requiring transparency into AI systems’ decision-making.”

Another AI contradiction: there’s fear AI will eliminate jobs, but many (though not all) of those jobs are hard to fill: low pay, hatefully tedious, with high turnover and chronic absenteeism. In the same vein, AI is a job killer and creator. Gartner predicts that in 2020, AI will generate 2.3 million jobs while eliminating 1.8 million, a net gain of 500,000 (on the other hand, the AI Impacts think tank predicts AI will automate all work within 120 years).

Another contradiction: AI is always, simultaneously, at an advanced (compared to where it was) and primitive (compared to where it’s going) phase. AI empowers the individual by helping us work smarter and live better; it also empowers the few megaliths (FANG companies) with the data, compute and technical expertise to do real AI (though important work is being done to broaden AI adoption). AI getting smarter means people, relative to AI, are getting dumber. And so forth.

All of which is preamble to a prospective look at 2018, which, it’s safe to say, will be AI-dominant. In accordance with Gartner’s hype cycle, we’re still in the anticipation phase, but the trough of disillusionment will arrive, along with increasing focal points of accomplishment and (unlike the classic model) no plateau of productivity, just a perpetual up-slope of achievement. AI will have no end.

Here’s a sampling of the more interesting 2018 prognostications we’ve received.

The gulf between AI talk and action is taken up by Akshay Sabhikhi, CEO/co-founder of CognitiveScale, who asserts that AI will not be commoditized any time soon.

“Only one in 20 companies has extensively incorporated AI in offerings or processes,” he said. “Less than 39 percent of all companies have an AI strategy in place. According to MIT Sloan Review, the largest companies — those with at least 100,000 employees — are the most likely to have an AI strategy, but only half have one. Despite claims that AI is already being subsumed into an array of applications, we’re not there yet and won’t be in 2018. It is still the early days of adoption, and those companies that are implementing AI now will see the biggest competitive value.”

On the funding front, Sabhikhi foresees increasing money spent on taking AI from experimentation into actual work.

“In 2018, we will see budgets for AI shifting from innovation to operations as more companies realize the transformative benefits of moving AI out of the lab and into practical operations within their organizations,” Sabhikhi said. “Because there will be this shift, chief data/technology officers will serve a more important role within their organization as they take experimental AI and make it ‘real’ business.”

Addressing the AI (and machine learning) implementation challenge, said MapR Chief Application Architect Ted Dunning, will require recognition that “90 percent of machine learning success is in the logistics (rather than the algorithm or the model).”

“It may sound less exciting or cool, but being able to effectively manage data is essential to running successful machine learning systems in the real world,” Dunning said. “This is true for the complete life cycle: from managing input data, to the development of machine learning models, to their ongoing maintenance in production. The good news is that with effective architecture and good planning, much of this can be handled at the platform level rather than the application level, and that cuts across many systems handled by different machine learning tools. In other words, you don’t have to come up with a new plan for logistics with every different project.”
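
Dunning’s point is architectural, and a rough sketch helps make it concrete. The snippet below (all class and method names are hypothetical, not any vendor’s API) factors common logistics, such as data snapshotting and model registration, into a shared platform layer that individual projects reuse rather than reinvent.

```python
# Minimal sketch, hypothetical names: factor ML "logistics" (immutable data
# snapshots, model registration) into one platform layer that every project
# reuses instead of drawing up a new plan per application.
import json
import time
from pathlib import Path


class ModelPlatform:
    """Shared logistics layer reused across projects and ML tools."""

    def __init__(self, root: str):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def ingest(self, dataset: str, records: list) -> Path:
        """Write an immutable, timestamped snapshot of the input data."""
        snapshot = self.root / f"{dataset}-{int(time.time())}.json"
        snapshot.write_text(json.dumps(records))
        return snapshot

    def register_model(self, name: str, version: str, metrics: dict) -> None:
        """Record which model version is current and how it scored."""
        registry = self.root / "registry.json"
        entries = json.loads(registry.read_text()) if registry.exists() else {}
        entries[name] = {"version": version, "metrics": metrics}
        registry.write_text(json.dumps(entries, indent=2))


# Every project plugs into the same plumbing; none invents its own logistics.
platform = ModelPlatform("/tmp/ml-platform")
platform.ingest("clickstream", [{"user": 1, "clicked": True}])
platform.register_model("ctr-model", "v2", {"auc": 0.81})
```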

Sabhikhi predicts that an operational area where AI will become particularly useful is compliance. As has been declared in this space, regulatory compliance on Wall Street and among big banks is a major cost center. “Companies in many industries, particularly financial services, must follow government and industry regulations,” he said. “As a means to ensure compliance while simultaneously reducing the effort that’s involved, we will see a new interest in machine readable policies. AI will automate the labor-intensive process associated with compliance, freeing humans to focus on business-building efforts instead.”
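
What a “machine readable policy” might look like is easier to show than to describe. In the minimal, entirely hypothetical sketch below, the policy is plain data, so a program, rather than a person, checks each transaction against it. Thresholds and field names are invented for illustration.

```python
# Minimal sketch (all rules and field names hypothetical): a machine-readable
# policy expressed as data, so compliance checks run automatically instead of
# being reviewed by hand.
POLICY = {
    "max_cash_transaction_usd": 10_000,    # e.g., a reporting threshold
    "restricted_countries": {"XX", "YY"},  # placeholder country codes
}


def check_transaction(txn: dict) -> list[str]:
    """Return the list of policy violations for one transaction."""
    violations = []
    if txn["amount_usd"] > POLICY["max_cash_transaction_usd"]:
        violations.append("amount exceeds reporting threshold")
    if txn["country"] in POLICY["restricted_countries"]:
        violations.append("counterparty in restricted jurisdiction")
    return violations


print(check_transaction({"amount_usd": 25_000, "country": "XX"}))
# ['amount exceeds reporting threshold',
#  'counterparty in restricted jurisdiction']
```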

In AI, semantics matters. This is because, as has been widely noted, marketers attach “AI” to many forms of automation, regardless of whether it rises to the level of AI. In this sense, Sabhikhi said there’s an important distinction between AI and robotic process automation (RPA).

“RPA is an emerging form of clerical process automation technology based on the notion of software robots,” he said. “However, RPA has very little ability to actually learn, which will keep it focused on mundane, rules-driven, repetitive tasks. While serving this need will keep RPA growing in popularity through 2018, true AI, driven by machine learning, will produce the greatest ROI. These two technologies will be mentioned together often in 2018, but for ROI, AI will be seen as the clear winner.”
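
The distinction is easy to see in miniature. In the hypothetical sketch below, the RPA-style step is a hard-coded rule that never improves, while the learning-based router picks up word associations from labeled examples and handles cases the rule was never written for.

```python
# Hypothetical ticket-routing task: a fixed rule (RPA-style) versus a
# component that learns from labeled examples (ML-style).
from collections import Counter


def rpa_route(subject: str) -> str:
    """RPA-style: a hard-coded rule that never improves."""
    return "billing" if "invoice" in subject.lower() else "general"


def train_router(examples: list[tuple[str, str]]):
    """ML-style: learn word-to-queue associations from labeled examples."""
    counts: dict[str, Counter] = {}
    for subject, queue in examples:
        for word in subject.lower().split():
            counts.setdefault(word, Counter())[queue] += 1

    def route(subject: str) -> str:
        votes = Counter()
        for word in subject.lower().split():
            votes.update(counts.get(word, Counter()))
        return votes.most_common(1)[0][0] if votes else "general"

    return route


route = train_router([
    ("overdue invoice attached", "billing"),
    ("refund for invoice 42", "billing"),
    ("password reset help", "support"),
])
print(rpa_route("refund request for order 42"))  # "general": the rule misses it
print(route("refund request for order 42"))      # "billing": learned from data
```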

We’ve written about AI automation of data science tasks, and associated with this is the coming AI-driven melding of data science and software development, according to Sabhikhi. “There will be an expectation for software developers to have basic data science skill sets. Data science and software development within an enterprise remained very separate until the adoption of AI. As AI moves away from experimentation and into operation, developers with these skills will be in high demand. Universities and developer training programs will adapt their curriculums to foster these skills.”

The AI-related phenomena of IoT and edge processing, and the resulting hyper connectivity, are viewed by Jake Freivald, vice president at Information Builders, as trends to watch during the coming year.

“In 2018, we’ll see the effects of a hyper-connected universe as IoT becomes ubiquitous,” he said. “Connected refrigerators and personal electronics, such as Alexa, are already mainstream as sensors become less expensive. Now consider how organizations are adopting IoT – traffic cameras that register everything from license plate images to pollution levels and even traffic speed, wearables that capture health data, such as your pulse, and so-called ‘industrial internet’ configurations that include thousands of sensors on a single manufactured item, such as a turbine.

“Generating vast amounts of information, these sources will potentially provide non-traditional data, such as video and audio, that must be integrated to provide quality insights. Some intelligence will reside ‘at the edges,’ where the sensors are; however, the data often must be analyzed in tandem with other data sources to offer value… Organizations will strive to rapidly ingest, contextualize and analyze big data from the IoT and act on information from the connected world in new ways… As organizations realize the benefits of IoT data and their investment in analytics, they’ll begin to consider how they can squeeze more advantages from the technology, and real-time analytics is where they’ll turn.”
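
A minimal sketch of the “intelligence at the edges” pattern Freivald describes: summarize raw readings locally and forward only compact aggregates for central, cross-source analysis. The sensor feed and thresholds below are invented for illustration.

```python
# Hypothetical edge node: aggregate raw sensor readings locally, ship only a
# compact summary upstream for analysis alongside other data sources.
import random
import statistics


def read_sensor() -> float:
    """Stand-in for a real device read, e.g., traffic speed in km/h."""
    return random.gauss(mu=52.0, sigma=8.0)


def edge_summary(window: int = 100) -> dict:
    """Summarize locally; send kilobytes instead of the raw stream."""
    readings = [read_sensor() for _ in range(window)]
    return {
        "count": window,
        "mean": round(statistics.mean(readings), 2),
        "stdev": round(statistics.stdev(readings), 2),
        "anomalies": sum(r > 90.0 for r in readings),  # e.g., extreme speeds
    }


# A central platform would ingest summaries like this from thousands of
# sensors and correlate them with other sources (weather, incidents, etc.).
print(edge_summary())
```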

For data center managers who appreciate uninterrupted sleep, AI will have a valuable impact, according to Rich Rogers, SVP, IoT Product and Engineering, Hitachi Vantara, a US-based Hitachi subsidiary.

“2018 will be the year that data centers begin to transform into fully autonomous operations,” Rogers said. “IoT and AI will enable data center issues to be root-caused and resolved automatically by software. Data center administrators will no longer be woken up at night to troubleshoot outages. Voice technologies will enable data center operators to monitor and manage their data centers from any location – be it the grocery store, gym or living room couch. IT infrastructure gear will be deployed and maintained autonomously – you simply stock new compute nodes and disk drives, and robotics streamline the technology to the appropriate systems.”
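
As a rough, entirely hypothetical sketch of the pattern Rogers describes: telemetry is matched against known failure signatures, each mapped to a remediation playbook, and only unrecognized issues escalate to a human. Every signature and action below is invented.

```python
# Hypothetical auto-remediation loop: root-cause an alert against known
# signatures, apply a playbook, and page a human only for the unknowns.
PLAYBOOKS = {
    "disk_full": "rotate logs and expand volume",
    "service_crash": "restart service and drain traffic",
}


def root_cause(alert: dict) -> str | None:
    """Map raw telemetry to a known failure signature."""
    if alert.get("disk_used_pct", 0) > 95:
        return "disk_full"
    if alert.get("process_exit_code") not in (None, 0):
        return "service_crash"
    return None  # unknown cause: this one still wakes a human


def handle(alert: dict) -> str:
    cause = root_cause(alert)
    if cause is None:
        return "escalate to on-call"
    return f"auto-remediate: {PLAYBOOKS[cause]}"


print(handle({"disk_used_pct": 97}))       # auto-remediate: rotate logs ...
print(handle({"process_exit_code": 137}))  # auto-remediate: restart service ...
print(handle({"fan_speed_rpm": 0}))        # escalate to on-call
```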

The significance of IoT and edge computing also is on the mind of Couchbase SVP of Engineering and CTO Ravi Mayuram.

“Cloud computing revolutionized virtualization and ushered in the digital era, and now edge computing will bring those digital learnings back to hardware for applications that extend customer engagement in novel ways,” he said. “Industrial IoT applications, sensors and VR-powered devices use edge computing to provide offline capabilities that deliver the seamless, real-time experiences modern users expect. Data capabilities and chip technology are now advanced enough to support real-time compute at the edge, and 2018 will see organizations updating infrastructure to take advantage of the benefits of edge computing.”

Rapid analysis of data, something close to instant cognition, is at the heart of AI (again, broadly defined). Gary Orenstein, CMO of MemSQL, developer of in-memory database management systems, sees real-time data as a major trend of 2018.

“We have tackled how to capture big data at scale. But we have not yet tackled the instant results part of big data,” said Orenstein. “Companies will put more emphasis on driving value and results from collected data. Enterprises see value in co-locating transactions and analytics for modern, real-time applications. Business demand for such capabilities has existed for a long time, but could not be met because of limitations in the technology required to offload data from the transactional systems to data marts or data warehouses. 2018 is the year in which these limitations evaporate through comprehensive distributed systems able to meet multiple processing needs.”
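
The idea of co-locating transactions and analytics can be illustrated in miniature. The sketch below uses SQLite purely as a stand-in (real systems of the kind Orenstein describes, such as MemSQL, run distributed and in memory): transactional inserts and an analytical aggregate are served from the same live table, with no offload to a separate warehouse.

```python
# Illustration only, with SQLite as a stand-in: transactional writes and an
# analytical query against the same store, no separate data warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (ts INTEGER, region TEXT, amount REAL)")

# Transactional side: individual orders land as they happen.
orders = [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 210.0)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", orders)
conn.commit()

# Analytical side: an aggregate query over the same live table.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(region, total)
# east 330.0
# west 75.5
```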

This will support advances in at least one real-world use case, according to MemSQL: public safety and law enforcement.

“Surveillance camera systems will be able to trace faces through timelines to the point of origin, assisting with crime fighting. A national facial recognition system, similar to fingerprints but easily collected via publicly available DMV records, will gain traction by using 3D cameras for face capture. This will lead to greater unification of federal and local governmental police and criminal records. Mining past crime records and reported crimes could be used to predict future events and provide individualized scoring for criminal behavior.”

EnterpriseAI