Advanced Computing in the Age of AI | Wednesday, January 22, 2020

Quantum Computing, ML Drive 2019 Patent Awards

The dizzying pace of technology innovation, often fueled by the growing availability of computing horsepower, is underscored ...

The Rush to 5G Begins, Driven by IoT

The number of 5G wireless connections is forecast to soar 150 percent over the next five years ...

Nvidia Launches Autonomous Vehicle, Conversational AI Tech at GTC China

Nvidia launched a raft of new autonomous driving- and conversational AI-related products today at its GTC China ...

Intel’s Habana Deal Expands AI Reach to Edge

Intel continues to expand its push into AI silicon with the acquisition of Israel-based Habana Labs, a ...

Silicon

January 14, 2020
The dizzying pace of technology innovation, often fueled by the growing availability of computing horsepower, is underscored by the race to develop unique designs and applications that can be patented. Among the goals of many of the companies we track is building up their intellectual property portfolios to provide a steady stream of licensing revenue when markets turn south. ... Full article
January 13, 2020
The number of 5G wireless connections is forecast to soar 150 percent over the next five years as infrastructure rollouts gather momentum for connecting sensors and other edge devices. The forecast released this week by Juniper Research estimates the total number of 5G connections will jump from about 5 million subscribers in 2019 to a staggering 1.5 billion globally ... Full article
December 16, 2019
Intel continues to expand its push into AI silicon with the acquisition of Israel-based Habana Labs, a specialist in deep learning accelerators for datacenters—a capability Intel hopes to extend to the network edge. The acquisition announced by Intel on Monday (Dec. 16) is valued at about $2 billion. Additional terms of the purchase were not disclosed. Intel Capital was ... Full article
December 12, 2019
Change within Intel’s upper management – and to its company mission – has continued as a published report has disclosed that chip technology heavyweight Gary Patton, GlobalFoundries’ CTO and R&D SVP as well as former IBM VP of semiconductor R&D, has joined Intel as corporate VP and GM of design enablement. He will report to CTO Michael Mayberry. The ... Full article
December 11, 2019
Cisco Systems unveiled its networking framework of the future this week, aimed at scaling internet performance for emerging enterprise workloads like AI and machine learning, along with network routers for Internet of Things and other 5G-driven applications and services. The networking giant (NASDAQ: CSCO) announced a programmable silicon architecture as the foundation of a framework that includes the “un-bundling” ... Full article
December 10, 2019
Designed to push the frontiers of computing chip and systems performance optimized for AI workloads, an 8 petaflop IBM Power9-based supercomputer has been unveiled in upstate New York that will be used by IBM data and computer scientists, by academic researchers and by industrial and commercial end-users. Installed at the Rensselaer Polytechnic Institute Center for Computational Innovations (CCI), the ... Full article
December 9, 2019
An infrastructure vendor zeroing in on AI and machine learning workloads has added GPU support to a Kubernetes-based appliance designed to handle those emerging applications in containers while adding GPU horsepower to scale in-house workloads to the cloud. Diamanti, a five-year-old startup that recently closed a $35 million funding round, unveiled this week what it claims is the first ... Full article
December 3, 2019
The “x86 Big Bang,” in which market dominance of the venerable Intel CPU has exploded into fragments of processor options suited to varying workloads, has now encompassed CPUs offered by the leading public cloud services provider, AWS, which today announced it has further invested in its Graviton Arm-based chip for general purpose, scale-out workloads, including containerized microservices, web servers ... Full article
November 15, 2019
Graphcore, the U.K. AI chip developer, is expanding collaboration with Microsoft to offer its intelligent processing units on the Azure cloud, making Microsoft the first large public cloud vendor to offer the IPU designed for machine learning workloads. Azure support for IPUs is the culmination of more than two years of collaboration between the software giant (NASDAQ: MSFT) and ... Full article
November 13, 2019
Tencent, the Chinese cloud giant, said it would use AMD’s newest Epyc processor in its internally-designed server. The design win adds further momentum to AMD’s bid to erode rival Intel Corp.’s dominance of the global cloud and datacenter server markets. The partners announced this week that Tencent Cloud’s new servers will implement AMD’s “Star Lake” platform based on the ... Full article
November 13, 2019
AI and HPC are increasingly intertwined – machine learning workloads demand ever increasing compute power – so it’s no surprise the annual supercomputing industry shindig, SC19 at the Colorado Convention Center in Denver next week, has taken on a strong AI cast. As we noted recently (“Machine Learning Fuels a Booming HPC Market”) based on findings by industry watcher ... Full article
November 12, 2019
At its AI Summit today in San Francisco, Intel touted a raft of AI training and inference hardware for deployments ranging from cloud to edge and designed to support organizations at various points of their AI journeys. The company revealed its Movidius Myriad Vision Processing Unit (VPU), codenamed “Keem Bay,” for edge media, computer vision and inference applications. The ... Full article
November 7, 2019
MLPerf.org, the young AI-benchmarking consortium, has issued the first round of results for its inference test suite. Among organizations with submissions were Nvidia, Intel, Alibaba, Supermicro, Google, Huawei, Dell and others. Not bad considering the inference suite (v0.5) itself was just introduced in June. Perhaps predictably, GPU powerhouse Nvidia quickly claimed early victory, issuing a press release coincident with the ... Full article
November 6, 2019
Nvidia has launched what it claims to be the world’s smallest supercomputer, an addition to its Jetson product line with a credit card-sized (70x45mm) form factor delivering up to 21 trillion operations/second (TOPS) of throughput, according to the company. The Jetson Xavier NX module consumes as little as 10 watts of power, costs $399 and is designed to be ... Full article
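A quick back-of-envelope check puts the quoted Jetson Xavier NX figures in perspective. The numbers below are the article's (21 TOPS peak, 10 W minimum power mode, $399); sustained throughput and actual power draw will vary by workload and power profile.

```python
# Back-of-envelope efficiency figures from the Jetson Xavier NX teaser above.
# These are the article's quoted numbers, not measured values.

peak_tops = 21          # trillion operations per second (peak, per Nvidia)
min_power_watts = 10    # lowest-power operating mode
price_usd = 399

tops_per_watt = peak_tops / min_power_watts
gops_per_dollar = peak_tops * 1000 / price_usd

print(f"{tops_per_watt:.1f} TOPS/W")        # 2.1 TOPS/W
print(f"{gops_per_dollar:.0f} GOPS per dollar")  # ~53 GOPS/$
```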
October 29, 2019
A potential interim step between conventional semiconductors and quantum devices has emerged, promising improved information processing schemes that outperform current electronic charge- and spin-based chip architectures. The emerging quantum process dubbed “valleytronics” focuses on low energy “valleys” or extremes in the electronic band structure of semiconductors. Those valleys of electrons can be used to encode, process and store information, ... Full article
October 16, 2019
GPUs are famously expensive – high end Nvidia Teslas can be priced well above $10,000. Now a New York startup, Paperspace, has announced a free cloud GPU service for machine/deep learning development on the company’s cloud computing and deep learning platform. Designed for students and professionals learning how to build, train and deploy machine learning models, the service can ... Full article
October 15, 2019
Dario Gil, IBM’s relatively new director of research, painted an intriguing portrait of the future of computing along with a rough idea of how IBM thinks we’ll get there at last month’s MIT-IBM Watson AI Lab’s AI Research Week held at MIT. Just as Moore’s law, now fading, was always a metric with many ingredients baked into it, Gil’s ... Full article
October 15, 2019
The ability to share and analyze data while protecting patient privacy is giving medical researchers a new tool in their efforts to use what one vendor calls “federated learning” to train models based on diverse data sets. To that end, researchers at GPU leader Nvidia (NASDAQ: NVDA) working with a team at King’s College London came up with a ... Full article
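The teaser above describes federated learning only at a high level. The core idea is that each site trains on its own private data and only model weights, never raw patient records, are sent to a central server for averaging. Below is a minimal sketch of that weight-averaging loop with synthetic data and a toy linear model; it is purely illustrative and not the Nvidia/King's College implementation.

```python
import numpy as np

# Federated-averaging sketch: two "hospitals" each hold private data.
# Only locally trained weights leave each site; raw records stay local.
rng = np.random.default_rng(0)

def local_train(w, X, y, lr=0.1, steps=50):
    """A few gradient-descent steps on one site's private data."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic per-site datasets drawn from the same underlying model y = 3x
sites = []
for _ in range(2):
    X = rng.normal(size=(100, 1))
    y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)
    sites.append((X, y))

w_global = np.zeros(1)
for _ in range(5):  # communication rounds
    local_weights = [local_train(w_global, X, y) for X, y in sites]
    w_global = np.mean(local_weights, axis=0)  # server averages the weights

print(w_global)  # recovers roughly [3.], without pooling any raw data
```

Real deployments add secure aggregation, differential privacy, and far larger models, but the train-locally/average-centrally loop is the same shape.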
October 9, 2019
An HPC cluster with deep learning techniques will be used to process petabytes of scientific data as part of workload-intensive projects spanning astrophysics to genomics. AI partners Intel (NASDAQ: INTC) and Lenovo (OTCMKTS: LNVGY) said they are providing the Flatiron Institute of New York City with high-end servers running on Intel second-generation Xeon Scalable processors and the chip maker’s ... Full article
October 1, 2019
DARPA will seek to unclog the networking bottlenecks that are hindering wider use of powerful hardware in computing-intensive applications. The Pentagon research agency has unveiled another in a series of post-Moore’s Law computing initiatives, this one seeking an overhaul of the network stack and interfaces that fall well short of connecting high-end processors with external networks and the data-driven applications ... Full article

More articles
