
Sunday, March 04, 2018

Deep Learning Inference Platform

Technical, but bringing hardware to the fore again.

Inference Software and Accelerators for Cloud, Data Center, Edge and Autonomous Machines

There is an explosion of demand for increasingly sophisticated AI-enabled services like image and speech recognition, natural language processing, visual search, and personalized recommendations. At the same time, data sets are growing, networks are getting more complex, and latency requirements are tightening to meet user expectations.

NVIDIA® TensorRT™ is a programmable inference accelerator delivering the performance, efficiency, and responsiveness critical to powering the next generation of AI products and services: in the cloud, in the data center, at the network's edge, and in vehicles. ...
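
To make "programmable inference accelerator" a bit more concrete, here is a minimal sketch of the TensorRT Python workflow: parse a trained ONNX model, let TensorRT optimize it (FP16 here), and serialize the optimized engine for deployment. This assumes a recent TensorRT 8.x Python API rather than the 2018-era one, and the file names model.onnx and model.plan are placeholders.

# Minimal TensorRT build sketch (assumes TensorRT 8.x Python bindings and an
# ONNX model exported from a training framework; "model.onnx" is a placeholder).
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)

# Define the network by parsing the ONNX model.
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
if not parser.parse_from_file("model.onnx"):
    for i in range(parser.num_errors):
        print(parser.get_error(i))
    raise SystemExit("ONNX parse failed")

# Configure the optimizer: allow FP16 kernels where the GPU supports them.
config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)

# Build and serialize the optimized inference engine for deployment.
engine_bytes = builder.build_serialized_network(network, config)
with open("model.plan", "wb") as f:
    f.write(engine_bytes)

# At runtime the serialized plan is deserialized and executed, e.g.:
# runtime = trt.Runtime(logger)
# engine = runtime.deserialize_cuda_engine(engine_bytes)
# context = engine.create_execution_context()
# (device buffers are then bound and context.execute_v2(...) runs inference)

The build step is where the "accelerator" part happens: TensorRT fuses layers, picks kernels for the target GPU, and lowers precision where allowed, so the deployed engine is specific to the hardware it was built on.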
