Gentlemen (and women), start your inference engines. One of the world’s largest buyers of systems is entering evaluation mode for deep learning accelerators to speed services based on trained models.
Flex Logix has announced inference-optimized nnMAX clusters, used to build the InferX X1 edge inference co-processor, which will be available as IP for incorporation into SoCs and in chip form in Q3. The InferX X1 chip claims to ...
Founded in 2013, Skymizer is an AI inference company. Its flagship HyperThought platform pairs a compiler-driven software stack with transformer-optimized hardware to deliver high-efficiency inference ...
Machine- and deep-learning technologies are growing rapidly, bringing new challenges for developers who must find ways to optimize ML applications that run on tiny edge devices with ...