GPU Speeds without GPUs: Announcing the Neural Magic Inference Engine

11/07/19
We are proud to announce the first version of the Neural Magic Inference Engine, offering GPU-class performance on commodity CPUs.

Neural Magic Announces $15 Million in Seed Funding from Comcast Ventures, NEA, Andreessen Horowitz, Pillar VC and Amdocs

11/06/19
The seed investment is led by Comcast Ventures, with participation from NEA, Andreessen Horowitz, Pillar VC and Amdocs.

What is the Infrastructure Phase of Machine Learning?

09/25/19
A Union Square Ventures post from last fall observed that throughout the history of technology, apps beget the infrastructure that supports them. The concept of an “app” is loosely defined as something that directly touches the end user (examples included light bulbs, planes, email, etc.). In the case of web applications, this…

Data Scientists Make These 3 Common Mistakes

09/17/19
The practice of training and putting machine learning systems into production involves a lot of trial and error. However, there are some recurring mistakes many data scientists make that can be avoided with the proper awareness and preparation. Here are a few that we’ve seen relatively frequently, and some tips on how to prevent these…

Why Machine Learning Needs Academia and Industry to Survive

09/12/19
The field of machine learning has progressed quickly in the last decade, and there’s a big reason why: both academics and industry groups are working on the problem, often in competition with one another. Much ink has been spilled over the fierce rivalry for talent between corporate and academic research labs, but I’d argue that…

Using CNNs for Inference

09/04/19
Convolutional neural networks (CNNs) are a type of neural network most often used for image recognition and classification. CNNs excel at these tasks because they are designed to automatically learn how to recognize spatial hierarchies in an image. Once these algorithms are trained, they can ‘infer’ the next best prediction for the task at hand…
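The idea in the excerpt above can be sketched in a few lines: inference is just a forward pass through a trained network's layers (convolution, nonlinearity, pooling) ending in a prediction. This is a minimal NumPy sketch, not the post's code; the filter weights are random placeholders standing in for trained parameters, and the one-filter-per-class setup is a simplifying assumption for illustration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (technically cross-correlation, as in most DL libraries)."""
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise nonlinearity: keep positive activations, zero out the rest."""
    return np.maximum(x, 0)

# Hypothetical "trained" network: one 3x3 filter per class (weights are random
# placeholders here; in practice they come from the training phase).
rng = np.random.default_rng(0)
filters = rng.standard_normal((3, 3, 3))   # 3 classes, each with a 3x3 kernel
image = rng.standard_normal((8, 8))        # a single-channel 8x8 input

# Inference: convolve, apply ReLU, pool each feature map to one score, take the max.
scores = [relu(conv2d(image, f)).mean() for f in filters]
predicted_class = int(np.argmax(scores))
```

The forward pass involves no weight updates, which is why inference workloads have a very different compute profile from training.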

A Brief History of GPUs

08/06/19
Let’s take a brief look at the history of GPUs before machine learning, and their current status in machine learning applications.

Why Software Will Eat the Machine Learning World

08/06/19
Seven years ago, Marc Andreessen wrote his now-famous Wall Street Journal op-ed, “Why Software is Eating the World,” ushering in the beginning of a modern, software-driven economy. It’s taken a while for machine learning to catch up to this trend. For the last seven years, machine learning has been primarily focused on building hardware…

Welcome to Limitless AI

08/06/19
Throughout history, there have been two ways of solving problems: work within limits, or find ways to overcome them. Today, we seem stuck in the “work within limits” phase of AI. While plenty of exciting work is being done in the field of deep learning, our ability to address real-world problems is still constrained by…

Machine Learning Inference: Why Use GPUs?

08/06/19
Or other domain-specific chipsets, for that matter? In the machine learning inference phase, training is complete and it’s time for a model to do its job: make predictions based on the incoming data. In other words, the model has learned all of the “assumptions” it needs to know to make predictions for the task at…