Let’s take a brief look at the history of GPUs before machine learning, and at their current role in machine learning applications.
Seven years ago, Marc Andreessen wrote his now-famous Wall Street Journal op-ed, “Why Software Is Eating the World,” ushering in the modern, software-driven economy. It’s taken a while for machine learning to catch up to this trend. For the last seven years, machine learning has been primarily focused on building hardware… Read More: Why Software Will Eat the Machine Learning World
Throughout history, there have been two ways of solving problems: work within limits, or find ways to overcome them. Today, we seem stuck in the “work within limits” phase of AI. While plenty of exciting work is being done in the field of deep learning, our ability to address real-world problems is still constrained by… Read More: Welcome to Limitless AI
Or other domain-specific chipsets, for that matter? In the machine learning inference phase, training is complete and it’s time for a model to do its job: make predictions based on the incoming data. In other words, the model has learned all of the “assumptions” it needs to make predictions for the task at… Read More: Machine Learning Inference: Why Use GPUs?
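As a rough illustration of that inference step, here is a minimal sketch assuming a trained PyTorch model; the `model`, `device`, and `batch` names are illustrative placeholders, not anything from the post. It simply moves a model onto a GPU when one is available and runs a forward pass with gradients disabled:

```python
import torch
import torch.nn as nn

# A trained model would normally be loaded from a checkpoint;
# a tiny stand-in network keeps this example self-contained.
model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))

# Run on the GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
model.eval()  # inference mode: disables dropout, uses running batch-norm stats

# Incoming data: a batch of 4 feature vectors (shapes are hypothetical).
batch = torch.randn(4, 16, device=device)

# No gradients are needed at inference time, which saves memory and compute.
with torch.no_grad():
    predictions = model(batch)

print(predictions.shape)  # torch.Size([4, 2])
```

The key distinction from training is in the last few lines: the model only runs forward passes, so `torch.no_grad()` skips the bookkeeping needed for backpropagation, and the GPU’s role is purely to accelerate those predictions.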