June AI Chip Topics | AI Training Chip Trend Analysis for One's Cloud (Part 2)
Chip Types
Since its beginnings in the 1950s, AI technology has progressed through stages including symbolic logic, expert systems, and machine learning. In 2012, AlexNet won the ImageNet competition with its deep learning model, opening a new era for AI. Architectures such as RNNs and GANs followed, and the Transformer developed by Google became the foundation of today's Large Language Models (LLMs). These models are generally characterized by enormous numbers of neurons, weights, and layers, a high degree of complexity, and a computational core of multiply-accumulate (MAC) operations. Training starts from initial weights, then repeatedly runs forward passes and backpropagation to correct hundreds of millions of weights, typically requiring thousands to tens of thousands of iterations before convergence. This huge demand for computation has given rise to AI-specific hardware, known as neural processing units (NPUs), to accelerate it.
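As a minimal illustrative sketch (not from the source), the two ideas above can be shown in a few lines of Python: a neuron's forward pass is a chain of MAC operations (`acc += w * x`), and training iterates over the data many times, nudging the weights toward values that fit it. The function names and the tiny two-weight example are hypothetical.

```python
def mac_forward(weights, inputs):
    """Dot product built from multiply-accumulate steps: acc += w * x."""
    acc = 0.0
    for w, x in zip(weights, inputs):
        acc += w * x  # one MAC operation
    return acc

def train(samples, n_weights, lr=0.1, iterations=1000):
    """Fit weights by gradient descent so mac_forward(w, x) approximates each target."""
    weights = [0.0] * n_weights  # initial weights
    for _ in range(iterations):  # many iterations before convergence
        for inputs, target in samples:
            err = mac_forward(weights, inputs) - target
            # gradient of the squared error w.r.t. each weight is err * x
            for i, x in enumerate(inputs):
                weights[i] -= lr * err * x
    return weights

# Tiny example: learn the weights [2.0, -1.0] from two samples.
samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0)]
w = train(samples, n_weights=2)
```

Real networks do exactly this at vastly larger scale, which is why hardware that parallelizes MAC operations pays off.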