What just happened? Microsoft has released BitNet b1.58 2B4T, a new kind of large language model engineered for exceptional efficiency. Unlike conventional AI models that rely on 16- or 32-bit floating-point numbers to represent each weight, BitNet uses only three discrete values: -1, 0, or +1. This approach, known as ternary quantization, allows each weight to be stored in just 1.58 bits. The result is a model that dramatically reduces memory usage and can run far more easily on standard hardware, without requiring the high-end GPUs typically needed for large-scale AI.
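To make the idea concrete, here is a minimal sketch of ternary quantization in NumPy. It follows the "absmean" scheme described in the BitNet papers (scale each weight tensor by the mean absolute value, then round to the nearest of -1, 0, +1); the function name and the per-tensor scaling granularity are illustrative assumptions, not Microsoft's actual implementation.

```python
import numpy as np

def ternary_quantize(w: np.ndarray):
    """Map float weights to {-1, 0, +1} plus one float scale (absmean scheme)."""
    scale = np.mean(np.abs(w)) + 1e-8        # per-tensor scale; granularity is an assumption
    q = np.clip(np.round(w / scale), -1, 1)  # round, then clamp to the three ternary levels
    return q.astype(np.int8), scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = ternary_quantize(w)
assert set(np.unique(q)).issubset({-1, 0, 1})

# Each ternary weight carries log2(3) ≈ 1.58 bits of information,
# which is where the "1.58-bit" figure comes from.
```

At inference time the dequantized weight is simply `q * scale`, and because every weight is -1, 0, or +1, matrix multiplications reduce to additions and subtractions rather than full floating-point multiplies.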