Search results
bitnet.cpp is the official inference framework for 1-bit LLMs (e.g., BitNet b1.58). It offers a suite of optimized kernels that support fast and lossless inference of 1.58-bit models on CPU (with NPU and GPU support coming next).
28 Feb 2024 · Recent research, such as BitNet, is paving the way for a new era of 1-bit Large Language Models (LLMs). In this work, we introduce a 1-bit LLM variant, namely BitNet b1.58, in which every single parameter (or weight) of the LLM is ternary {-1, 0, 1}.
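The ternary weights described in this snippet are typically obtained with an absmean round-and-clip scheme: scale each weight tensor by its mean absolute value, then round and clip to {-1, 0, +1}. Below is a minimal sketch of that idea; the function name and per-tensor scaling are illustrative assumptions, not the paper's exact implementation.

```python
import torch

def ternary_quantize(w: torch.Tensor, eps: float = 1e-5):
    """Sketch of absmean-style ternary quantization: scale by the mean
    absolute value, then round and clip each weight to {-1, 0, +1}."""
    gamma = w.abs().mean().clamp(min=eps)          # per-tensor scale (assumption)
    w_ternary = (w / gamma).round().clamp(-1, 1)   # ternary weight values
    return w_ternary, gamma                        # dequantize as w_ternary * gamma
```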
This is a reproduction of the BitNet b1.58 paper. The models are trained on the RedPajama dataset for 100B tokens. The hyperparameters, as well as the two-stage learning-rate and weight-decay schedules, are implemented as suggested in their follow-up paper. All models are open-sourced in the repo.
29 Mar 2024 · Here are the commands to run the evaluation:
pip install lm-eval==0.3.0
python eval_ppl.py --hf_path 1bitLLM/bitnet_b1_58-3B --seqlen 2048
python eval_task.py --hf_path 1bitLLM/bitnet_b1_58-3B \
    --batch_size 1 \
    --tasks \
    --output_path result.json \
    --num_fewshot 0 \
    --ctx_size 2048
24 Jun 2024 · In this work, we investigate 1.58-bit quantization for small language and vision models ranging from 100K to 48M parameters. We introduce a variant of BitNet b1.58 that relies on the median rather than the mean in the quantization process.
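Relative to the absmean sketch above, the median-based variant only changes how the scale is computed. A hypothetical one-line substitution, assuming the same round-and-clip step:

```python
import torch

def ternary_quantize_median(w: torch.Tensor, eps: float = 1e-5):
    """Hypothetical sketch: same round-and-clip scheme, but the scale is
    the median absolute weight rather than the mean (more outlier-robust)."""
    gamma = w.abs().median().clamp(min=eps)
    return (w / gamma).round().clamp(-1, 1), gamma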
26 Mar 2024 · Unlike its predecessor, BitNet b1.58 replaces the conventional nn.Linear layers with BitLinear layers, leveraging 1.58-bit weights and 8-bit activations.
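To make the nn.Linear replacement concrete, here is a simplified inference-only sketch of a BitLinear-style layer. The class name is hypothetical, the activation quantization is assumed to be per-tensor absmax int8, and the real BitNet b1.58 layer additionally handles normalization and straight-through gradients during training.

```python
import torch
import torch.nn as nn

class BitLinearSketch(nn.Linear):
    """Simplified stand-in for a BitLinear layer: ternary (1.58-bit) weights
    and 8-bit activations. Inference path only; not the official implementation."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # 8-bit activation quantization (per-tensor absmax; an assumption here)
        x_scale = 127.0 / x.abs().max().clamp(min=1e-5)
        x_q = (x * x_scale).round().clamp(-128, 127)

        # Ternary weight quantization (absmean round-and-clip)
        w_scale = self.weight.abs().mean().clamp(min=1e-5)
        w_q = (self.weight / w_scale).round().clamp(-1, 1)

        # Low-precision-style matmul, then rescale back to real values
        y = nn.functional.linear(x_q, w_q)
        y = y * (w_scale / x_scale)
        return y + self.bias if self.bias is not None else y
```

A drop-in usage sketch would be `layer = BitLinearSketch(1024, 4096, bias=False); y = layer(torch.randn(2, 1024))`, mirroring how an nn.Linear of the same shape would be called.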