Tether’s data and AI division QVAC announced a major technological breakthrough on March 17: the world’s first cross-platform LoRA fine-tuning framework to support Microsoft’s BitNet (1-bit LLM) architecture. Integrated into QVAC Fabric, the technology sharply reduces memory and compute requirements, meaning billion-parameter models are no longer the preserve of enterprise-grade GPUs and can undergo “local, fully private” training on ordinary smartphones and laptops.
(Background: Tether invests in Axiym to expand payment infrastructure: promoting USDT integration into global compliant payment networks)
(Additional context: Tether crosses into AI sleep technology! Leading a $50 million investment in Eight Sleep, valuation surges to $1.5 billion)
In the field of artificial intelligence (AI), training powerful models has long been considered a money-burning endeavor, heavily dependent on expensive NVIDIA systems or cloud computing. However, stablecoin giant Tether is trying to rewrite this rule with technology. Tether’s technical arm, “Tether Data,” announced on March 17 the launch of the world’s first cross-platform BitNet LoRA fine-tuning framework for its QVAC (QuantumVerse Automatic Computer) platform.
The core value of this technology is that it allows billion-parameter AI models to perform personalized learning directly on the smartphones in users’ pockets.
This breakthrough is based on Microsoft’s BitNet 1-bit LLM architecture. Through optimizations in QVAC Fabric, the memory footprint and computational load of BitNet models are reduced to extremely low levels. According to the announcement, the framework supports not only common NVIDIA GPUs but also achieves full compatibility with Intel, AMD, Apple M-series chips, and mobile GPUs such as Adreno (Android), Mali, and Apple Bionic.
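To make the 1-bit idea concrete, here is a minimal sketch of the absmean ternary quantization scheme used by BitNet b1.58, which maps full-precision weights to {-1, 0, +1} plus a single scale. This is an illustrative numpy toy, not QVAC Fabric’s actual implementation:

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray):
    """BitNet b1.58-style absmean quantization: scale the tensor by
    its mean absolute value, then round and clip to {-1, 0, +1}."""
    gamma = np.abs(w).mean() + 1e-8           # per-tensor scale
    w_q = np.clip(np.round(w / gamma), -1, 1) # ternary weights
    return w_q.astype(np.int8), gamma         # dequantize as w_q * gamma

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
w_q, gamma = absmean_ternary_quantize(w)
print(np.unique(w_q))  # a subset of [-1, 0, 1]
```

Since a ternary weight needs only log2(3) ≈ 1.58 bits, a billion-parameter model’s weights fit in roughly 0.2 GB, versus about 2 GB at FP16, which is what puts such models within reach of phone-class memory.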
This means AI that previously could only run in data centers can now be fine-tuned on your phone using “Low-Rank Adaptation” (LoRA). Tether states that this technology enables edge devices to handle models twice the size of traditional Q4-quantized models, a testament to its memory efficiency.
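The reason LoRA fits on edge devices is that it never touches the frozen base weights: it trains only two small low-rank factors. A minimal sketch of the idea (shapes and scaling factor are illustrative, not QVAC’s configuration):

```python
import numpy as np

# LoRA: keep the (frozen, possibly quantized) weight W fixed and learn
# a low-rank update, so the effective weight is W + (alpha / r) * B @ A.
# For d = 4096 and rank r = 8, that is ~65k trainable values per matrix
# instead of ~16.7M for full fine-tuning.
d, r, alpha = 4096, 8, 16
rng = np.random.default_rng(0)

W = rng.normal(size=(d, d)).astype(np.float32)              # frozen base
A = rng.normal(scale=0.01, size=(r, d)).astype(np.float32)  # trainable
B = np.zeros((d, r), dtype=np.float32)                      # trainable, init 0

def lora_forward(x):
    # B starts at zero, so fine-tuning begins exactly at the base model.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.normal(size=(2, d)).astype(np.float32)
y = lora_forward(x)
print(2 * d * r, d * d)  # trainable params: 65536 vs 16777216
```

Because only A and B are updated, the optimizer state and gradients stay tiny, which is what makes on-device fine-tuning of a large quantized base model feasible.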
Tether’s engineering team shared real-world benchmark data showcasing the framework’s capabilities on modern smartphones.
Tether CEO Paolo Ardoino has emphasized: “If you need an API key to use AI, then it doesn’t truly belong to you.” The core philosophy of QVAC is “Local-first.”
With the BitNet LoRA framework, users can let AI learn directly from local emails, notes, and messages without uploading any data to cloud servers. This not only alleviates concerns over sensitive data misuse but also breaks a few giants’ monopoly on AI development. QVAC Fabric LLM is released as open-source software under the Apache 2.0 license, with pre-configured adapters available on Hugging Face, enabling developers worldwide to immediately join this edge-computing revolution.