Tether’s QVAC Pushes Multi‑billion‑parameter AI Models Onto Phones and Consumer GPUs
13 Articles
Tether’s QVAC Introduces Cross-Platform Bitnet LoRA Framework for On-Device AI Training
Tether has announced the release of a new framework designed to enable training and inference of large language models on consumer-grade hardware. "Tether’s QVAC Launches World’s First Cross-Platform BitNet LoRA Framework to Enable Billion-Parameter AI Training and Inference on Consumer GPUs and Smartphones. Learn more: https://t.co/8ygOFzhfjn" (Tether (@tether), March 17, 2026). The system, developed under its QVAC Fabric initiative, introduces w…
Tether QVAC AI Enables On-Device AI With Billion-Parameter Models
This article was first published on Deythere. The latest Tether QVAC AI development brings artificial intelligence closer to everyday users: the company has introduced a system that lets large models run directly on smartphones and consumer hardware. Developed by Tether’s growing AI team, Tether QVAC AI also removes the need for cloud-based processing. Instead, it enables training and inference directly on the device, smartp…
Tether Unveils Cross-Platform BitNet LoRA AI System
- Tether unveiled its QVAC BitNet LoRA framework for cross-platform AI training.
- Developers can fine-tune billion-parameter models without costly cloud infrastructure.
- Benchmarks show efficient training, with a 125M-parameter model trained in 10 minutes on a Samsung smartphone.

Tether launched a new artificial intelligence framework designed to run large AI models on everyday devices. Tether unveiled the technology as part of its QVAC Fabric system. This framework enab…
Tether’s QVAC pushes multi‑billion‑parameter AI models onto phones and consumer GPUs
Tether’s QVAC Fabric integrates BitNet LoRA to fine‑tune and run multi‑billion‑parameter AI models on consumer GPUs and flagship phones, pushing serious AI work to the edge. Tether’s AI division has quietly shipped one of its most aggressive non‑stablecoin bets to…
Tether’s QVAC Launches BitNet LoRA Framework to Run Billion-Parameter AI on Consumer Devices
TLDR:
- Tether’s QVAC Fabric introduces the world’s first cross-platform LoRA fine-tuning for BitNet models.
- A 1B-parameter model can be fine-tuned on a Samsung S25 in 1 hour and 18 minutes, entirely on-device.
- BitNet-1B uses up to 77.8% less VRAM than Gemma-3-1B, cutting memory needs across consumer hardware.
- The framework extends LoRA fine-tuning beyond NVIDIA to AMD, Intel, Apple Silicon, and mobile GPUs.

BitNet LoRA Framework development has reach…
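The memory savings the coverage attributes to LoRA come from training only a small low-rank update instead of the full weight matrix. As a minimal sketch of that arithmetic, assuming the standard LoRA parameterization (W plus a trained B·A product) and illustrative dimensions not taken from Tether's release:

```python
# Sketch: why LoRA fine-tuning fits on consumer devices.
# LoRA freezes the base weight W (d_out x d_in) and trains only a
# low-rank update B @ A, with A shaped (r x d_in) and B shaped (d_out x r).
# All dimensions below are illustrative assumptions, not QVAC Fabric specifics.

def full_finetune_params(d_out: int, d_in: int) -> int:
    """Trainable parameters when updating the full weight matrix."""
    return d_out * d_in

def lora_trainable_params(d_out: int, d_in: int, rank: int) -> int:
    """Trainable parameters for one LoRA-adapted weight matrix (A plus B)."""
    return rank * d_in + d_out * rank

# A 2048x2048 projection (a plausible size inside a ~1B-parameter model)
# adapted at rank 16:
full = full_finetune_params(2048, 2048)
lora = lora_trainable_params(2048, 2048, 16)
print(f"full: {full:,}  lora: {lora:,}  ratio: {full / lora:.0f}x")
# -> full: 4,194,304  lora: 65,536  ratio: 64x
```

Because only the adapter parameters (and their optimizer state and gradients) need to live in memory during training, the working set shrinks by roughly that ratio per adapted layer, which is what makes on-device fine-tuning plausible at all.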