Hugging Face Hub Library Reaches Version 1.0 After Five Years of Development

The huggingface_hub library has officially released version 1.0, marking five years of continuous development. The milestone signals the library's maturity and stability and establishes it as a core component of the Python ecosystem. It currently underpins some 200,000 dependent projects and provides access to over 2 million public models, 500,000 public datasets, and 1 million Space applications.
This update incorporates changes designed to support the open-source machine learning ecosystem for the next decade. Nearly 300 contributors and millions of users have driven its evolution. The developers recommend upgrading to v1.0 for improved performance and new features.
Key updates in this major version include the adoption of httpx as the new backend request library and a redesigned hf command-line tool. The new CLI, built with Typer, replaces the deprecated huggingface-cli and offers expanded functionality. Additionally, all file transfers have been migrated to hf_xet, phasing out the older hf_transfer tool.
While v1.0.0 is largely compatible with older versions, particularly for most machine learning libraries, the transformers library presents a notable exception. Version 4 of transformers depends on huggingface_hub v0.x, while the upcoming v5 will transition to v1.x.
Evolution of huggingface_hub
The huggingface_hub library originated from the idea of simplifying the sharing of machine learning models. In its early stages, sharing models often involved cumbersome methods like unstable Google Drive links, leading to resource duplication and inefficient collaboration. The Hugging Face Hub was created to address this by providing a platform for sharing and hosting model checkpoints compatible with the transformers library.
Initially, the Python logic for interacting with the Hub was embedded within the transformers library. In late 2020, huggingface_hub v0.0.1 was launched to separate this internal logic, creating a dedicated library for unified access and sharing of machine learning models and datasets on the Hugging Face Hub. The earliest version primarily functioned as a wrapper for Git operations, managing file downloads and repositories. Over five years and more than 35 version iterations, huggingface_hub has expanded significantly beyond its initial scope.
Foundational Development and API Expansion
Early versions of huggingface_hub established core functionalities. Version 0.0.8 introduced the first API, encapsulating Git commands for model repository interaction. Version 0.0.17 added token-based authentication, enabling access to private repositories and secure content uploads.
A significant shift occurred in June 2022 with version 0.8.1, which introduced the HTTP Commit API. This allowed users to upload files directly via HTTP without requiring Git and Git LFS installations. The create_commit() API simplified the upload process, particularly for large model files. This version also implemented a Git-aware caching mechanism, allowing all libraries built on huggingface_hub to share a common caching system with version control and file deduplication. This marked the transition from a Git tool for transformers to infrastructure for the broader machine learning ecosystem.
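The workflow enabled by the HTTP Commit API can be sketched as follows. This is a minimal illustration, not the library's documentation: the repo_id and file names are placeholders, and the network-touching calls are commented out.

```python
# Sketch of uploading files over HTTP with huggingface_hub -- no Git
# or Git LFS installation required. Repo and file names are placeholders.
from huggingface_hub import HfApi, CommitOperationAdd

api = HfApi()  # picks up the token saved by login() / `hf auth login`

# High-level single-file upload (commented out: requires network + auth):
# api.upload_file(
#     path_or_fileobj="model.safetensors",
#     path_in_repo="model.safetensors",
#     repo_id="my-username/my-model",
# )

# The lower-level create_commit() groups several operations into one commit:
ops = [
    CommitOperationAdd(path_in_repo="config.json", path_or_fileobj=b"{}"),
    CommitOperationAdd(path_in_repo="README.md", path_or_fileobj=b"# demo"),
]
# api.create_commit(repo_id="my-username/my-model", operations=ops,
#                   commit_message="Add config and model card")
print([op.path_in_repo for op in ops])
```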
As the Hugging Face Hub evolved into a comprehensive platform, huggingface_hub's API capabilities expanded to support various use cases. Core repository operations matured to include listing file trees, browsing references and commit histories, reading files, synchronizing folders, managing tags, branches, and release cycles, and querying repository metadata. The library also gained full programmatic management capabilities for Hugging Face Spaces, including hardware resource requests, environment configuration, secret management, and file uploads. Support for Inference Endpoints and the Jobs API further enhanced computing service capabilities.
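The repository operations above map onto methods of the HfApi client. A minimal sketch, with the network calls commented out so it runs offline; the repository name is only an example:

```python
from huggingface_hub import HfApi

api = HfApi()
# A few of the repository operations described above (commented out so
# the sketch runs without network access):
# files   = api.list_repo_files("openai-community/gpt2")    # file tree
# refs    = api.list_repo_refs("openai-community/gpt2")     # branches/tags
# commits = api.list_repo_commits("openai-community/gpt2")  # commit history
methods = ["list_repo_files", "list_repo_refs", "list_repo_commits",
           "create_tag", "create_branch", "delete_branch"]
print(all(hasattr(api, m) for m in methods))
```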
Community and social features were also integrated, enabling the creation and management of Pull Requests and comments, querying user and organization information, and managing repository likes, follows, and collections. User experience improvements included seamless authentication in Colab, enhanced reliability for large folder uploads, and resumable downloads.
Version v0.28.0 introduced the Inference Provider Ecosystem, allowing users to call multiple serverless inference platforms, such as Together AI, SambaNova, Replicate, Cerebras, and Groq, through a unified API with transparent routing and pay-per-request billing.
Xet Integration and Ecosystem Impact
Version v0.30.0 introduced Xet, a large-file storage protocol that deduplicates and transfers data at a 64KB block level. This means that when a large model or data file is updated, only the changed chunks are uploaded or downloaded. An initial large-scale migration moved over 500,000 repositories and 20PB of data to the Xet backend transparently, without interrupting existing workflows. A year later, over 6,000,000 repositories and 77PB of data have been migrated to Xet.
The huggingface_hub library records 113.5 million monthly downloads and over 1.6 billion cumulative downloads as of October 2025. It provides access to over 2 million public models, 500,000 public datasets, and 1 million public Spaces. The platform sees over 60,000 daily active users and over 550,000 monthly active users. More than 200,000 enterprises globally use the library.
huggingface_hub serves as a core dependency for over 200,000 GitHub repositories and 3,000 PyPI packages, integrating with frameworks such as Keras, LangChain, PaddleOCR, ChatTTS, YOLO, Google Generative AI, Moshi, NVIDIA NeMo, and Open Sora. Hugging Face's own ecosystem, including transformers, diffusers, datasets, and gradio, also relies on it.
Architectural Changes and Future Focus
Version 1.0 includes strategic updates to prepare huggingface_hub for future AI developments. A key architectural change is the migration of the underlying HTTP request library from requests to httpx. This upgrade provides native HTTP/2 support, full thread safety, and a unified synchronous and asynchronous interface. For developers using custom HTTP backends, a migration path is provided through set_client_factory() or set_async_client_factory(). The hf_xet binary toolkit is now the default for file uploads and downloads, completely replacing hf_transfer.
Version 0.32.0 introduced Model Context Protocol (MCP) integration and the tiny-agents toolchain, simplifying AI Agent development. MCPClient offers a standardized interface for AI Agents to interact with various tools, and the tiny-agents CLI tool allows launching Agents directly from the Hub. These features build upon the existing InferenceClient and its supported inference providers.
The Hugging Face CLI tool has evolved into a full-featured machine learning operations interface. The redesigned hf command replaces huggingface-cli with a "resource-action" pattern for tasks such as authentication (hf auth login), file management (hf download, hf upload), repository management (hf repo), cache management (hf cache ls, hf cache rm), and cloud computing tasks (hf jobs run). The CLI offers a sandboxed installer and supports command auto-completion.
Version 1.0 also addresses technical debt by removing older features and usages that hindered future development. The Git-based Repository class has been removed, with HTTP interfaces like upload_file() and create_commit() becoming the standard. HfFolder's token management has been replaced by explicit login(), logout(), and get_token() functions. The InferenceApi class has been superseded by the InferenceClient. The hf_transfer tool has been fully replaced by hf_xet. These changes were preceded by deprecation notices and migration guides.
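The token-management change is the most mechanical of these migrations: the removed HfFolder helpers map onto explicit module-level functions. A minimal sketch, with the call that would persist a token commented out:

```python
# Sketch of the HfFolder -> login()/logout()/get_token() migration.
from huggingface_hub import get_token, login, logout

# Before (removed):  HfFolder.save_token("hf_xxx"); HfFolder.get_token()
# After:
# login(token="hf_xxx")   # persist a token (interactive prompt if omitted)
token = get_token()       # read the saved token, or None if not logged in
# logout()                # remove the saved token again
print(token is None or isinstance(token, str))
```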
A migration guide provides step-by-step instructions for transitioning to v1.0. Backward compatibility has been maintained where possible; for instance, HfHubHTTPError inherits from both the requests and httpx HTTPError exception classes, so existing error-handling logic continues to function.
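That compatibility claim can be checked directly: handlers written against the requests-based versions keep matching, because the exception still subclasses requests.HTTPError.

```python
import requests
from huggingface_hub.errors import HfHubHTTPError

# try/except blocks written for the requests era keep working,
# since HfHubHTTPError still subclasses requests.HTTPError:
print(issubclass(HfHubHTTPError, requests.HTTPError))
```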
From v1.0 onward, development will focus on the new version, with older versions (v0.*) receiving only security patch updates.