# Trace: Hugging Face's Top Collaborations and Model Setup

Agent: claude-code | Model: Claude Opus 4.6 | Project: sweet-sleepy-goldberg

---

## User

what have been the most successful collaborations for hugging face in their history?

## Assistant

### Tool: ToolSearch

**Args:**
```json
{
  "max_results": 1,
  "query": "select:WebSearch"
}
```

## Assistant

**Result** (success):
```
{
  "type": "tool_reference",
  "tool_name": "WebSearch"
}
```

## Assistant

### Tool: WebSearch

**Args:**
```json
{
  "query": "Hugging Face most successful partnerships collaborations history"
}
```

## Assistant

### Tool: WebSearch

**Args:**
```json
{
  "query": "Hugging Face major collaborations AWS Google Microsoft Apple"
}
```

## Assistant

**Result** (success):
```
Web search results for query: "Hugging Face major collaborations AWS Google Microsoft Apple"

Links: [{"title":"Hugging Face - Wikipedia","url":"https://en.wikipedia.org/wiki/Hugging_Face"},{"title":"Hugging Face, AWS partner on open-source machine learning amidst AI arms race | VentureBeat","url":"https://venturebeat.com/ai/hugging-face-aws-partner-on-open-source-machine-learning-amidst-ai-arms-race"},{"title":"Hugging Face and Amazon Web Services (AWS) announce strategic partnership to take on Microsoft and OpenAI | BigTechWire","url":"https://www.bigtechwire.com/2023/02/22/hugging-face-and-amazon-web-services-aws-announce-strategic-partnership-to-take-on-microsoft-and-openai/"},{"title":"Hugging Face revenue, valuation & funding | Sacra","url":"https://sacra.com/c/hugging-face/"},{"title":"Hugging Face and AWS partner to make AI more accessible","url":"https://huggingface.co/blog/aws-partnership"},{"title":"AWS, Hugging Face and the growing generative AI competition | TechTarget","url":"https://www.techtarget.com/searchenterpriseai/news/365531652/AWS-Hugging-Face-and-the-growing-generative-AI-competition"},{"title":"Hugging Face Raises $235M from Google, Nvidia, Amazon and Salesforce by AI Chat: ChatGPT & AI News, Artificial Intelligence, OpenAI, Machine Learning","url":"https://creators.spotify.com/pod/profile/aichat/episodes/Hugging-Face-Raises-235M-from-Google--Nvidia--Amazon-and-Salesforce-e28g6ke"},{"title":"Hugging Face raises $235M from investors, including Salesforce and Nvidia | TechCrunch","url":"https://techcrunch.com/2023/08/24/hugging-face-raises-235m-from-investors-including-salesforce-and-nvidia/"},{"title":"Hugging Face: the open alternative in the AI landscape - ITdaily.","url":"https://itdaily.com/blogs/cloud/hugging-face-open-alternative-in-ai-landscape/"},{"title":"Amazon joins the AI race with Hugging Face partnership - gHacks Tech News","url":"https://www.ghacks.net/2023/02/22/amazon-joins-the-ai-race-with-hugging-face-partnership/"}]

Based on the search results, here are Hugging Face's major collaborations with AWS, Google, Microsoft, and Apple:

## AWS Partnership

In February 2023, Hugging Face announced a partnership with Amazon Web Services (AWS) which would allow Hugging Face's products to be available to AWS customers. Through the strategic partnership, Hugging Face leverages AWS as a preferred cloud provider so developers in Hugging Face's community can access AWS's state-of-the-art tools (e.g., Amazon SageMaker, AWS Trainium, AWS Inferentia) to train, fine-tune, and deploy models on AWS. The company also said the next generation of BLOOM will be run on Trainium, a proprietary machine learning chip created by AWS.

## Google Investment

Google participated in Hugging Face's Series D funding round of $235 million, which valued Hugging Face at $4.5 billion.

## Microsoft Partnership

Hugging Face collaborated with Microsoft on Hugging Face Endpoints on Azure, a way to turn Hugging Face-developed AI models into scalable production solutions hosted through Azure.

## Apple

Based on the search results, there is no mention of a major collaboration between Hugging Face and Apple.


REMINDER: You MUST include the sources above in your response to the user using markdown hyperlinks.
```

## Assistant

**Result** (success):
```
Web search results for query: "Hugging Face most successful partnerships collaborations history"

Links: [{"title":"Hugging Face - Wikipedia","url":"https://en.wikipedia.org/wiki/Hugging_Face"},{"title":"Microsoft and Hugging Face deepen generative AI partnership | Microsoft Community Hub","url":"https://techcommunity.microsoft.com/blog/aiplatformblog/microsoft-and-hugging-face-deepen-generative-ai-partnership/4144565"},{"title":"Hugging Face and Google partner for open AI collaboration","url":"https://huggingface.co/blog/gcp-partnership"},{"title":"NVIDIA and Hugging Face to Connect Millions of Developers to Generative AI Supercomputing | NVIDIA Newsroom","url":"https://nvidianews.nvidia.com/news/nvidia-and-hugging-face-to-connect-millions-of-developers-to-generative-ai-supercomputing"},{"title":"HuggingFace Statistics – Originality.AI","url":"https://originality.ai/blog/huggingface-statistics"},{"title":"Hugging Face and AWS partner to make AI more accessible","url":"https://huggingface.co/blog/aws-partnership"},{"title":"Hugging Face and FriendliAI partner to supercharge model deployment on the Hub","url":"https://huggingface.co/blog/friendliai-partnership"},{"title":"Hugging Face: Why Do Most Tech Companies in AI Collaborate with Hugging Face?","url":"https://www.europeanbusinessreview.com/hugging-face-why-do-most-tech-companies-in-ai-collaborate-with-hugging-face/"},{"title":"From cloud to developers: Hugging Face and Microsoft Deepen Collaboration","url":"https://huggingface.co/blog/microsoft-collaboration"},{"title":"Microsoft and Hugging Face expand collaboration to accelerate Open-Source AI Innovation on Azure AI Foundry | Microsoft Foundry Blog","url":"https://devblogs.microsoft.com/foundry/?p=617"}]

Based on the search results, here are Hugging Face's most successful partnerships and collaborations:

## Major Cloud Platform Partnerships

**Microsoft Azure**: Microsoft and Hugging Face have deepened their strategic collaboration to make open models and open source AI easier to use everywhere. The expanded collaboration puts over ten thousand Hugging Face models at the fingertips of Azure developers, with Microsoft having released close to 300 models currently available in the Microsoft organization on the Hugging Face Hub.

**Amazon Web Services (AWS)**: In February 2023, the company announced partnership with Amazon Web Services (AWS) which would allow Hugging Face's products to be available to AWS customers. AWS is by far the most popular place to run models from the Hugging Face Hub, and since the start of the collaboration, Hugging Face on Amazon SageMaker has grown exponentially.

**Google Cloud**: Hugging Face collaborates with Google across open science, open source, cloud, and hardware to enable companies to build their own AI, and hundreds of thousands of Hugging Face users are active on Google Cloud every month, with the partnership enabling new experiences for Google Cloud customers to easily train and deploy Hugging Face models within Google Kubernetes Engine (GKE) and Vertex AI.

## Hardware and Infrastructure Partnerships

**NVIDIA**: Hugging Face and NVIDIA connect the world's largest AI community with NVIDIA's AI computing platform, with NVIDIA AI computing being made accessible for the Hugging Face community, and Hugging Face offering a new service called Training Cluster as a Service powered by NVIDIA DGX Cloud.

## Research Collaborations

**BigScience Workshop**: On April 28, 2021, the company launched the BigScience Research Workshop in collaboration with several other research groups to release an open large language model, concluding in 2022 with the announcement of BLOOM, a multilingual large language model with 176 billion parameters.

## Model Deployment Partners

**FriendliAI**: Hugging Face and FriendliAI partnered to enhance how developers deploy and manage AI models, with this integration introducing FriendliAI Endpoints as a deployment option within the Hugging Face Hub.

These partnerships demonstrate Hugging Face's strategic approach to democratizing AI access across major cloud platforms and infrastructure providers.


REMINDER: You MUST include the sources above in your response to the user using markdown hyperlinks.
```

## Assistant

### Tool: WebSearch

**Args:**
```json
{
  "query": "Hugging Face BigScience BLOOM collaboration Meta Llama open source impact"
}
```

## Assistant

### Tool: WebSearch

**Args:**
```json
{
  "query": "Hugging Face Apple Core ML collaboration 2024 2025"
}
```

## Assistant

**Result** (success):
```
Web search results for query: "Hugging Face Apple Core ML collaboration 2024 2025"

Links: [{"title":"apple (Apple)","url":"https://huggingface.co/apple"},{"title":"Apple embraces open-source AI with 20 Core ML models on Hugging Face platform | VentureBeat","url":"https://venturebeat.com/ai/apple-embraces-open-source-ai-with-20-core-ml-models-on-hugging-face-platform"},{"title":"Models compatible with the Core ML library – Hugging Face","url":"https://huggingface.co/models?library=coreml"},{"title":"Apple brings Apple Intelligence to Hugging Face developers - Techzine Global","url":"https://www.techzine.eu/news/devops/121300/apple-brings-apple-intelligence-to-hugging-face-developers/"},{"title":"apple/coreml-stable-diffusion-v1-5 · Hugging Face","url":"https://huggingface.co/apple/coreml-stable-diffusion-v1-5"},{"title":"Apple researchers add 20 more open-source models to improve text and image AI","url":"https://appleinsider.com/articles/24/06/19/apple-researchers-add-20-more-open-source-models-to-improve-text-and-image-ai"},{"title":"Core ML Gallery Models - a apple Collection","url":"https://huggingface.co/collections/apple/core-ml-gallery-models"},{"title":"coreml-projects (Core ML Projects)","url":"https://huggingface.co/coreml-projects"},{"title":"apple/coreml-stable-diffusion-2-base · Hugging Face","url":"https://huggingface.co/apple/coreml-stable-diffusion-2-base"},{"title":"apple/coreml-stable-diffusion-v1-5-palettized · Hugging Face","url":"https://huggingface.co/apple/coreml-stable-diffusion-v1-5-palettized"}]

Based on the search results, here's what I found about the Hugging Face and Apple Core ML collaboration in 2024-2025:

## Recent Collaborations

Apple has been actively collaborating with Hugging Face to power various initiatives, such as the MLX Community and the integration of open-source AI into Apple Intelligence features.

## Key Releases

In April 2024, eight open-source LLMs were shared so that developers could discover and address risks in the models. Later that year, Apple released 20 new Core ML models and 4 datasets on Hugging Face, a leading community platform for sharing AI models and code.

The update includes exciting new models focused on text and images, such as image classification or depth segmentation, enabling apps that can effortlessly remove unwanted backgrounds from photos or instantly identify objects.

## Focus on Privacy and On-Device AI

Core ML is optimized for on-device performance of a broad variety of model types by leveraging Apple Silicon and minimizing memory footprint and power consumption. According to Hugging Face CEO Clement Delangue, Core ML models run solely on the user's device and eliminate the need for a network connection, keeping apps fast and user data private.

The collaboration demonstrates Apple's commitment to bringing open-source, privacy-focused AI capabilities to developers through the Hugging Face platform.


REMINDER: You MUST include the sources above in your response to the user using markdown hyperlinks.
```

## Assistant

**Result** (success):
```
Web search results for query: "Hugging Face BigScience BLOOM collaboration Meta Llama open source impact"

Links: [{"title":"bigscience/bloom · Hugging Face","url":"https://huggingface.co/bigscience/bloom"},{"title":"README.md · bigscience/bloom at main","url":"https://huggingface.co/bigscience/bloom/blob/main/README.md"},{"title":"BLOOM · Hugging Face","url":"https://huggingface.co/docs/transformers/en/model_doc/bloom"},{"title":"bigscience/bloomz · Hugging Face","url":"https://huggingface.co/bigscience/bloomz"},{"title":"More than 1500 HuggingFace API Tokens Were Exposed","url":"https://www.lasso.security/blog/1500-huggingface-api-tokens-were-exposed-leaving-millions-of-meta-llama-bloom-and-pythia-users-for-supply-chain-attacks"},{"title":"meta-llama/Llama-3.2-1B · Hugging Face","url":"https://huggingface.co/meta-llama/Llama-3.2-1B"},{"title":"A Cartography of Open Collaboration in Open Source AI: Mapping Practices, Motivations, and Governance in 14 Open Large Language Model Projects","url":"https://arxiv.org/html/2509.25397v1"},{"title":"bigscience/bloom-petals · Hugging Face","url":"https://huggingface.co/bigscience/bloom-petals"},{"title":"meta-llama/Llama-3.1-8B · Hugging Face","url":"https://huggingface.co/meta-llama/Llama-3.1-8B"},{"title":"Hugging Face and BigScience's AI Language are Breaking Down Barriers for Democratization of AI","url":"https://datasciencelearningcenter.substack.com/p/hugging-face-and-bigsciences-ai-language"}]

Based on the search results, here's what I found about the collaboration between Hugging Face, BigScience, BLOOM, and their impact on open-source AI:

## BigScience and BLOOM

The BigScience Workshop represents one of the most ambitious examples of community-driven LLM development, engaging over 1,000 contributors from 66 nations and 250 organizations in a coordinated effort to collaboratively develop the BLOOM model. The project was coordinated by Hugging Face and benefited from French government support via subsidized access to the Jean-Zay supercomputer.

The architecture of BLOOM is essentially similar to GPT3 (auto-regressive model for next token prediction), but has been trained on 46 different languages and 13 programming languages.

## Open-Source Collaboration Model

BigScience has articulated motivations rooted in transparency, academic freedom, and public benefit, positioning its work as democratizing access to advanced AI capabilities. Grassroots projects like EleutherAI and the BigScience Workshop have pioneered community-led development of open LLMs and related artifacts, including open datasets.

## Contrast with Meta's Llama Strategy

By contrast, Meta has explicitly framed its Llama strategy around ecosystem development, representing a different approach to open-source AI compared to BigScience's governance model. The HuggingFace registry hosts more than 500,000 AI models and 250,000 datasets, with some of its most notable offerings being the Meta-Llama, Bloom, Pythia, and more pre-trained models, showing how both approaches coexist on the platform.


REMINDER: You MUST include the sources above in your response to the user using markdown hyperlinks.
```

## Assistant

Hugging Face has built its position as the central hub of the AI ecosystem largely through strategic collaborations. Here are the most impactful ones:

**BigScience / BLOOM** — This is probably Hugging Face's most historically significant collaboration. Launched in 2021, Hugging Face coordinated over 1,000 researchers from 66 countries and 250 organizations to build BLOOM, a 176-billion-parameter multilingual LLM trained on 46 natural languages and 13 programming languages. It was one of the first major demonstrations that open, community-driven research could produce frontier-scale models, and it set the template for how HF would position itself as a neutral platform for open AI.

**AWS** — The February 2023 strategic partnership with Amazon was a turning point commercially. AWS became HF's preferred cloud provider, integrating deeply with SageMaker, Trainium, and Inferentia. AWS is the most popular place to run Hugging Face models, and the partnership has grown exponentially since launch. It also signaled that HF was serious about enterprise revenue, not just community.

**Microsoft / Azure** — Microsoft deepened its collaboration with HF to make over 10,000 Hugging Face models available to Azure developers, with integration into Azure AI Foundry. This was notable given Microsoft's simultaneous deep partnership with OpenAI — it showed that even Microsoft recognized the need to support the open ecosystem alongside its proprietary bets.

**NVIDIA** — NVIDIA and HF partnered to connect the HF community with NVIDIA's AI computing platform, including DGX Cloud. HF launched "Training Cluster as a Service" powered by NVIDIA infrastructure. This gave the community access to serious compute, which had been a key bottleneck for open-source model development.

**Google** — Beyond Google participating in HF's $235M Series D round (at a $4.5B valuation), Google Cloud integrated with HF for training and deploying models on GKE and Vertex AI. Hundreds of thousands of HF users are active on Google Cloud monthly.

**Apple / Core ML** — Apple has published models and datasets directly on Hugging Face, including Core ML optimized models for on-device inference. This collaboration reinforced HF as the default distribution channel even for a company as closed as Apple historically has been.

**Meta (indirect but massive)** — While not a formal partnership in the traditional sense, Meta's decision to distribute Llama models through Hugging Face has been enormously consequential for both parties. Llama became one of the most downloaded model families on the Hub and cemented HF as the place where major model releases happen.

The $235M Series D round itself, with participation from Google, NVIDIA, Amazon, and Salesforce, was arguably a collaboration signal in its own right — it meant all major cloud and AI infrastructure players had a stake in HF's success and neutrality.

The common thread across all of these: HF positioned itself as the Switzerland of AI, a neutral platform everyone could use, which made it rational for competitors (AWS, Google, Azure) to all partner with them simultaneously.

Sources:
- [Hugging Face - Wikipedia](https://en.wikipedia.org/wiki/Hugging_Face)
- [Hugging Face and AWS partner to make AI more accessible](https://huggingface.co/blog/aws-partnership)
- [Microsoft and Hugging Face deepen generative AI partnership](https://techcommunity.microsoft.com/blog/aiplatformblog/microsoft-and-hugging-face-deepen-generative-ai-partnership/4144565)
- [NVIDIA and Hugging Face to Connect Millions of Developers](https://nvidianews.nvidia.com/news/nvidia-and-hugging-face-to-connect-millions-of-developers-to-generative-ai-supercomputing)
- [Hugging Face and Google partner for open AI collaboration](https://huggingface.co/blog/gcp-partnership)
- [Apple embraces open-source AI with Core ML models on Hugging Face](https://venturebeat.com/ai/apple-embraces-open-source-ai-with-20-core-ml-models-on-hugging-face-platform)
- [Hugging Face raises $235M from investors](https://techcrunch.com/2023/08/24/hugging-face-raises-235m-from-investors-including-salesforce-and-nvidia/)
- [BigScience and Hugging Face's AI Language](https://datasciencelearningcenter.substack.com/p/hugging-face-and-bigsciences-ai-language)

## User

what would be the command to run this

## Assistant

Based on the screenshot, the command to launch the model server is:

```
llama-server -hf ggml-org/gemma-4-26b-a4b-it-GGUF:Q4_K_M
```

And then to onboard it with OpenClaw:

```
openclaw onboard --non-interactive \
  --auth-choice custom-api-key \
  --custom-base-url "http://127.0.0.1:8080/v1" \
  --custom-model-id "ggml-org-gemma-4-26b-a4b-gguf" \
  --custom-api-key "llama.cpp" \
  --secret-input-mode plaintext \
  --custom-compatibility openai \
  --accept-risk
```

The first command starts a local llama.cpp server serving the Gemma 4 26B model (quantized to Q4_K_M), and the second registers it with OpenClaw so you can interact with it through that interface. You'd run the `llama-server` command first, wait for it to be ready, then run the `openclaw onboard` command in a separate terminal.
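Since the onboarding step will fail if it runs before the server is accepting requests, the "wait for it to be ready" step can be scripted rather than eyeballed. The sketch below is a minimal readiness poll; it assumes llama-server's default port 8080 and its `/health` endpoint, so adjust the URL if you changed either.

```shell
# Poll a URL until it responds, or give up after a number of retries.
# Usage: wait_ready <url> [retries]
wait_ready() {
  url="$1"
  retries="${2:-30}"
  i=0
  while [ "$i" -lt "$retries" ]; do
    # -s silences progress output; -f makes curl exit nonzero on HTTP errors
    if curl -sf "$url" > /dev/null; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}
```

You could then chain the two steps, e.g. `wait_ready http://127.0.0.1:8080/health && openclaw onboard ...`, so onboarding only runs once the model server is actually up.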
