GitHub - mlc-ai/web-llm: High-performance In-browser LLM Inference Engine

11/05/2024

Listen "GitHub - mlc-ai/web-llm: High-performance In-browser LLM Inference Engine"

Episode Synopsis

https://github.com/mlc-ai/web-llm

High-performance in-browser LLM inference engine. Contribute to mlc-ai/web-llm development by creating an account on GitHub.
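For context, WebLLM runs model inference directly in the browser on WebGPU and exposes an OpenAI-style chat API. Below is a minimal usage sketch in TypeScript, assuming the @mlc-ai/web-llm npm package; the model ID "Llama-3.1-8B-Instruct-q4f32_1-MLC" is an assumption and may differ from the currently published model list.

    // Minimal sketch: local, in-browser chat completion with WebLLM.
    import { CreateMLCEngine } from "@mlc-ai/web-llm";

    async function main() {
      // Model ID is an assumption; consult the WebLLM model list for current identifiers.
      const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
        // Reports weight download and shader compilation progress.
        initProgressCallback: (report) => console.log(report.text),
      });

      // OpenAI-style chat completion, executed locally via WebGPU.
      const reply = await engine.chat.completions.create({
        messages: [{ role: "user", content: "Summarize what WebLLM does in one sentence." }],
      });
      console.log(reply.choices[0].message.content);
    }

    main();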
