I can't believe this... Phi-3.5-mini (3.8B) running in-browser at ~90 tokens/second on WebGPU w/ Transformers.js and ONNX Runtime Web! 🤯 Since everything runs 100% locally, no messages are sent to a server — a huge win for privacy!
- 🤗 Demo: webml-community/phi-3.5-webgpu
- 🧑‍💻 Source code: https://github.com/huggingface/transformers.js-examples/tree/main/phi-3.5-webgpu
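For anyone curious what this looks like in code, here is a minimal sketch of running a chat model on WebGPU with the Transformers.js pipeline API. The model id and generation settings are assumptions for illustration and may differ from what the demo actually uses:

```js
// Minimal sketch: in-browser text generation on WebGPU with Transformers.js.
// The model id below is an assumption for illustration; the demo may use a different one.
import { pipeline } from "@huggingface/transformers";

// Load a Phi-3.5-mini ONNX build and run it on the WebGPU backend.
const generator = await pipeline(
  "text-generation",
  "onnx-community/Phi-3.5-mini-instruct-onnx-web", // assumed model id
  { device: "webgpu" },
);

// Chat-style prompt using the messages format supported by the pipeline.
const messages = [
  { role: "user", content: "Explain WebGPU in one sentence." },
];
const output = await generator(messages, { max_new_tokens: 64 });
console.log(output[0].generated_text.at(-1).content);
```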
I'm excited to announce that Transformers.js V3 is finally available on NPM! 🔥 State-of-the-art Machine Learning for the web, now with WebGPU support! 🤯⚡️
Install it from NPM with:
npm i @huggingface/transformers
or via CDN, for example: https://v2.scrimba.com/s0lmm0qh1q
Segment Anything demo: webml-community/segment-anything-webgpu
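To show what usage looks like once the package is installed or pulled from a CDN, here is a minimal sketch using the pipeline API; the jsdelivr URL and the embedding model id are assumptions for illustration:

```js
// Minimal sketch: loading Transformers.js v3 from a CDN via an ES module import.
// The jsdelivr URL and model id are assumptions for illustration.
import { pipeline } from "https://cdn.jsdelivr.net/npm/@huggingface/transformers";

// Compute sentence embeddings entirely in the browser.
const extractor = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");
const embeddings = await extractor(
  ["WebGPU is fast", "Transformers.js runs in the browser"],
  { pooling: "mean", normalize: true },
);
console.log(embeddings.dims); // e.g. [2, 384]
```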
Transformers.js demos: a collection of my favorite WebML demos, built with Transformers.js!
- 🎤 Whisper Web
- 🖼️ Remove Background Web: in-browser background removal
- 📝 The Tokenizer Playground: experiment with and compare different tokenizers
- 🖼️ Depth Anything Web
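As a taste of what powers a demo like Whisper Web, here is a minimal sketch of in-browser speech recognition with Transformers.js; the model id is an assumption and the audio URL is a placeholder for illustration:

```js
// Minimal sketch: in-browser speech recognition, similar in spirit to Whisper Web.
// The model id is assumed and the audio URL is a placeholder.
import { pipeline } from "@huggingface/transformers";

const transcriber = await pipeline(
  "automatic-speech-recognition",
  "Xenova/whisper-tiny.en", // assumed small English-only Whisper checkpoint
);

// Any audio URL (or Float32Array of PCM samples) can be passed in.
const result = await transcriber("https://example.com/sample.wav");
console.log(result.text);
```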
Xenova/distilbert-base-uncased-finetuned-sst-2-english (Text Classification)
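A minimal sketch of using this model for sentiment analysis with the Transformers.js pipeline API; the example text and the shape of the printed output are for illustration only:

```js
// Minimal sketch: sentiment analysis with the model listed above.
import { pipeline } from "@huggingface/transformers";

const classifier = await pipeline(
  "text-classification",
  "Xenova/distilbert-base-uncased-finetuned-sst-2-english",
);

// Prints something like [{ label: 'POSITIVE', score: 0.99 }]
console.log(await classifier("I love Transformers.js!"));
```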