Watch the video:
Timestamps:
0:00 - Intro/Explanation
0:40 - What is MPT-7B
1:26 - 65,000+ tokens (Understanding pretty much a whole book!)
3:06 - Requirements
4:11 - One-line install command
7:53 - CPU-only mode (probably the option for most users!)
8:32 - Fix crashes by lowering core count for Ooba
9:25 - Testing MPT-7B Chat
9:45 - Testing MPT-7B StoryWriter (65k+ tokens!)
Want to try out the new MPT-7B models, including the 65k+ token StoryWriter plus the Instruct and Chat variants? This video includes a simple one-line install command you can run right now to try them out on your PC! These models have crazy requirements, but nothing out of reach: you’ll need either a lot of RAM (~32GB+) for CPU mode, or a GPU with 16GB+ of VRAM (such as an RTX 3090/4090) for GPU mode. The installer lets you pick between the two when it runs.
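Not sure if your PC qualifies? Here’s a quick way to check your total RAM from PowerShell (a minimal sketch assuming Windows, since the installer is a PowerShell one-liner):

# Total physical RAM in GB - you want roughly 32GB+ for CPU mode
(Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB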
It’s a fun tech-demo of a model, and a glimpse of the crazy things coming in the future!
MosaicML: https://huggingface.co/mosaicml
More about MPT-7B: https://www.mosaicml.com/blog/mpt-7b
One-line install command: iex (irm mpt.tc.ht)
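For the curious: irm is PowerShell’s alias for Invoke-RestMethod (it downloads the script hosted at mpt.tc.ht), and iex is the alias for Invoke-Expression (it runs whatever was downloaded). If you’d rather read the script before running it, here’s a more cautious equivalent (the filename mpt-install.ps1 is just a placeholder, not part of the installer):

# Download the installer, inspect it, then run it yourself
irm mpt.tc.ht -OutFile mpt-install.ps1     # download the script instead of piping it straight to iex
Get-Content .\mpt-install.ps1              # review what it’s about to do
Set-ExecutionPolicy Bypass -Scope Process  # allow local scripts for this session only (if needed)
.\mpt-install.ps1                          # run the installer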