Daily

All links from one day on a single page.

May 6, 2023

Nick Cave - Why the fuck are you going to the King's coronation?

So, with all that in mind, I am looking forward to going to the Coronation. I think I’ll wear a suit. Much love, Nick

Stutterer - 2016 Oscar-Winning Short

From “The Screening Room” at The New Yorker.

What is a Vector Database?

Good basic explainer on vector databases.
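The core idea behind a vector database is nearest-neighbor search over embeddings: store items as vectors, then rank them by similarity to a query vector. A minimal sketch of that idea, using a toy brute-force store with cosine similarity (all names here are invented for illustration; real systems use approximate indexes like HNSW for scale):

```python
from math import sqrt

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

class TinyVectorStore:
    """Toy in-memory vector store: brute-force nearest-neighbor search."""

    def __init__(self):
        self.items = []  # list of (id, vector) pairs

    def add(self, item_id, vector):
        self.items.append((item_id, vector))

    def query(self, vector, k=3):
        # Score every stored vector against the query, highest similarity first.
        scored = [(cosine_similarity(vector, v), item_id) for item_id, v in self.items]
        scored.sort(reverse=True)
        return [item_id for _, item_id in scored[:k]]

store = TinyVectorStore()
store.add("cat", [0.9, 0.1, 0.0])
store.add("dog", [0.8, 0.2, 0.1])
store.add("car", [0.0, 0.1, 0.9])
print(store.query([0.85, 0.15, 0.05], k=2))  # the two vectors closest to the query
```

Production vector databases replace the linear scan with approximate-nearest-neighbor indexes, but the query interface — embed, search, return top-k IDs — is essentially this.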

Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Starting today, you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!

MosaicBERT: Pretraining BERT from Scratch for $20

With the MosaicBERT architecture + training recipe, you can now pretrain a competitive BERT-Base model from scratch on the MosaicML platform for $20. We’ve released the pretraining and finetuning code, as well as the pretrained weights.

Bank failures / Mike Bostock | Observable

Note: Only shows FDIC-reported bank failures; does not include investment banks and non-U.S. banks. Data: FDIC, FRED