1/
Embeddings are the beating heart of modern AI, powering RAG and serving as memory for agentic systems. But a new paper [1] shows a hard ceiling:
1. Dot-product retrieval is bounded by the embedding dimension d: if the relevance matrix has sign-rank r, then d >= r is required, and no amount of training can get around it.
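To make the bound concrete, here is a toy sketch (my own illustration, not an example from the paper; the `top1` helper is hypothetical): with d = 1, the top-scoring document for any query is always either the max- or min-valued document, so three queries that each need a *different* top-1 document are unrealizable, while three unit vectors 120 degrees apart realize the same relevance pattern exactly at d = 2.

```python
import math
import random

def top1(q, docs):
    # Dot-product retrieval: index of the highest-scoring document.
    scores = [sum(qi * xi for qi, xi in zip(q, x)) for x in docs]
    return max(range(len(docs)), key=lambda i: scores[i])

# Target relevance: query i's sole relevant doc is doc i (a 3x3 identity).
# d = 1: random search never satisfies all three queries, because with
# scalar embeddings top-1 is always the doc with the max value (q > 0)
# or the min value (q < 0) -- the "middle" doc can never win.
random.seed(0)
found_1d = False
for _ in range(20000):
    docs = [[random.uniform(-1, 1)] for _ in range(3)]
    qs = [[random.uniform(-1, 1)] for _ in range(3)]
    if all(top1(qs[i], docs) == i for i in range(3)):
        found_1d = True
        break

# d = 2: queries = docs = unit vectors 120 degrees apart, so that
# q_i . x_i = 1 while q_i . x_j = -0.5 for i != j.
vecs = [[math.cos(2 * math.pi * i / 3), math.sin(2 * math.pi * i / 3)]
        for i in range(3)]
found_2d = all(top1(vecs[i], vecs) == i for i in range(3))

print(found_1d, found_2d)  # → False True
```

No amount of search helps at d = 1; one extra dimension makes the pattern trivial. The paper's point is that real corpora induce relevance matrices whose sign-rank exceeds any fixed d.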