Flabubium
Project ELA
A Flabubium research tool

Ela

Any paper.
Fully understood.

Drop in a research paper. Ela finds every concept it expects you to know and builds a course to fill the gaps. Visual, interactive, and built for that paper specifically.

Proof of Concept
01
The Problem

Reading research is hard work.

Research papers do not stay in one field. FlashAttention assumes you know GPU memory architecture. AlphaFold 2 assumes protein chemistry, multiple sequence alignment, and transformer internals. None of it is explained in the paper. It is just expected.

So reading a paper you care about takes weeks of detours through lectures, textbooks, and blog posts, before you even reach the ideas you came for.

02
What already exists

Summaries are not the same as understanding.

The tools people reach for today either tell you what a paper says or explain concepts in the abstract. Neither gets you to the point where you can think about the research.

Text summarizers
ChatGPT, NotebookLM

They tell you what the paper says. They do not help you understand the concepts it is built on unless you explicitly ask, and even then we found the output lacks the structure that helps a reader build intuition and learn effectively.

Generic courses
YouTube, Coursera, Textbooks

There are excellent resources on YouTube and Coursera that can take you from the basics all the way to advanced material. But they rarely cover topics that cross fields unless those topics are popular. That is the gap we aim to fill.

Ela
Paper-specific and visual

Ela reads the actual paper, finds every concept it assumes you know, and builds a course around exactly those gaps, with diagrams and interactive modules made for that specific paper.

03
The Product

How Ela works.

i.

Drop in a paper

Paste an arXiv link or upload a PDF. Ela reads the full text, figures, and appendices.

ii.

Ela maps the gaps

It finds every concept the paper expects you to know, traces their dependencies, and identifies what is missing from your background.

iii.

A course, built for this paper

A sequence of visual modules from first principles to the paper's ideas. Nothing generic. Nothing you do not need.

04
See it in action

Two papers, fully mapped.

Machine Learning Systems
FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
arXiv:2205.14135
Venue: NeurIPS
Year: 2022
Citations: 4,800+

We propose FlashAttention, an IO-aware exact attention algorithm that uses tiling to reduce the number of memory reads/writes between GPU high bandwidth memory (HBM) and on-chip SRAM. We prove that FlashAttention requires O(N²d²/M) HBM reads/writes versus Θ(Nd + N²) for standard attention, where N is sequence length, d is head dimension, and M is SRAM size.

Early access

Get early access
to Ela.

We are building Ela now. Leave your email and we will reach out when it is ready.