gwern 16 hours ago

This would be a perfect fit for an LLM (albeit a bit complicated by byte-pair tokenization and decoder models rather than bidirectional). You can clamp the grille words and just generate samples with those as requirements until they reach a good likelihood and are highly fluent and indistinguishable from non-message-carrying text.

  • eru 15 hours ago

    > (albeit a bit complicated by byte-pair tokenization and decoder models rather than bidirectional)

    You can clamp tokens (instead of letters / words) in the grille, I guess?
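
    A rough sketch of that clamp-and-sample loop, assuming a HuggingFace GPT-2 causal LM; the grille offsets, forced tokens, prompt, and best-of-8 likelihood scoring are all illustrative choices, not a reference implementation:

      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tok = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")
      model.eval()

      # Grille: offset into the generated text -> forced token id.
      # Single-token payload words sidestep the BPE issue here; a multi-token
      # word would need one clamped slot per piece.
      grille = {4: " attack", 11: " dawn"}
      clamped = {off: tok.encode(word)[0] for off, word in grille.items()}

      def sample_cover(prompt, length=24, temperature=0.9):
          """Decode left to right, forcing grille slots and sampling the rest."""
          ids = tok.encode(prompt, return_tensors="pt")[0]
          for step in range(length):
              if step in clamped:                    # clamped grille slot: token is forced
                  next_id = torch.tensor([clamped[step]])
              else:                                  # free slot: sample from the LM
                  with torch.no_grad():
                      logits = model(ids.unsqueeze(0)).logits[0, -1] / temperature
                  next_id = torch.multinomial(torch.softmax(logits, -1), 1)
              ids = torch.cat([ids, next_id])
          return ids

      def log_likelihood(ids):
          """Total log-probability of the sequence under the same model."""
          with torch.no_grad():
              logits = model(ids.unsqueeze(0)).logits[0]
          logp = torch.log_softmax(logits[:-1], -1)
          return logp.gather(1, ids[1:].unsqueeze(1)).sum().item()

      # Keep generating until a draft is fluent enough: here, best of 8 by likelihood.
      drafts = [sample_cover("Weather for the valley this week:") for _ in range(8)]
      print(tok.decode(max(drafts, key=log_likelihood)))

    The same loop could work at the word level of the original grille by rejecting drafts whose clamped slots don't decode to whole words, which is where the byte-pair complication mentioned above comes back in.
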

jhoechtl 13 hours ago

Even more impressive is reading about the life of Cardano himself.

janderson215 17 hours ago

“Cars…

Cars on…

Carson City.” -John Cusack in Con Air