This would be a perfect fit for an LLM (albeit a bit complicated by byte-pair tokenization and by decoder-only rather than bidirectional models). You can clamp the grille words and just generate samples with those as constraints until one reaches a good likelihood and is highly fluent, indistinguishable from non-message-carrying text.
That reminds me of this suckerpinch/tom7 video: https://www.youtube.com/watch?v=Y65FRxE7uMc
> (albeit a bit complicated by byte-pair tokenization and by decoder-only rather than bidirectional models)
You can clamp tokens (instead of letters / words) in the grille, I guess? Roughly like the sketch below.
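Not anyone's actual implementation, just a minimal sketch of token-level clamping with rejection sampling, assuming GPT-2 via Hugging Face transformers. The clamp positions and payload words (" secret", " plan") are made up for illustration; real grille words may span several BPE tokens, which this ignores by taking the first piece.

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tok = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    def sample_with_clamps(clamps, length=40, temperature=0.9):
        # Autoregressively sample `length` tokens, forcing clamps[pos]
        # wherever the grille dictates a slot. (No KV cache; fine for a sketch.)
        ids = [tok.bos_token_id]
        for pos in range(length):
            if pos in clamps:
                ids.append(clamps[pos])  # grille slot: token is forced
                continue
            with torch.no_grad():
                logits = model(torch.tensor([ids])).logits[0, -1] / temperature
            ids.append(torch.multinomial(torch.softmax(logits, -1), 1).item())
        return ids

    def avg_logprob(ids):
        # Mean per-token log-likelihood under the model: a crude fluency score
        # for deciding which sample to keep.
        with torch.no_grad():
            logits = model(torch.tensor([ids])).logits
        logp = torch.log_softmax(logits[0, :-1], -1)
        targets = torch.tensor(ids[1:])
        return logp[torch.arange(len(targets)), targets].mean().item()

    # Hypothetical payload: clamp the first BPE token of " secret" and
    # " plan" at positions 5 and 12 (multi-token words would need one
    # slot per piece), then keep the most fluent of 20 rejection samples.
    clamps = {5: tok.encode(" secret")[0], 12: tok.encode(" plan")[0]}
    best = max((sample_with_clamps(clamps) for _ in range(20)), key=avg_logprob)
    print(tok.decode(best))

Instead of keeping the single best of a fixed batch, you could also loop until avg_logprob clears some threshold, which is closer to the "generate until good likelihood" idea above.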
Even more impressive is reading about the life of Cardano himself.
“Cars…
Cars on…
Carson City.” -John Cusack in Con Air