In 2014, Pietu1998 asked: why do we need both 0s and 1s? Why can't we pick one symbol and write all code with that? His language, called Lenguage, showed that this is possible. I recently made it a centerpiece of my paper on non-programmable programming languages, Language Without Code. Named for the LEN() function, which returns the length of a string in a number of different languages, Lenguage looks only at the total size of the program, regardless of its content. That single number, written in binary, is then broken down into individual brainfuck commands by splitting it into groups of three bits and translating each:
000 to +
001 to -
etc. (the rest of the mapping is on the esolangs wiki)
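As a rough sketch, the decoding step can be expressed in a few lines of Python. The text above gives only the first two table entries; the rest of the mapping here is reproduced from memory of the esolangs wiki, and I'm assuming bits are read left to right with leading-zero padding, so verify both against the wiki:

```python
# Mapping of 3-bit groups to brainfuck commands. Only 000 and 001 are
# given in the text above; the remaining entries are assumed from the
# esolangs wiki and should be double-checked there.
CMDS = {"000": "+", "001": "-", "010": ">", "011": "<",
        "100": ".", "101": ",", "110": "[", "111": "]"}

def lenguage_to_brainfuck(source: str) -> str:
    n = len(source)                      # only the length matters
    bits = format(n, "b")                # the length as a binary number
    bits = "0" * ((-len(bits)) % 3) + bits   # pad to a multiple of 3 bits
    return "".join(CMDS[bits[i:i + 3]] for i in range(0, len(bits), 3))
```

For example, any eight-character file has length 8, or 001000 in padded binary, which decodes to the two commands `-+`.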
If we take any program as a series of 1s and 0s, we can conceptualize it as a single number: simply treat the entire program as a very long integer. In Lenguage, it’s not that number that makes up the program, but the size of that number. This makes the actual content of the program irrelevant; any piece can be swapped for other content, so long as the total length doesn’t change. Lenguage asks for all 1s, but this number could just as well be expressed as a pile of stones, the length of a line, or anything else that’s countable.
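Going the other way, we can sketch how large a file a given brainfuck program demands. This uses the same assumed mapping from the esolangs wiki (only the first two entries appear in the text above), with three bits of length per command:

```python
# Inverse of the 3-bit table assumed from the esolangs wiki -- verify there.
BITS = {"+": "000", "-": "001", ">": "010", "<": "011",
        ".": "100", ",": "101", "[": "110", "]": "111"}

def required_length(bf: str) -> int:
    # Concatenate 3 bits per command and read the result as one number.
    # Note a quirk: leading "+" commands encode to leading zeros, which
    # vanish when the bits are read as a number.
    return int("".join(BITS[c] for c in bf), 2)

# The content is irrelevant; any string of the right length will do:
program = "1" * required_length("-+")   # "-+" -> 001000 -> a length of 8
```

Since the length grows as roughly 2 to the power of (3 × number of commands), every added command multiplies the required file size by about eight.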
While Lenguage is perfectly valid and well-formed, there are some practical issues with programming in it. Even storing one bit per atom, the program to write "Hello, World!" to the screen would require more atoms than exist in the known Universe. However, we can borrow from the brainfuck derivative Spoon, which Huffman-encodes the brainfuck command set, creating a more concise expression of the language. Then Hello World could be stored in a single solar-mass black hole.
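A quick back-of-the-envelope check of that scale, assuming a golfed brainfuck Hello World of roughly 106 commands (the exact count varies by version) and the commonly cited rough figure of 10^80 atoms in the observable Universe:

```python
# Sanity check: how long must a Lenguage "Hello, World!" be?
commands = 106                   # assumed size of a golfed brainfuck version
bits = commands * 3              # Lenguage spends 3 bits of length per command
min_length = 2 ** (bits - 1)     # smallest number with that many binary digits
atoms = 10 ** 80                 # rough estimate of atoms in the Universe
shortfall = min_length // atoms  # how many Universes of atoms we'd need
```

The program would need on the order of 10^95 symbols, some fifteen orders of magnitude more than there are atoms to write them on.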
Here is the full content of that program: