My advisor in grad school had me implement a "typo distance" metric on strings once (how many single-key displacements for a typist using home-row touch typing to get from string A to string B), which seemed kind of cool. I never did find out what, if anything, she wanted to use it for.
I remember the time when Python was the underdog and most AI/ML code was written in Matlab or Lua (Torch). People would roll their eyes when you told them you were doing deep learning in Python (Theano).
That's a very tenuous analogy. Microcontrollers are circuits that were designed. LLMs are circuits that were learned from vast amounts of data scraped from the internet, and from pirated e-books[1][2][3].
Apple has an Apple Pay for Donations[1] program, which doesn't apply to rent-seeking entities like Patreon. I wonder whether Patreon's 10% fee is commensurate with the negligible value it provides?
gpt-5.2-codex xhigh with OpenAI Codex on the $20/month plan got to 1526 cycles with OP's prompt for me. Meanwhile, Claude Code with Opus 4.5 on the team premium plan ($150/month) gave up with a bunch of contrived excuses at 3433 cycles.
[1]: https://www.intel.com/content/www/us/en/developer/articles/t...