LLMs, Emergence, and Programming as Gardening

2023.08.07
A great summary of where we are with LLMs and how we got here.
(I love deconstructed PowerPoint presentations like this; so much more skimmable than the full-on video.)

I think the most interesting sentence is:
"The fascinating thing is that capabilities of these models emerge at certain sizes and nobody knows why."
I help lead a "Science + Spirituality" group at a local UU church, and one term that comes up among people looking for meaning in our physical world (meaning that isn't bestowed from "outside the system") is "emergence". As systems connect, new behaviors show up that you couldn't have predicted by just looking at the lower levels: atomic physics becomes chemistry becomes biochemistry becomes neurochemistry becomes psychology, but you can't really do much psychology in terms of atoms. Still, we can find meaning in the emerged, shared experiences all humans go through.

And it's funny; I think one of the most important dichotomies in human understanding is holism vs. reductionism. The psychiatrist Iain McGilchrist thinks it's rooted in the two-hemisphere model of the brain, but is a split in approach that scales all the way up to the societal level; Robert M. Pirsig's "Zen and the Art of Motorcycle Maintenance" sees it as "classical" vs. "romantic" thinking (and finds their resolution in Taoism). As a programmer, I think reductionism has been on the rise for the past few decades (for example, a focus on low-level unit testing over functional and integration testing), but that will be challenged as the industry integrates LLMs more and more into its workflow.

This also all reminds me of "A-Life", which was really big a while back: artificial life simulations, often where small rules were established and then allowed to run rampant and in parallel, Conway's "Game of Life" being the Ur-example. I took a class on it at Tufts' Experimental College. One thing my instructor Jeffrey Ventrella (his website ventrella.com has lots of cool stuff) said was that in the future, programming would look less like regular engineering and more like gardening. At the time I could only see that in terms of having a human be the selective, weeding force in evolutionary processes, but now it seems like a pretty good metaphor for the "as much art as science" intuitive skill prompt engineering is right now, like the link I started with talks about: you sort of know the results you want and have a basic idea of how to get there, but it's still full of surprises, and you never know where an ugly weed of a hallucination is going to show up.
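The Game of Life is a nice concrete taste of emergence: the rules fit in two lines of logic, yet gliders, oscillators, and even Turing-complete machines arise from them. A minimal sketch in Python (the set-of-live-cells representation is my own choice here, not anything from the article I linked):

```python
# Conway's Game of Life: live cells are a set of (x, y) coordinates;
# everything else on the infinite grid is dead.
from collections import Counter

def step(live):
    """Advance one generation.

    Rules: a live cell with 2 or 3 live neighbors survives;
    a dead cell with exactly 3 live neighbors is born;
    everything else dies or stays dead.
    """
    # Count how many live neighbors each candidate cell has.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbor_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker": three cells in a row, which oscillates with period 2 --
# already a behavior you wouldn't guess by staring at the rules alone.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Running `step(blinker)` flips the row to a column, and a second step flips it back; more elaborate seeds produce gliders that crawl across the grid, which is exactly the lower-level-rules-to-higher-level-behavior jump the emergence discussion above is about.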