Let's think specifically about intuition in the AI age. On the one hand, it's a type of intelligence that is hard to achieve in AI. On the other hand, as we lean on AI and become more productive and efficient, is there a part of our intelligence that we stop nurturing? You might call this a sad convergence - AI improving, we diminishing, meeting somewhere in the middle.
So are there actually important types of experience we cease to have with increasing automation? What of the experience of being deeply engaged with our work? What of the focused engagement and effort that compounds over years and becomes intuition - the "I don't know how I know, but I know" sort of knowing?
As a species, it is far from physical labour that we have come. AI is the meta-automation, automating those things already once or twice removed from physical experience. I look down - beneath my fingers the keyboard rests - and I think of Heaney's 'Digging':
Between my finger and my thumb
The squat pen rests; snug as a gun.

Under my window, a clean rasping sound
When the spade sinks into gravelly ground:
My father, digging. I look down
He goes on to say,
By God, the old man could handle a spade.
Just like his old man.
This is poetry reflecting on the difference between pen and plough, and on generational drift. There is a reverence shown to these physical forms of labour, as to the blacksmith in his poem 'The Forge'… where he expends himself in shape and music.
If intuition slowly builds, it is also the waiting that automation steals. Who will sit quietly with the blank page, as Ted Hughes does at the opening of 'The Thought-Fox', before encountering a fox outside his window… until the poem ends,
Till, with a sudden sharp hot stink of fox
It enters the dark hole of the head.
The window is starless still; the clock ticks,
The page is printed.

Waiting and boredom are ingredients of creativity and the slow building of intuition. One thing you experience in moments of boredom, for example, is the embodied experience of having your own thoughts. Strange? Boredom is lost to us now, replaced by endless scrolling - people feel the need to write books about it: The Power of Boredom: Why Boredom is Essential for Creating a Meaningful Life; How to Do Nothing: Resisting the Attention Economy. You could say boredom is the empty space where we find what T.S. Eliot in 'Little Gidding' said is Not known, because not looked for / But heard, half-heard, in the stillness / Between two waves of the sea.
It is difficult sometimes to see the difference between experience and knowledge. But one is deeply felt. Knowledge is like a map of a world that can otherwise be experienced. For many years I lived in London. One of the great things about London is how spread out yet accessible it is on the London Underground, the Tube. You can spend years in your routine travelling between regions underground; your map of London is the idealized Tube map. It's only when you get out and walk between streets and towns that you build your own mental map of London. This can be the difference between using a city and falling in love with it. It is a subtle thing, but this deeper connection to a place is the same as building a map of it that is yours.
Perhaps less generally relatable, and affecting me now, is AI in software. Some folks are excited that you could take, for example, a Figma design and produce an entire application with AI. Don't get me wrong, I'm not challenging progress, nor am I, a priori, unwilling to surrender a type of work as redundant. I am not an application developer either, so I have no skin in this particular game. I just don't want AI to take an input and produce (for me) an output that I don't know how to build myself. I prefer a hybrid approach where it takes me by the hand and guides me through an unfamiliar world, perhaps encouraging me to build something I would not otherwise have built.
This is how I use AI now, to help me make small leaps faster and create new experiences. In the example above it would be knowing how the application fits together, how to tweak it if I need to. In the London map analogy it becomes my guide, my map, allowing me to travel to new places, to go up and out and look around and gradually build, in my own mind, a complete map of the city. My city.
Intuition builds firstly in active and deep engagement with our work, and also in the "negative space", in the waiting that follows active engagement. There are many sorts of waiting and in-between moments, including sleep, in which we are learning by not doing. The accretive growth of these little learnings cannot always be expressed, in the end, as facts and algorithms. But that's OK. John Keats coined "negative capability" as being "capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason". We sometimes think this holds only for artists, but not so. This type of knowing allows us to hold things in our heads that are not yet resolved but are nonetheless vital experience, and we can reach deeper understanding in this way. T.S. Eliot captures this insight about arriving at understanding well:
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.

But intuition is under attack, on multiple fronts. Automation dissociates us, and boredom is forbidden to us, either through bombardment from the information economy or in our recital of the mantra "time is money". We cannot afford to be bored. Our increasingly bureaucratic society cannot afford intuition. But not just this: our hyper-rational world is trying to teach us to devalue intuition. The popular book Thinking, Fast and Slow, which shares years of groundbreaking research by Kahneman and Tversky, almost seems to say we shouldn't trust it, while recognizing its value to experts. In Intelligence in the Flesh, Guy Claxton has to remind us that the dichotomy handed down to us in the scientific tradition - the rational, useful mind versus the irrational, misleading emotions - is itself misleading. Part of the intuition "in the mind" is deeply rooted "in the flesh".
Intuition takes time to mature. It is our experience compounded. If you make the investment you become, eventually, the virtuoso of your craft. If we devalue it, it is all too easy to argue that AI replacing us in our tasks comes at zero cost. Who wants to play the long game anyway - the one where the payoff may only be seen in slightly grey hair? Would you not rather at least seem like an expert right now?
If we devalue intuition culturally or in science, it both reflects on and is a reflection of the atrophy of intuition within us. But I think intuition is in fact part of the hard problem of intelligence, and not just important to us intrinsically as humans. Other types of intelligence seem more accessible to us as scientists and engineers: the large-scale pattern matching of deep learning (including large language models) and the symbolic reasoning seen, for example, in AlphaGeometry. One might argue that, from a bilateral-brain perspective, symbolic reasoning is seated in the left hemisphere and pattern matching in the right. Maybe the pattern matching of deep learning approaches something like intuition? Certainly as it is implemented today it lacks the powers of abstraction of the human mind.
Yes, there is something still missing in language models. And I think the easiest way to see this is to note that current models have no way to check themselves before they wreck themselves, leading to confabulation, or what is commonly referred to (to my irritation) as hallucination. I don't think adding symbolic reasoning is the answer here - there seems to remain a distributed-representation problem.
If AI models indeed had something like intuition, I think it would look similar to how Jonas Salk portrays the role of reason below. (I took this quotation, and the context, from a chapter called The Ultimate Demise of Intuition in The Matter With Things by Iain McGilchrist.)
Reason alone will not serve. Intuition alone can be improved by reason, but reason alone without intuition can easily lead the wrong way. They both are necessary. The way I like to put it is that when I have an intuition about something, I send it over to the reason department. Then after I’ve checked it out in the reason department I send it back to the intuition department to make sure that it’s still all right.
LLMs don’t seem to send things back to the intuition department.
Another reason intuition may be devalued is that it rarely gives you a plan. We all want plans. Plans stretch out into the future, promising some certainty. You can write them down. Intuition, on the other hand, seems to exist only in the moment, providing an instinctive next move, and you must iterate and play out these moves over time - how tedious. This makes sense, since intuition is learned in embodied moments. It is a muscle, a reflex. Each reflex is a micro-decision, and the muscle is trained over time until it becomes the thing that helps you make decisions, reflexively. Maybe you don't want to outsource that ability to a machine.
While we should always be building intuition, there is a time and a place to use it. Consider the well-known skill-acquisition model of the brothers Dreyfus. It begins with the novice, who follows explicit rules without any context. The advanced beginner then notices some patterns and appreciates context - the rules admit some adaptation. Next they become competent and can deal with situational context; intuition is starting to form, but they may still need their rules and algorithms. They then become proficient, looking holistically at situations and contexts; intuition and rule-based reasoning now meld together in decision making. Finally they become experts, making judgements so intuitively that it can be difficult to articulate how those judgements are reached. I asked ChatGPT what it thought about this model with respect to intuition, and in my opinion it gets it wrong (emphasis theirs):
… the reliability of intuition depends on the quality of the learner's experience. Intuition is most effective in domains that are stable and predictable, where patterns can be reliably learned (e.g., medicine, chess, firefighting). In chaotic or unpredictable domains, even experts can make mistakes when relying solely on intuition.
I think this is absolutely wrong! The opposite is true. Rules and algorithms are best in stable and predictable situations, and intuition is valuable in chaotic and unpredictable environments. AI may want you to believe this is not the case while it develops some intuition of its own. I suggest you reflect on it a little for yourself. Perhaps where you end up depends on your own experience, the nature of your work, and so on.
The counterargument to my 'danger of detachment' argument is that we can, well, get our kicks elsewhere. As civilization progresses, fewer people need to know how a broad range of stuff works. Which suggests, having bought back time, they can go deeper into understanding how different stuff works and find new, maybe even richer, embodied experiences. Maybe we can all return to nature. Maybe we can again pick up the shovels of Heaney's ancestors and say By God my child can handle a spade / just like their children. Or perhaps we go in new directions, tinkering with more complex technology and ideas. Programming languages have progressed from raw assembly to higher-level languages. We gave up the low-level details without remorse, and the programmer can gradually forget how computers actually work. Prompting AIs is simply this history continued.
I think this world of AI still works to our benefit if we keep embodied time as an ingredient in everything we do. This means thinking about processes and not just outcomes. It means maintaining rich dialogues with our tools. In these tête-à-têtes we experience making micro-decisions in each exchange. Decision making is the muscle building, the antidote to atrophy. Never take the easy path. Make as many small decisions in your work as you can. And leave some space. Some waiting. Maybe even a little boredom.
In the end I come back to Heaney's 'The Forge', because it is not just a celebration of physical craft and labour but a metaphor for his creative process, for our creative process. Labour. We ourselves are forged in the fire of it. There is the appreciation of the slow tempering, the folded processes, of time as ingredient… through which we produce our creations, changing ourselves in the process.
I fear The unpredictable fantail of sparks / Or hiss when a new shoe toughens in water is a satisfaction we will increasingly cease to feel in knowledge work, particularly work automated by AI. But not if we continue to adapt our processes around it, with careful attention to nourishing our intuition. Our intuition.
Tell me what you think - I’d ❤️ to know.


