What programmers teach us about the future of the legal profession
Article written by Elgar Weijtmans.
Mustafa Suleyman, CEO of Microsoft AI, gave an interview to the Financial Times last week. Suleyman is no lightweight: he co-founded DeepMind, the AI lab Google acquired in 2014, and has been at the forefront of AI development for over fifteen years.
His message was, to put it mildly, not subtle. According to Suleyman, most tasks performed by knowledge workers will be fully automated by AI within twelve to eighteen months.
“White-collar work where you’re sitting down at a computer, either being a lawyer or an accountant or a project manager or a marketing person. Most of those tasks will be fully automated by an AI within the next twelve to eighteen months.”
Now, bold timeline predictions about A(G)I have a rich tradition of being spectacularly wrong. This one may well join that club. But whether it takes eighteen months or five years is not really the point.
What immediately struck me was that he mentioned lawyers first, before accountants and project managers. In last week’s Beyond Billable podcast episode, Sarah Wilson-Ward of Elevate described what could be seen as the practical version of Suleyman’s prediction: how lawyers are struggling with precisely that shift.
The parallel with software engineering
I am a former lawyer and software engineer. In lectures, I regularly cite a study into the professions that AI will affect most. The results are, for me at least, quite confronting: law and computer science emerge as the two most impacted sectors, with the third, business and financial operations, trailing at a considerable distance. Double bad luck, in my case.
That combination does give me an unusual vantage point on claims like these. What Suleyman describes confirms the picture that study paints. He uses software engineering as a proxy for what will happen to other knowledge professions: look at programmers, he argues, and you will see what lies ahead for lawyers. And what programmers do day to day is not as foreign as it might sound: you read a brief, break the problem into smaller pieces, write the solution in precise, structured language (code), and iterate until the solution actually solves the problem.
In his words: “Many software engineers report that they are now using AI-assisted coding for the vast majority of their code production, which means that their role has shifted now to this meta-function of debugging, scrutinizing, of doing the strategic stuff like architecting.”
I recognise this. I now work this way myself. I provide a clear briefing, AI produces the first version, and I assess, adjust, and intervene where necessary. The balance between producing work yourself and reviewing output has shifted dramatically in a short space of time.
What this means for the legal profession
This parallel extends naturally to legal practice. The tasks that resemble “writing code” are obvious: drafting contracts, producing research memos, writing opinions, preparing court documents. The tasks that resemble “architecture and debugging” are equally recognisable: determining strategy, weighing risks, adding nuance, maintaining client relationships, asking the right probing questions, knowing when to deviate from the standard approach.
In the podcast, Sarah makes an interesting observation that ties in neatly here. She looks at legal work along two axes: is it core or context, and is it legal or operational? Think of learning to ride a bicycle: you can read all about it, study every technique, and still not be able to do it. That is core work: it requires real experience, judgement, and craftsmanship. Determining strategy, weighing risks, guiding a client through a difficult decision. No manual, and no AI (at least for now), can replace that.
What happens when you apply that lens to the daily work of lawyers? On closer inspection, a surprisingly large proportion of what ends up on their plate is not core legal work at all. It can be handled perfectly well with a good playbook.
The shift Suleyman describes, then, is not that lawyers are becoming redundant. It is that the work, and the skills it demands, are changing. Less production, more assessment. Less typing, more thinking. To get ahead of this properly, you need to understand how it translates to your own practice. And that is precisely the problem Sarah identifies: many lawyers are so caught up in firefighting mode that they lack the time and space to take stock of what they actually do, let alone figure out how AI fits into it.
The real work
AI is taking over an increasing share of knowledge work. This is already true for software engineers, and it will be true for lawyers. But not all legal work is created equal. Work that can be reduced to clear instructions and structured information can largely be handled by AI. In my view, that does not have to be a loss. It might actually be a blessing in disguise: it clears the way for the work that most lawyers got into the profession for in the first place.
What remains is the work that cannot be learned from a manual. Strategy, judgement, the moment when you decide to deviate from the standard. The cycling itself. Tomorrow’s top lawyer will not be the one who produces the thickest memos, but the one who excels at the meta-function surrounding core legal work: assessing, adjusting, providing direction. But first you need to know what your core is. And that starts with taking an honest look at what you actually do.