AI: damned if you do, damned if you don't
Paul Popus
I want to be clear that things move so fast in tech that these opinions could end up being outdated by the end of the year.
Agent tools, and let's be honest, mostly Claude, have overtaken the actual work that we do, and a certain magic and personal satisfaction has been lost. I don't want to sound ungrateful, because on one hand I believe this is good for our profession long term and will end up growing it.
Yet the goalposts have shifted so far that our job has actually changed, and it doesn't matter where in the stack you work: you're not protected by any sort of hard moat AI won't eventually clear.
It started out being good only for reference, then slowly backend code and routine snippets became easy, and this year we see AI own the development lifecycle almost completely. I now spend most of my time writing specs for AI and guiding the code design itself, not building the features, and it's impossible to go back.
Damned if you do use it
writing beautiful code felt like an art sometimes [...] And yet to wish for that to come back feels selfish and hypocritical
The side of our job that involved real problem solving, being creative, and spending gruelling hours, sometimes days, cracking hard new problems while writing beautiful code felt like an art sometimes.
And yet to wish for that to come back feels selfish and hypocritical. We've spent decades as an industry improving our tooling, writing libraries to DRY out common problems, all the way up to full frameworks, and now that we have the ultimate tool we're almost hesitant to claim it.
Damned if you don't
I think it took some introspection early on my part to really admit that I have to let this go and evolve with the technology. I will never be able to write, read, or understand code as fast as Claude or other LLMs can; and that's right now.
We don't need to predict where this is going, we just have to remain adaptive to the incoming changes and find a way to use it that still gives us satisfaction. I can now deliver features that previously would've taken weeks or months; in fact, you almost feel bad being the bottleneck sometimes, as sign-off is still largely human.
I think a lot of the clash comes from a disconnect, with some engineers not understanding that you're not paid to write artful code or spend days sweating over a problem; you are paid to deliver a solution, no matter how that happens. Refusing also means missing an opportunity to learn new things.
If you're starting now
eventually culture will morph for it to be weird if you can't [use AI]
Get comfortable learning with AI, and to be honest, I don't have a clear process here yet. It's something we will have to figure out, especially in schools, but use the AI to drill down into the details, question everything with "why?", and ask for proof.
Sooner than we'd think, no job will hire a person who can't use AI, and eventually the culture will morph so that it's weird if you can't, just like it's weird now if you can't use a computer.
If you're a company
Business fundamentals remain unchanged: invest in the next generation and prioritise data and stack ownership. Experiment with open source; there are so many amazing projects tackling various problems around the use of AI, and it's never been easier to own your own data.
We've built and shipped a few internal tools at Payload because off-the-shelf solutions didn't exactly solve our problem, but also because we can. It takes less than a day now to spin up a prototype for what you need, play with your ideas, and solve the exact problem you have; don't settle for 80%.
I've also not acknowledged the very real concerns around AI, such as energy and water usage, the chip manufacturing industry, the total annihilation of the concept of copyright, and the concentration of power into central models; I have thoughts on those too, for another time.
Signed,
Paul Popus