idk if it's serious or not, but it's what I saw in the Indeed newsletter today.
Knowing how to code is now “syntax heavy”
god I hate this world
So I ran into my first genAI coding junk yesterday when I was on a call with my boss, and as a solution to a problem we were talking about he said, “hold on, let me ask Gemini.”
I felt my soul die a little bit at that point.
But the fun part is that Gemini first didn’t provide a good answer.
And then on the second go it also didn’t provide a good answer.
And then on the third attempt we decided to table the issue for the moment because prompt coding on a call was taking longer than I think he expected.
I really disliked that experience.
10x the speed, sweet. So 10x the salary too right?
Vibe salary
You know the “vibes” of different models - when to use which
Would that be a vibe-rater?
Why not, If it resonates
Say it. Say the words.
programming was never about how fast you could type. the person who wrote this knows nothing about the job.
the description is gold, everyone can find something wrong with it.
And yet somehow the tech blogs and such always scream about developer productivity. Go faster. Go faster.
From what I’ve seen over the years, only mids care about finishing fast.
I would never blindly trust anything a tech blog’s opinion piece says
The guy who wrote this is an idiot, but he became so in a world where “LoC” is a metric – one that Goodhart would love, but alas.
This is honestly the road to hell and the ~good intentions in one.
Amazed they didn’t ask for 5-10 years of experience in AI coding.
wait for it! PhD in vibe coding or relevant experience
eventually… lol
Spot security vulnerabilities instantly - from a candidate who can’t actually write code.
I need to hire someone to take this functional 15 lines of code and, like, make it 200 lines of unusable madness.
But fast! Very fast
Fucking idiots. I’m surrounded by idiots
The words are all still stupid because it’s a new thing, but there is one specific space where I find it impossible to deny that there are already tools on the market that change the way the job is done:
Claude can turn plain English statements about what data I want from different parts of the Microsoft 365 administration ecosphere into scripts that gather all that data, transform it the way I want it transformed, and turn it into spreadsheets, pivot tables, data manipulation macros, and everything else I need to answer questions that are really hard to answer from the MS web interface. I can ask things like “Which systems have any of these three known vulnerable apps?” or “What software is common to everyone working in this division of the company?”
It’s boring stuff, but it makes a world of difference in terms of what I can base my decisions on. I spent less time building repeatable reports for each type of object I need to think about (device, application, user) than I spent building even one report for one assessment in years past without automation. And it’s not that I’m constantly asking the LLM to do things for me; it’s building a couple of tools with a much faster iterative loop for feature tweaks or debugging than could take place as an interaction between two people. I was making changes to scripts
I’m using it only for specific work areas where I already know the APIs; I just don’t have the time to stumble through gathering and collating the various data. Based on the level of complexity of the tools I’ve been able to build, I’d say that anybody who knows how to describe the data they work with most could use a tool like this to make that process a lot more automatic. We’re not ready for the tool to do the work without human oversight, but we’re ready for anybody who works with stacks of data to build their own automation instead of having it built for them.
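For a sense of what that “gather and collate” step boils down to, here’s a minimal local sketch. The inventory here is a hard-coded stand-in for what would really come from the Microsoft 365 / Intune APIs, and every device and app name is made up for illustration:

```python
# Hypothetical sketch of the collation step: device -> installed apps.
# In the real workflow this mapping would be pulled via the M365 admin
# APIs; hard-coded here so the logic is self-contained.

inventory = {
    "LAPTOP-01": {"7-Zip", "Chrome", "OldVPNClient"},
    "LAPTOP-02": {"Chrome", "Slack"},
    "DESKTOP-03": {"OldVPNClient", "LegacyPDFTool", "Excel"},
}

def systems_with_any(inventory, vulnerable_apps):
    """Answer 'which systems have any of these known vulnerable apps?'.

    Returns device -> sorted list of matched apps, only for devices
    with at least one match.
    """
    hits = {}
    for device, apps in inventory.items():
        matched = apps & set(vulnerable_apps)
        if matched:
            hits[device] = sorted(matched)
    return hits

def common_software(inventory, devices):
    """Answer 'what software is common to all of these devices?'."""
    sets = [inventory[d] for d in devices]
    return sorted(set.intersection(*sets)) if sets else []
```

The point isn’t the code itself (it’s trivial); it’s that once the data is collated into one structure, questions that are painful in the web UI become one-liners.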
I’m curious if they have live “vibe coding” sessions during the hiring process
WTF?! 😳
It’s serious, and this is going to become more and more normal.
My entire workflow has become more and more Agile Sprint TDD (but with agents) as I improve.
Literally setting up agents to yell at each other genuinely improves their output. I have created and harnessed the power of a very toxic robot work environment. My “manager” agent swears and yells at my dev agent. My code review agent swears at the dev agent and calls their code garbage and shit.
And the crazy thing is it’s working; the optimal way to genuinely prompt engineer these stupid robots is by swearing at them.
It’s weird, but it overrides the “maybe the human is wrong/mistaken” fallback they’ll hit if they run into an issue; instead they go “no, I’m probably being fucking stupid” and keep trying.
I create “sprint” markdown files that the “tech lead” agent converts into technical requirements, then I review that, then the manager+dev+tester agents execute on it.
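That sprint pipeline can be roughed out like this; `call_agent` is a stand-in for whatever model API is actually in use, and the role names and prompts are illustrative assumptions, not a real framework:

```python
# Hypothetical sketch of the sprint workflow: human-written sprint file
# -> tech lead agent -> (human review) -> dev + tester agents.

def call_agent(role: str, prompt: str) -> str:
    # Stand-in: a real version would call an LLM with a role-specific
    # system prompt. Echoing keeps the pipeline shape testable.
    return f"[{role}] {prompt}"

def run_sprint(sprint_md: str) -> dict:
    # 1. Tech lead converts the sprint markdown into technical requirements.
    requirements = call_agent("tech-lead",
                              f"Convert to technical requirements:\n{sprint_md}")
    # 2. Human review gate goes here, before any execution.
    # 3. Manager hands requirements to dev, then tester reviews the result.
    patch = call_agent("dev", requirements)
    verdict = call_agent("tester", patch)
    return {"requirements": requirements, "patch": patch, "verdict": verdict}
```

The interesting design choice is the human review gate between steps 1 and 3: the agents only execute a spec the human has already signed off on.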
You do, truly, end up focusing more on higher level abstract orchestration now.
Opus 4.6 is genuinely pretty decent at programming now if you give it a good backbone to build off of.
- LSP MCPs so it gets code feedback
- debugger MCPs so it can set debug breakpoints and inspect call stacks
- explicit whitelisting of CLI stuff it can do to prevent it from chasing rabbits down holes with the CLI and getting lost
- Test driven development to keep it on the rails
- Leveraging a “manager” orchestrating overhead agent to avoid context pollution
- designated reviewer agent that has a shit list of known common problems the agents make
- benchmark project to get heat traces of problem areas on the code (if you care about performance)
This sort of stuff can carry you really far in terms of improving the agent’s efficacy.
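On the CLI-whitelisting point: if the agent runs under Claude Code, the allow/deny rules live in `.claude/settings.json` under `permissions`. Treat this as a hedged sketch of the idea (the exact schema is Anthropic’s and evolves; check the current docs before copying):

```json
{
  "permissions": {
    "allow": [
      "Bash(npm run test:*)",
      "Bash(git diff:*)"
    ],
    "deny": [
      "Bash(rm:*)",
      "Bash(curl:*)"
    ]
  }
}
```

The effect is the “prevent it from chasing rabbits down holes” behavior above: the agent can run the tests and inspect diffs, but anything destructive or network-facing is refused outright.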
nah, such narratives are mostly pushed by AI companies (obviously they need to sell it as a business tool, not a personal buddy). Of course some managers/companies are buying into this narrative, and that’s also understandable because the idea sounds like a panacea, especially if you sell it on to investors :) and so we see the whole circle of snake oil sales
Opus 4.6 is genuinely pretty decent at programming now if you give it a good backbone to build off of.
Soup from a Stone.