To be fair, most never could. I’ve been hiring junior devs for decades now, and the ones straight out of university barely had any coding skills.
It’s why I stopped looking at where they studied; I always check their hobbies first. If one of the hobbies is something nerdy and useless, like tinkering with a Raspberry Pi, that tells me it’s someone who loves coding and is probably already reasonably good at it.
Never mind that cybersecurity is a niche field that varies by use case and environment.
At some level, you’ll need to learn the security system of your company (or the lack thereof) and the tools used by your department.
There is no class you can take that’s going to give you more than broad theory.
I am not a professional coder, just a hobbyist, but I am increasingly digging into cybersecurity concepts.
And even as an “amateur cybersecurity” person, everything about what you describe, and LLM coders, terrifies me, because that shit is never going to have any proper security methodology implemented.
I’m in uni learning to code right now, but since I’m a boomer I only spin up oligarch bots every once in a while to check on an issue I’d otherwise have to ask the teacher about. It’s far more important for me to understand fundies than it is to get a working program. But that’s only because I’ve gotten good at many other skills and realize that fundies are fundamental for a reason.
This isn’t a new thing. Dilution of “programmer” and “computer” education has been going on for a long time. Everyone with an IT certificate is an engineer these days.
For millennials, a “dev” was pretty much anyone with reasonable intelligence who wanted to write code; it is actually very easy to learn the basics and fake your way in with no formal education. Now we are moving on even from that, to where a “dev” is anyone who can use an AI. “Prompt Engineering.”
I could have been a junior dev who could code. I learned to do it before ChatGPT. I just never got the job.