I have about 2 YoE, and I’m sure this changes with more experience.

I often hear this idea online that programmers should follow “just-in-time” learning, meaning you should prefer to learn things you don’t know while on the job. (The way some people talk about it, though, it sounds like you shouldn’t dare spend a single minute learning anything new outside of your 9-5.)

This seems like generally reasonable advice, especially for simpler things that take a few hours, like learning a specific language feature or library. But when I lean too heavily on this JIT learning, it starts to feel detrimental.

Many times, when I do something big and new to me, say, deciding how to approach auth, microservice architecture, automated testing, or containerization, I end up making a big decision after a few hours or days of cursory reading of documentation and blogs, only to regret it some months later. At that point, maybe I’ll slow down, find a book on the subject, read it, and think, “Oh, darn, I wish I’d known that N months ago.” It certainly feels like spending more time learning upfront could have avoided mistakes caused by lack of knowledge, though there’s no way to go back in time and know for sure.

I’m not asking about any of those areas in particular. For all of them, I feel I’ve learned more in the time since and would probably avoid some of my prior mistakes if I did it again. The question is more: How much do you subscribe to this idea of just-in-time learning? And if you do, how do you know when you’ve learned enough to be confident, or when you need to slow down and learn in more depth?

8 points

I feel like I have been doing this all my life. I think it’s also about the depth of understanding. But the environment has to support it: if there is an expectation that everyone is an expert from day one and there is no room for self-improvement, then it can’t be done.

As stated, there are downsides to the approach, such as a lack of exposure to new ideas. You still need to look around, just not study in depth. To me it’s also a work/life balance policy, but don’t practice it to an extreme, as that can hold you back. Good workplaces should allow for some learning time, and I’m hoping that gets normalized.

32 points

In my opinion, the best developers are “generalists” who know a little bit about everything. For example, I have never written a single line of code in the Rust programming language… but I know, at a high level, all of the major pros and cons of the language. And if I ever face a problem where I need some of those pros and don’t care about the cons, then I will learn Rust and start using it for the first time.

There’s not much benefit to diving deep into specialised knowledge of any particular technology, because chances are you will live your entire life without ever actually needing that knowledge. If anything, it might encourage you to force a square peg into a round hole, as in: “I know how to do this with X, so I’m going to use X even though Y would be a better choice.”

Wikipedia has a list of “notable” programming languages, with 49 languages under “A” alone, and I’ve personally learned and used three of the “A” languages. I dislike all three, and I seriously hope I never use any of them again… but at the same time, they were the best choice for the task at hand, and I would still use them if I faced the same situation again.

That’s nowhere near a complete list - which would probably have a few thousand under “A” alone. I know one more “A” language which didn’t make Wikipedia’s cut.

The reality is you don’t know what technology you need to learn until you actually need it. Even if you know something that could be used to solve a problem, you should not automatically choose that path. Always consider if some other tech would be a better choice.

Since you’re just starting out, I do recommend branching outside your comfort zone and experimenting with things you’ve never done before. But don’t waste time going super deep; just cover the basics and then move on. If there’s a company you really want to work for, and they’re seeking skills you don’t have… then maybe get those skills. But it’s risky: the company might not hire you. Personally, I would take a different approach: try to get a different job at the company first, then start studying and ask your manager to help you transfer over to the job you weren’t previously qualified for but are now. A well-run company will support you in that.

As for learning outside of your 9-5… you should spend your spare time doing whatever you want. If you really want to spend your evenings and weekends writing code, then go ahead and do that… but honestly, I think it’s healthier, long term, to spend that time away from a desk and away from computers. I think it would be more productive, long term, to spend that time learning how to cook awesome food, do woodworking, play social football, or play music… or of course the big one: find a partner, have kids, and spend as much time with them as you can. As much as I love writing code, I love my kid more. A thousand times more.

Programming is a job. It’s a fun job, but it’s still a job. Don’t let it be your entire life.

2 points

Look Ma, this guy says it’s ok that I’m a full stack dev. He says it’s even good!

Also: counterpoint: if you teach your kids to code, you can outsource to them.

3 points

Great advice, you two. Have a bunch of kids and teach them APL, Actionscript, and Autohotkey. On it!

:)

2 points

this guy says it’s ok that I’m a full stack dev

I’m also a full stack dev, so maybe I’m biased. There’s definitely a place for specialist work, but I don’t agree at all with people who think specialist developers are better than full stack ones.

The way I see it, full stack devs either:

  • are good enough to be the only type of developer you hire; or
  • sit in between specialists and management

Take OpenAI, for example. They have a bunch of really smart people working on the algorithm, so much so that they’re not even engineers, they’re more like scientists. But they also have a full stack team who take that work and turn it into something users can actually interact with, and above the full stack team is the management team. Maybe OpenAI isn’t structured that way, but that’s how I’d structure it.

But most software isn’t like ChatGPT. Most software isn’t bleeding-edge technology; you can usually take open source libraries (or license proprietary ones). Let the specialists figure out how to make TCP/IP handle complex issues like bufferbloat… all the full stack dev needs to know is response = fetch(url).

1 point
Deleted by creator
4 points

For some non-critical stuff, you can experiment until you find something that appears to work, deploy it, and fix any issues that appear. Too much of today’s Internet is built that way, but it really is OK sometimes.

For critical work, you can easily apply the same approach but replace the “deploy it” stage with “do extensive internal testing”. It takes longer and is more expensive, but it does work. For example, the first ever hydrogen-powered aircraft flew in 1957; it was an airplane with three engines, only one of which ran on hydrogen. Almost 70 years of engineering later, that’s still the approach being used. Airbus claims they will have commercial hydrogen-powered flights around 2035 and plan to flight test the final production engine next year on an A380 aircraft.

The A380 has four engines and each is powerful enough to fly safely with only one engine running. In fact, it should be able to land with four engine failures - with a “Ram Air Turbine” providing electricity and hydraulic pressure to critical systems.

The best approach to critical systems is not to build a perfectly reliable system, but rather to have redundancy so that failures will not result in a “please explain” before congress.

3 points

It’s a bit more complicated when security is involved. I deleted that post because it didn’t seem responsive enough to OP’s question, but basically there is a big difference between stuff going wrong randomly (Murphy’s law) and smart, determined adversaries trying to mess with you on purpose. Testing helps more with the former.

4 points

Sure — security is one area where you do need to be a specialist.

I’d say it’s the exception that proves the rule, though. Don’t write your own encryption algorithms, don’t invent new auth flows, do hire third parties to audit and test your security systems, etc. If you want to specialise in something like security, then yeah, that’s something you should study. But at the same time, every programmer should have general knowledge in that area: enough to know when it’s OK to write your own security code and when you need to outsource it.

5 points

I don’t know if other people here can relate to my experience. I’m a mathematician, and I work in insurance writing Python/SQL to transform data into knowledge. Everything Python/SQL-related I learn on paid hours, like testing libraries. Outside my job, I’m working on an actuarial science MBA, where I’m learning theoretical knowledge that couldn’t be learned “on the job”. When something from the MBA can be used at work, I rush to learn how to apply it in Python (while being paid for it). For example, a couple of weeks ago we learned how to find the probability distribution parameters of a sample using maximum likelihood estimation, and while on the clock I learned how to do that with scipy on the claims database.
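That scipy step might look roughly like this (a sketch with synthetic data; real claim amounts would come from the database, and the gamma distribution here is just an illustrative choice, not necessarily what was fit):

```python
# Hedged sketch: maximum likelihood fit of a distribution's parameters
# to a sample of claim amounts, using scipy. The data is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
claims = rng.gamma(shape=2.0, scale=1500.0, size=5000)  # stand-in for real claims

# .fit() maximizes the log-likelihood numerically; floc=0 pins the
# location parameter so only shape and scale are estimated.
shape, loc, scale = stats.gamma.fit(claims, floc=0)

print(f"MLE estimates: shape ~ {shape:.2f}, scale ~ {scale:.0f}")
```

With 5,000 samples the estimates land close to the true parameters used to generate the data, which is a handy sanity check before pointing the same code at real claims.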

I’m already thinking about doing a computer science master’s after finishing this MBA, because I’m really enjoying the opportunity to study theoretical stuff on my own time while being paid to practice it and learn how to apply it in real-life applications.

4 points

My thoughts are that spending too much time on the computer is bad for you, and once you find a good long-term job, you don’t need to learn so much. You’re two years into your career; hopefully you’ll find a good fit.

Knowing how to Hello World in Ada is not very helpful; try to find in-demand things to learn. There really isn’t that much.

Well, that, and programming itself is starting to slow down.


Programming

!programming@programming.dev
