Specifically thinking of stuff that makes your life better in the long run, but all kinds of answers are welcome!
I’ve recently learnt about lifetraps, and it’s made a huge positive impact on how I view myself and my relationships.
Is this also true outside America? You know, the kinds of places with unions, labor rights and laws that actually favor the employee?
Unions are every worker’s friend, but they are not your personal advocate. If your salary is already in line with the agreed national contract, there is little they can do.
It depends on the country and where exactly you work, but in many countries (ahem, Italy) the unions are a bit too comfortable with company management to be effective at their job.
It is still true, at least in Europe. I mean, they’re not actually trying to destroy your life, but they do serve the company’s best interests. They might help you, and they won’t make things as bad as they possibly could, because that would give them a bad reputation, but they’re not your friend.