Every day there are more big job cuts at tech and games companies. I haven’t seen anything explaining why they all seem to be happening at once like this. Is it a coincidence, or is something driving all the job cuts?

3 points

Consolidations, changing business direction, lack of funds and projects, etc.

11 points

ChatGPT has been cited as a cause in at least one of the layoffs. The tech industry is especially positioned to be quickly affected by AI, but AI is going to impact 80% of the jobs on the planet within the next 8 years. Our world is about to experience a massive change to the way things are run. We can try to prepare, but it’s going to change in ways previously unimaginable.

6 points

I mean, I can imagine it. It’s the industrial revolution all over again but Cyberpunk style.

22 points

The chatbot that’s wrong 50% of the time? That’s hard to believe.

-5 points

It doesn’t need to be right to make money, often more money than companies get by paying people to do a job properly.

14 points

… Yes, it does in the tech sector. If you’re wrong it doesn’t work.

I’ve tried the tools out. You go from writing code for an hour and debugging for half an hour to writing code for 15 minutes and debugging for three hours.

Half the time you’ve ripped out literally every bit of code the AI wrote by the time you’re done making it work.

-4 points

You’re doing yourself a big disservice if you limit your understanding of AI to what you read from the opinions of Lemmings. It is incredibly powerful, and every major corporation has large investments in AI integration.

7 points

I’m not doing that at all; this is my personal experience.

2 points

And how many major corporations that invested in NFTs are still doing so?

3 points

I can’t wait to be starving

47 points

It’s an easy win for the balance sheet. Their products are still sellable, the services should be more or less unaffected (for the next few quarters), so they’ll continue to get the same revenue. But their costs just decreased, so they look more profitable.

It looks good on quarterly calls. It’s a good way to juice a stock.

28 points

Short-term profits in exchange for long-term damage!

146 points

A few things happened pretty quickly.

During the pandemic, tech profits soared which led to massive hiring sprees. For all the press about layoffs at the big guys, I think most still have more workers than they did pre-pandemic.

Interest rates soared. Before the pandemic, interest rates were ludicrously low; in other words, it cost almost nothing to borrow money. This made it easier to spend on long-term or unclear projects where the hope seemed to be “get enough users, then you can monetize.” Once interest rates rose, those became incredibly expensive projects, so funding is now much more scarce. Companies are pulling back on bigger projects or, like reddit, trying to monetize them faster. Startups are also finding it harder, so there are fewer jobs.
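The effect of the rate shift is easy to see with some back-of-the-envelope arithmetic. A minimal sketch (the principal and both rates are made-up illustrative numbers, not figures from any real company):

```python
# Hypothetical illustration: annual interest cost on borrowed capital
# at cheap-money-era rates vs. post-hike rates. All numbers are made up.
principal = 100_000_000  # $100M raised to fund a long-term project

low_rate = 0.01   # ~1%: roughly the near-zero-rate era
high_rate = 0.08  # ~8%: after rate hikes, for a risky borrower

low_cost = principal * low_rate    # annual interest before the hikes
high_cost = principal * high_rate  # annual interest after the hikes

print(f"Annual interest at 1%: ${low_cost:,.0f}")   # $1,000,000
print(f"Annual interest at 8%: ${high_cost:,.0f}")  # $8,000,000
print(f"Cost multiplier: {high_cost / low_cost:.0f}x")  # 8x
```

Carrying the same speculative bet suddenly costs several times as much per year, which is why “get users now, monetize later” projects were the first to get cut.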

And of course, AI. No one is quite sure how much that’ll change the game, but some folks think most programmers will be replaceable, or at least that 1 programmer will be able to do the work of several. So, rather than hire now and later go through everything severance etc. might entail, I think a lot of companies are taking a wait-and-see approach and simply not hiring.

-34 points

1 programmer will be able to do the work of several

This is true right now. If you know how to use AI tools, it’s not that hard to work 5-10x faster as a programmer than it used to be. You still have to know what you’re doing, but a lot of the grunt work and typing that used to comprise the job is now basically gone.

I have no idea, but I can’t possibly imagine that it’s having no impact on resource allocation and hiring/firing decisions.

23 points

Lol AI ain’t that good, bud.

1 point

Want to have a programming contest where speed is a factor?

I actually looked this up, and the studies seem to agree with you. That one says a 55% increase in speed, and another says 126%.

All I can really say is, I’d agree that a single 3-hour task isn’t really representative of the actual overall speedup, and my experience has been that it can be a lot more than that. It can’t replace the human who needs to understand the code, what needs to happen, and what’s going wrong when it’s not working, but depending on what you’re doing it can be a huge augmentation.

7 points

I love the speed-up. And I’m sure it factors into CEO and CIO decisions. But they’re on their way to learning, once again, that “code faster” never had anything to do with success or failure in efforts that require programmers.

Source: I sought great power, and I became one of the fastest coders, but it didn’t make my problems or my boss’s problems go away.

3 points

If you know how to use AI tools

Can you elaborate on this part? What’s your idea of proper usage?

1 point

So maybe I don’t know what I’m talking about. I will only share what I have experienced from using them. In particular I haven’t messed with Copilot very much after the upgrade to GPT-4, so maybe it’s a lot more capable now.

In my experience, Copilot does a pretty poor job at anything except writing short blocks of new code where the purpose is pretty obvious from context. That’s, honestly, not that helpful in a lot of scenarios, and it makes the flow of generating code needlessly awkward. And at least when I was messing with it there didn’t seem to be a way to explicitly hint to it “I need you to look at this interface and these other headers in order to write this code in the right way.” And, most crucially, it’s awkward to use it to modify or refactor existing blocks of code. It can do small easy stuff for you a little faster, but it doesn’t help with the big stuff or modifying existing code, where those are most of your work day.

To me, the most effective way to work with AI tools was to copy and paste back and forth from GPT-4 – give it exactly the headers it needs to look at, give it existing blocks of code and tell it to modify them, or have it generate blocks of boilerplate to certain specifications (“make tests for this code, make sure to test A/B/C types of situations”). Then it can do like 20-30 minutes’ worth of work in a couple of minutes. And critically you get to hold onto your mental stamina; you don’t have to dive into deep focus in order to go through a big block of code looking for things that use old-semantics and convert them to new-semantics. You can save your juice for big design decisions or task prioritization and let it do the grunt-work. It’s like power tools.

Again, this is simply my experience – I’ll admit that maybe there are better workflows that I’m just not familiar with. But to me it seemed like after the GPT-4 transition was when it actually became capable of absorbing relatively huge amounts of code and making new code to match with them, or making modifications of a pretty high level of complexity in a fraction of the time that a human needs to spend to do it.

1 point

Do you work in a technical role? I’ve dabbled in using AI to help out when working on projects, but I’d say it’s hit or miss on actually helping: sometimes it helps me move a bit faster, and sometimes it slows me down.

However, that’s just for the raw “let’s write some code” part of the work. Anything beyond that in my roles and responsibilities doesn’t really intersect with what AI can currently do, so I’m not sure where I’d get a 5-10x speed-up from.

Honestly I’m not sure if I’m taking a wrong approach or if everyone else is blowing things out of proportion.

8 points

Are we great again yet?

12 points

I’m here to repeal and replace good things, and I’m all out of “replace”.

5 points

Let’s not throw the baby out with the bathwater. AI has the potential to alleviate a lot of society’s pressures, to free up much of the time we spend on tedious, mindless tasks. We just need to make sure it’s used for the benefit of the many rather than the profit of the few. I don’t want a union that wants to keep labor busy and well compensated; I want a union that keeps people safe, happy, and compensated properly.

10 points

OMG I luv this:-) So, in your honor:

114 points

I want to offer my perspective on the AI thing from the point of view of a senior individual contributor at a larger company. Management loves the idea, but any company that tries to lean on it instead of good devs will end up with a lot of developers fixing auto-generated code full of bad practices and mysterious bugs. A large language model has no concept of good or bad, and it has no logic. It’ll happily generate string-templated SQL queries that are ripe for SQL injection. I’ve had to fix this myself.

Things get even worse when you have to deal with a shit language like Bash, which is absolutely full of God-awful footguns. Sometimes you have to use that wretched piece of trash language, and the scripts generated are horrific. Remember that time when Steam on Linux was effectively running rm -rf /* on people’s systems? I’ve had to fix that same type of issue multiple times at my workplace.
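The string-templated SQL problem is worth making concrete. A minimal sketch (Python and an in-memory sqlite3 database used purely for illustration; the table and the malicious input are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "alice' OR '1'='1"  # hypothetical attacker-controlled value

# The kind of query an LLM will happily template together with an f-string:
unsafe = f"SELECT name FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())  # [('alice',), ('bob',)] -- every row leaks

# The fix is a parameterized query; the driver treats the value as data:
safe = "SELECT name FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # [] -- no such user
```

The injected `' OR '1'='1` turns the WHERE clause into a tautology, so the templated version returns the whole table; the parameterized version never interprets the input as SQL.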

I think LLMs will genuinely transform parts of the software industry, but I absolutely do not think they’re going to stand in for competent developers in the near future. Maybe they can help junior developers who don’t have a good grasp on syntax and patterns and such. I’ve personally felt no need to use them, since I spend about 95% of my time on architecture, testing, and documentation.

Now, do the higher-ups think the way that I do? Absolutely not. I’ve had senior management ask me about how I’m using AI tooling, and they always seem so disappointed when I explain why I personally don’t feel the need for it and what I feel its weaknesses are. Bossman sees it as a way to magically multiply IC efficiency for nothing, so I absolutely agree that it’s likely playing a part in at least some of these layoffs.

43 points

So basically, once again, management has no concept of the work and processes involved in creating/improving [thing], but still wants to throw in the latest and greatest [buzzword/tech-of-the-day], and then is flabbergasted when the devs/engineers/people who actually do the work tell them it’s a bad idea.

15 points

I’m pretty excited about LLMs being force multipliers in our industry. GitHub’s Copilot has been pretty useful (at times). If I’m writing a little utility function and basically just write out the function signature, it’ll fill out the meat. Often makes little mistakes, but I just need to follow up with little tweaks and tests (that it’ll also often write).
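The signature-to-body flow described here looks roughly like this (a hypothetical illustration; the function and the suggested body are invented, not actual Copilot output):

```python
# A developer types only the signature and docstring...
def chunk(items: list, size: int) -> list:
    """Split items into consecutive sublists of at most `size` elements."""
    # ...and the assistant suggests a body along these lines:
    return [items[i:i + size] for i in range(0, len(items), size)]

print(chunk([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4], [5]]
```

The suggestion still needs review (off-by-one slicing and edge cases like `size <= 0` are exactly the “little mistakes” mentioned), but for small, obvious utilities it saves the typing.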

It also seems to take context of my overall work at the time somehow and infers what I’ll do next occasionally, to my astonishment.

It’s absolutely not replacing me any time soon, but it sure can be helpful in saving me time and hassle.

10 points

Those little mistakes drove me nuts. By the end of my second day with copilot, I felt exhausted from looking at bad suggestions and then second guessing whether I was the idiot or copilot was. I just can’t. I’ll use ChatGPT for working through broad issues, catching arcane errors, explaining uncommented code, etc. but the only LLM whose code output doesn’t generally create a time cost for me is Cody.

13 points

A large language model has no concept of good or bad, and it has no logic.

Tragically, this seems to be the minority viewpoint - at least among CS students. A lot of my peers seem to have convinced themselves that the hallucination machines are intelligent… even when it vomits unsound garbage into their lap.

This is made worse by the fact that most of our work is simple and/or derivative enough for $MODEL to usually give the right answer, which reinforces the majority “thinking machine” viewpoint - while in reality, generating an implementation of & using only ~ and | is hardly an Earth-shattering accomplishment.
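The “& using only ~ and |” exercise is just De Morgan’s law; a quick sketch of the identity (illustrative, using Python’s arbitrary-precision two’s-complement integers):

```python
# De Morgan's law: a & b == ~(~a | ~b), the classroom exercise mentioned above.
def and_from_not_or(a: int, b: int) -> int:
    """Bitwise AND built from only ~ (NOT) and | (OR)."""
    return ~(~a | ~b)

print(and_from_not_or(0b1100, 0b1010) == (0b1100 & 0b1010))  # True
```

Which is to say: it’s a two-line rewrite of a standard identity, exactly the kind of simple, derivative task a model nails, and hardly evidence of a thinking machine.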

And yes, it screws them academically. It doesn’t take a genius to connect the dots when the professor who encourages Copilot use has a sub-50% test average.

1 point

In my experience copilot for neovim is pretty useful if you

  1. Split the current window if you have anything like type declarations in a separate file
  2. Write pretty verbose documentation, e.g. using Swagger.

If you expect it to whip out of thin air exactly what you need without having to correct it in several places, learn to code without it first.

2 points

a shit language like Bash

There’s your mistake: treating Bash like a language and not like a scripting tool. Its strength is that it’s a common standard available on almost every machine because it’s older than most of us; its weakness is that it’s full of horribly outdated syntax because it’s older than most of us. Used to script other processes it’s great, but when you start using it as a language, the number of ways you can do something horrible that sort of works makes JavaScript look slick!

5 points

To add to this, at my company, we’ve received a mandate to avoid putting any code into an AI prompt because of privacy concerns. So effectively no one is using it here.

4 points

Yep. As far as most companies should be concerned, using something like Copilot means giving Microsoft a free license to all the trade secrets and code that you input.

1 point

We had the same. And you would have thought for a heavily regulated industry we’d keep it that way.

But no, some executive wonk from Microsoft flew over, gave our c-suite a “it’s safe, promise” chat over champagne and lobster, and now we’re happily using copilot.

8 points

I completely agree, although I think AI is more likely to impact marketing, communications, PR, creative, and PM-type roles (and there are a lot of those in tech companies). I suspect we’ll see a noticeable reduction in tech workers over the next decade.

2 points

Interest rates soared. Before the pandemic, interest rates were ludicrously low; in other words, it cost almost nothing to borrow money. This made it easier to spend on long-term or unclear projects where the hope seemed to be “get enough users, then you can monetize.” Once interest rates rose, those became incredibly expensive projects, so funding is now much more scarce. Companies are pulling back on bigger projects or, like reddit, trying to monetize them faster. Startups are also finding it harder, so there are fewer jobs.

Note that this also impacted other projects that take a lot of capital up front, then provide a return over a very long term. There was a nuclear power plant project with NuScale in Utah that got shelved over this; with interest rates suddenly going from way low to way high, the economics get upended.

I’d bet that in general, infrastructure spending dropped across the board.

25 points

One factor I haven’t seen mentioned is that because of rising interest rates, tech companies have had to shift from focusing on growth to actually turning a profit. Because of this, companies are shedding the employees they overhired in anticipation of continued growth. People are expensive, so that’s an “easy” way to try to get the line closer to positive.

This is kind of a rough overview and I’m by no means an expert on economics. Just someone who works in tech and so has been following things closely.

9 points

Michael Hudson, Jun. 2022: The Fed’s Austerity Program to Reduce Wages

To Wall Street and its backers, the solution to any price inflation is to reduce wages and public social spending. The orthodox way to do this is to push the economy into recession in order to reduce hiring. Rising unemployment will oblige labor to compete for jobs that pay less and less as the economy slows.

2 points

It also takes time to realize the costs of shedding workforce, and by then you might have a different CEO. As long as it’s next quarter, it’s fine.

1 point

This, plus the changes to Section 174 that mean R&D costs have to be written off over five years instead of all in the year they’re incurred. That’s hurting startups a lot: many have had to switch from building new stuff to licensing/selling their existing stuff, and to firing some expensive engineers/developers, to be able to afford to stay open. https://www.axios.com/2024/01/20/taxes-irs-startups-section174

