74 points

Oh I thought we just did Executive Orders these days. Interesting for Congress to want to be in the pocket of the three richest men in the United States, too.

62 points

They’re gonna have a hard time making it illegal to download completely open-source software lol

27 points

Do not tempt them to outlaw open source.

38 points

Facebook already tipped their hand by prematurely banning posts about Linux.

2 points

The most they could do is try to arbitrarily make the licenses null and void, but there’s no functional way to outlaw making code publicly available without also outlawing the entirety of HTML, CSS, and JavaScript.

13 points

Unfortunately, as I’ve learned recently, it doesn’t look like Deepseek is actually open source.

You can download the model, but unless I’m misunderstanding, that feels comparable to calling Photoshop open source because you can download the .exe file to your computer.

14 points

It’s MIT licensed, meaning the code is open but the license is permissive: copies can subsequently be closed. That’s unlike the GPL, the license most commonly associated with open source code.

11 points

The weights are MIT licensed. The code is, too, but the code for these things is uninteresting.

The training data is not open source, and that’s the interesting part of a model.

4 points

You can reweight it as you please on whatever dataset you like. They can say what the training data included, but they can’t share the dataset itself.
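
For anyone curious, that kind of reweighting in practice means fine-tuning the released weights on your own data. A rough sketch with Hugging Face transformers + peft (LoRA); the checkpoint name and corpus file here are placeholder assumptions, not anything DeepSeek publishes:

```python
# Minimal LoRA fine-tuning sketch (illustrative; model and corpus names are placeholders).
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "deepseek-ai/DeepSeek-R1-Distill-Llama-8B"  # assumed distilled checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # Llama-style tokenizers often ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Attach small trainable LoRA adapters instead of retraining every weight.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Any text corpus of your own choosing; the original training set stays unavailable.
data = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=1, num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The point being: what they released is enough to adapt, but not enough to reproduce.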

5 points

This comment here seems to summarize it well: https://github.com/deepseek-ai/DeepSeek-V3/issues/457#issuecomment-2627016777

It’s more open-sourced than I thought, but also seems debatable. I don’t know enough about LLMs to properly judge. I would probably stay away from calling it “completely open-sourced” though.

55 points

Well now I’m just gonna download it even harder.

4 points

I will download and upload it everywhere I can, if this gains traction.

14 points

I had zero interest in downloading this shit before because LLMs are just lying slop machines, but if it becomes illegal, I will go out of my way to.

2 points

I’m with you on principle, but the thing is also like 600 GB of data. Not sure I have the disk space to take a stand on this one.


There are no lite versions? I was trying to find a small LLM I can run on an old machine, take off the internet (or just firewall it), and play around with to see if there’s anything worth learning there for me. I was looking at the lite version of Llama, but when I tried to run the install on Mint I ran into some issues and then had too many drinks to focus on it, so I went back to something else. Maybe next weekend. If you have any recommendations, I’m all ears.

1 point

There are finetunes of Llama, Qwen, etc., based on DeepSeek that implement the same pre-response thinking logic, but they are ultimately still the smaller models with some tuning. If you want to run locally and don’t have tens of thousands to throw at datacenter-scale GPUs, those are your best options, but they differ from what you’d get in the DeepSeek app.
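
If you just want to kick the tires, here’s a rough local-inference sketch with the Hugging Face transformers library (the exact checkpoint name is my assumption; pick whatever size your machine can actually hold):

```python
# Run a small DeepSeek-R1 distill locally; works offline once the weights are cached.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",  # assumed checkpoint; other sizes exist
    device_map="auto",   # GPU if available, otherwise CPU (slow but functional)
    torch_dtype="auto",
)

prompt = "Explain the difference between open weights and open source in two sentences."
print(generate(prompt, max_new_tokens=256)[0]["generated_text"])
```

Once the weights have downloaded you can firewall the box and it should keep working, since inference is entirely local.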

2 points

No worries. I’ve got some 20TB drives I can throw these on 😁

Now, running a model that large… Well, I’ll just have to stick with the 8-13B parameter models.

1 point

You can actually run the model, but it just goes very slowly… I run the 70B model on my M1 MBP, and it technically “requires” 128GB of VRAM - it still runs, just not super fast (though I’d say it’s usable in this case at about 1 word per ~300ms).
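
(For reference, 1 word per ~300ms is roughly 3 words per second; assuming ~1.3 tokens per word, that works out to around 4 tokens per second.)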


I’d have to remove all my games, the operating system, and run my SSDs in RAID to be able to fit that and still generate things with it. 😮

18 points

These models are mostly giant tables of weights that run on standardized framework software, right?

We’re talking about making illegal numbers again, aren’t we?

09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0
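
For anyone doubting it: a checkpoint really is just named arrays of numbers packed in a standard container. A quick illustrative peek (the safetensors file name here is a placeholder):

```python
# Look inside a model checkpoint: nothing but named tensors, i.e. big tables of numbers.
# ("model.safetensors" is a placeholder file name for illustration.)
from safetensors import safe_open

with safe_open("model.safetensors", framework="pt") as f:
    for name in f.keys():
        t = f.get_tensor(name)
        print(name, tuple(t.shape), t.dtype)
```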

3 points

We’re going to need a bigger shirt.

3 points

It’s funny what happened with AACS; they didn’t learn.

DeCSS was the same before that. They sure taught us, eh? 🖤🏴‍☠️🖤

