https://public.dhe.ibm.com/ibmdl/export/pub/software/server/ibm-ai/conda/#/

On the face of it, the ability to run models larger than GPU memory seems extremely valuable; not everyone has an 80 GB GPU. Why did they give up?

Was the performance too slow?
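For context, the general technique behind "large model support" is tensor offloading: parameters live in host memory and are streamed to the GPU only while they are needed, so the whole model never has to fit in GPU memory at once. The snippet below is a minimal sketch of that idea in PyTorch, not IBM's actual implementation; the layer sizes and the `offloaded_forward` helper are illustrative choices, and it falls back to CPU when no GPU is present.

```python
import torch
import torch.nn as nn

# Sketch of the offloading idea: all layers are kept in host (CPU) memory,
# and each one is copied to the GPU only for the duration of its forward
# pass, then evicted. Peak GPU memory is roughly one layer plus activations.

device = "cuda" if torch.cuda.is_available() else "cpu"

# Model lives on the CPU; sizes here are arbitrary for illustration.
layers = nn.ModuleList([nn.Linear(256, 256) for _ in range(8)])

def offloaded_forward(x: torch.Tensor) -> torch.Tensor:
    x = x.to(device)
    for layer in layers:
        layer.to(device)   # copy this layer's weights host -> GPU
        x = layer(x)
        layer.to("cpu")    # evict weights to free GPU memory for the next layer
    return x.cpu()

out = offloaded_forward(torch.randn(4, 256))
print(out.shape)  # torch.Size([4, 256])
```

The obvious cost is the host-to-device transfer on every layer each forward pass, which is the usual suspect when a question like "was the performance too slow?" comes up for this class of tools.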
