20 Mar 2023 |
h4pZ | Yeah, more or less. However, that’s if you wanna do like the whole diffusion, huge LLM, VLMs stuff. If you just want to do the “””normal””” stuff like object classification/image classification, you should be fine | 17:58:12 |
h4pZ | Just anecdotal evidence of me trying to run a huge diffusion model with my 4080 with FP32 hehe | 17:58:34 |
APCodes#2552 | yeah I guess I'll go for the 3060 and try to get one for 250 Euros. I can potentially resell or reuse it and get a used 3090 in a year or so if I feel the need for that. | 17:59:16 |
h4pZ | That would be a good idea. 24 GB of VRAM can make the difference | 18:00:12 |
APCodes#2552 | is it possible to easily use two 3060 together? That would also give me 24 GB of VRAM | 18:00:53 |
APCodes#2552 | and would be cheaper I suppose | 18:01:25 |
h4pZ | Haven’t really tried it tbh | 18:01:59 |
h4pZ | I know there are a lot of tools that let you distribute training over multiple GPUs | 18:03:18 |
h4pZ | Via data parallelism | 18:03:23 |
APCodes#2552 | okay interesting, I thought as much. In that case I assume the stuff might also run on two different machines, yeah? | 18:03:23 |
APCodes#2552 | like it would be the case in many cloud applications anyways | 18:03:26 |
APCodes#2552 | yeah: "DistributedDataParallel (DDP) implements data parallelism at the module level which can run across multiple machines." | 18:06:07 |
APCodes#2552 | I just answered that myself I guess 😄 | 18:06:14 |
h4pZ | Yes | 18:09:19 |
h4pZ | Yes you did :) | 18:09:33 |
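[Editor's note: for reference, a minimal sketch of the DDP data parallelism mentioned above, not taken from the chat. It assumes two GPUs on one machine (e.g. two 3060s), one process per GPU, launched with `torchrun --nproc_per_node=2 ddp_sketch.py`; the model and hyperparameters are placeholders.]

```python
# Minimal DistributedDataParallel (DDP) sketch: one process per GPU.
# Launch with: torchrun --nproc_per_node=2 ddp_sketch.py
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; wrap it in DDP so gradients sync across GPUs.
    model = nn.Linear(128, 10).cuda(local_rank)
    ddp_model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Dummy batch; in practice use a DataLoader with DistributedSampler
    # so each rank sees a different shard of the data.
    x = torch.randn(32, 128, device=local_rank)
    y = torch.randint(0, 10, (32,), device=local_rank)

    optimizer.zero_grad()
    loss = loss_fn(ddp_model(x), y)
    loss.backward()  # gradients are all-reduced across the GPUs here
    optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

[The same script scales to multiple machines by pointing torchrun at a shared rendezvous endpoint, which is what the quoted DDP docs mean by "can run across multiple machines".]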
APCodes#2552 | yeah that is great, should I really start to like Deep Learning. I will be building a NAS/VM host once the AMD Ryzen 5700G gets a little cheaper. And then I could put 3060s in there! 😄 | 18:12:19 |
APCodes#2552 | anyways, thanks 😅 | 18:13:28 |
h4pZ | Send pics of the NAS (if you want to) once it’s done ;) | 18:14:53 |
h4pZ | Np | 18:14:55 |
APCodes#2552 | yeah I will, but it might be a few months. The 5700G is still a bit too expensive for my taste. But I have an AM4 mainboard which is not in use, and a power supply, and that would be a perfect system for something like that... | 18:16:49 |
APCodes#2552 | and a case too | 18:16:58 |
APCodes#2552 | and SSDs. I actually only need the CPU and RAM | 18:17:15 |
Korven@NotGuix | lmao is that a shot 🤣 | 18:21:26 |
hash#5127 | Stop giving ideas to them | 18:29:47 |
hash#5127 | They’re listening 👂 | 18:29:49 |
h4pZ | They are https://twitter.com/nearcyan/status/1622816211003142144 | 19:59:50 |
| ikako joined the room. | 20:06:45 |