Dec 26, 2024
9:35:15am
AggieWeekendCougar All-American
If you have the cash, get an Amazon refurb 3090 Ti. I have one at home and two in my lab at work, all bought this way. I got them for between $899 and $1,050; right now they're around $1,100.

Now, some caveats. This advice applies if you are doing a lot of training/fine-tuning of the model. This card has 24 GB of VRAM, which is rare for consumer-grade cards. It also has the Ampere architecture, which can do mixed-precision FP16 computation, letting you fit bigger models and bigger data batches on the card.
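To make the FP16 point concrete, here's a back-of-envelope VRAM estimate (all numbers are illustrative assumptions, not benchmarks, and this simplifies real mixed-precision training, which also keeps FP32 master weights):

```python
# Rough VRAM math: an FP16 parameter takes 2 bytes instead of 4, and
# during training you also hold gradients plus optimizer state.
def training_vram_gb(n_params, bytes_per_param, overhead_factor=4):
    # overhead_factor ~4 is a loose assumption: weights + gradients
    # + two Adam-style optimizer moments; activations come on top.
    return n_params * bytes_per_param * overhead_factor / 1e9

# Hypothetical 1-billion-parameter model:
fp32_gb = training_vram_gb(1_000_000_000, 4)  # 16 GB before activations
fp16_gb = training_vram_gb(1_000_000_000, 2)  # 8 GB before activations
```

On a 24 GB card, that difference is what lets the bigger model or the bigger batch actually fit.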

Because the tutorial uses YOLO and not a network with memory (like an RNN or LSTM), you can feed the pictures through in any order, even during inference, without a problem. This is where a card with a lot of memory shines: you push over a large batch of frames, let it run, then pull back the results.
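A tiny sketch of why statelessness matters for batching. `fake_detect` is just a stand-in for a real YOLO forward pass (counting bright pixels, purely for illustration); the point is that each frame's result depends only on that frame, so batch order is irrelevant:

```python
# Stand-in "detector": the result for a frame depends only on that frame.
def fake_detect(frame):
    return sum(1 for px in frame if px > 128)

def detect_batch(frames):
    # With a big-VRAM card this batch can be large; outputs come back
    # per frame, independent of every other frame in the batch.
    return [fake_detect(f) for f in frames]

frames = [[0, 200, 255], [130, 1], [255, 255, 255]]
results = detect_batch(frames)            # [2, 1, 3]
reversed_results = detect_batch(frames[::-1])
```

Reversing the batch just reverses the outputs; with an RNN/LSTM, where hidden state carries across inputs, that would not hold.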

The other thing to worry about during training/fine-tuning is that the k-means used to auto-generate the training set is likely done on the CPU, not the GPU. That could be a bottleneck for a large dataset: even if the model trains well and inference/deployment is fast on the GPU, generating the training set could take a very long time.
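For reference, here is roughly what that CPU-side k-means step looks like. This is a hypothetical NumPy sketch over box (width, height) pairs using plain Euclidean distance for simplicity (YOLO's own anchor scripts typically use a 1 − IoU distance instead), with deterministic init so it's reproducible:

```python
import numpy as np

def kmeans_anchors(wh, k=3, iters=50):
    """Lloyd's k-means over (width, height) box sizes; CPU-only."""
    centroids = wh[:k].copy()  # simple deterministic init: first k boxes
    for _ in range(iters):
        # Distance from every box to every centroid: O(n * k) per pass,
        # which is why this step scales badly with dataset size.
        d = np.linalg.norm(wh[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # keep old centroid if cluster empties
                centroids[j] = wh[labels == j].mean(axis=0)
    return centroids

# Fake box sizes standing in for a real dataset's ground-truth labels.
wh = np.array([[10, 12], [11, 13], [50, 60],
               [52, 58], [100, 90], [98, 95]], dtype=float)
anchors = kmeans_anchors(wh, k=3)
```

Each full pass touches every labeled box, so on a dataset with millions of boxes this loop is where the CPU time goes.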
