Submitted by init__27 t3_10dggxc in MachineLearning
lostmsu t1_j4lrt8a wrote
The performance/$ characteristic needs an adjustment based on longevity * utilization * electricity cost. Assuming you are going to use the card for 5 years at full load, that's $1,000-$1,500 in electricity at $1 per year per 1 W of constant use (12c/kWh). This would take care of the laughable notion that a Titan Xp is worth anything, and sort the cards much closer to their market positioning.
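As a rough sketch of that rule of thumb (in Python; the 250 W figure is just an illustrative card power draw, not a number from the post):

```python
# 1 W of constant draw ~= $1/year at $0.12/kWh.
HOURS_PER_YEAR = 24 * 365  # 8760

def watt_year_cost(watts, price_per_kwh=0.12):
    """Electricity cost of a constant draw over one year, in dollars."""
    return watts * HOURS_PER_YEAR / 1000 * price_per_kwh

print(watt_year_cost(1))        # ~1.05: about $1 per watt-year
print(5 * watt_year_cost(250))  # ~1314: a 250 W card at full load for 5 years
```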
timdettmers t1_j4lvr7i wrote
I like this idea! I already factored in fixed costs for building a desktop computer, but electricity is also an important part of the overall cost, especially if you compare it to cloud options.
I am currently gathering feedback to update the post later. It should be quick to create a chart based on this data and post an update later today.
The main problem in estimating cost is getting a good number for the GPU utilization of the average user. For PhD students, the number was about 15% utilization (fully using a GPU 15% of total time). This means, with an average of 60 watts idle and 350 watts max for an RTX 4090: 60 watts * 0.85 + 350 watts * 0.15 = 103.5 watts. That is 906 kWh per year, or about $210 per year per RTX 4090 (assuming the US average of $0.23 per kWh).
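In code, the same calculation (a minimal sketch using exactly the numbers above):

```python
# Utilization-weighted annual electricity cost for an RTX 4090.
HOURS_PER_YEAR = 24 * 365  # 8760

idle_w, max_w, utilization = 60, 350, 0.15
avg_w = idle_w * (1 - utilization) + max_w * utilization  # 103.5 W
kwh_per_year = avg_w * HOURS_PER_YEAR / 1000              # ~906 kWh
cost_per_year = kwh_per_year * 0.23                       # ~$209 at $0.23/kWh
print(avg_w, kwh_per_year, cost_per_year)
```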
Does that look good to you?
Edit: part of this seems to have gotten lost in editing. Oops! I re-added the missing details.
asdfzzz2 t1_j4nl1tn wrote
> This means, with an average of 60 watts idle and 350 watts max for an RTX 4090
RTX 4090 "idles" (stream at background) at 10-15 watt. 4k144hz monitor might change it, but 60 watt is way too much for GPU only.
init__27 OP t1_j4mavhs wrote
Oh wow, great to see you here as well, Tim 🙏
As a Kaggler, my usage varies extensively: if I end up in a deep learning competition, the usage is usually around 60-100% for 1-2 months, I would say.
I know many top Kagglers who compete year-round; I would guess their utilization percentage is the highest.
SearchAtlantis t1_j4ne6jj wrote
For what it's worth, I think 15% seems low. Having just finished an MS with deep learning in my thesis, I used mine about 25% of the time over the course of a year: quick, shallow tests for architecture and other changes, then running those changes at full depth for comparison.
anothererrta t1_j4pagpo wrote
If you go to all this trouble, please keep in mind that electricity prices vary a lot across the world. In some places in Europe people pay twice as much as you assumed above.
Making it clear how you arrive at your value calculation in an updated post (or even making it a dynamic calculator where people can enter their cost/kWh) would be very useful.
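Such a calculator might boil down to something like this (a hypothetical sketch; the performance number, card price, and kWh prices are placeholders for the reader's own inputs):

```python
# Hypothetical "dynamic calculator": performance per total dollar of ownership.
HOURS_PER_YEAR = 24 * 365

def perf_per_total_dollar(perf, card_price, avg_watts, price_per_kwh, years=5):
    """Performance divided by purchase price plus lifetime electricity."""
    electricity = avg_watts * HOURS_PER_YEAR / 1000 * price_per_kwh * years
    return perf / (card_price + electricity)

# Same placeholder card at US-style vs. European-style electricity prices.
print(perf_per_total_dollar(1.0, 1600, 103.5, 0.12))  # ~4.7e-4
print(perf_per_total_dollar(1.0, 1600, 103.5, 0.46))  # ~2.7e-4
```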
lostmsu t1_j4r942j wrote
Do you mind if I use your data to make a webpage similar to https://diskprices.com/ ?
JustOneAvailableName t1_j4pqxnx wrote
> (12c/kWh)
I wish. It is 6x more expensive here...