damattdanman t1_iwga7yk wrote
What do they get these supercomputers to do? Like, what calculations are they running for this kind of power to make sense?
emize t1_iwgbm0r wrote
While not exciting, weather prediction and analysis is a big one.
Astrophysics is another popular one.
Anything where you need to do calculations that have large numbers of variables.
asdechlpc t1_iwhlzr3 wrote
Another big one is high resolution fluid simulations
atxweirdo t1_iwhowxd wrote
Bioinformatics and ML have taken off in recent years. Not to mention data analytics for research projects. I used to work for a supercomputer center. Lots of interesting projects came through our queues.
paypaytr t1_iwj6zbm wrote
For ML this is useless though. They don't need supercomputers, just clusters of efficient GPUs.
DeadFIL t1_iwjflz1 wrote
All modern supercomputers are just massive clusters of nodes, and this list includes GPU-based supercomputers. Check out #4 on the list: Leonardo, which is basically just a cluster of ~3,500 Nvidia A100-based nodes.
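To give a rough sense of how an ML job actually uses a machine like that, here's a minimal sketch of a common data-parallel setup (my assumption: PyTorch with a torchrun launch and NCCL - not how Leonardo is specifically configured). Every GPU runs the same script on its own slice of data, and the gradients get averaged across all of them:

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, WORLD_SIZE, and LOCAL_RANK; NCCL moves gradients between GPUs
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # toy stand-in for a real network - this is where the actual model would go
    model = DDP(torch.nn.Linear(1024, 1024).cuda(), device_ids=[local_rank])
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    for step in range(100):
        x = torch.randn(64, 1024, device="cuda")   # each rank trains on its own batch
        loss = model(x).pow(2).mean()               # dummy loss, just for the sketch
        opt.zero_grad()
        loss.backward()                             # gradients are all-reduced across every GPU
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nnodes=<nodes> --nproc_per_node=<GPUs per node> train.py` on each node, the same pattern scales from two GPUs on a desktop to thousands of A100 nodes on a system like Leonardo.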
My_reddit_account_v3 t1_iwjmbs9 wrote
Ok, but why would supercomputers suck? Are they not equipped with arrays of GPUs as well?
DeadFIL t1_iwjpc1o wrote
Supercomputers cost a lot of money and are generally funded for specific reasons. They're usually not very general purpose, but rather purpose-built to be as good as possible at one class of task. Some will have a lot of CPUs, some a lot of GPUs, some a lot of both, and some have completely different types of units that are custom built for a specific job.
It all depends on the supercomputer, but some aren't designed to excel at ML workloads. Any of them will do wayyyy better than your home computer due to sheer processing power, but many will be relatively inefficient at it.
My_reddit_account_v3 t1_iwjshyb wrote
Right. I guess what you are saying is you prefer to control the composition of the array of CPUs/GPUs, rather than rely on a “static” supercomputer, right?
QuentinUK t1_iwgdsbh wrote
Oak Ridge National Laboratory: materials, nuclear science, neutron science, energy, high-performance computing, systems biology and national security.
damattdanman t1_iwgegbh wrote
I get the rest. But national security?
StrategicBlenderBall t1_iwgigpd wrote
Nuclear forensics, nonproliferation modeling, etc.
nuclear_splines t1_iwgik6l wrote
Goes with the rest - precise simulations of nuclear material are often highly classified. Sometimes it's also things like "simulating the spread of a bioweapon attack, using AT&T's cell tower data to get high-precision info about population density across an entire city."
Ok-disaster2022 t1_iwiccqm wrote
Well, there are numerous nuclear modeling codes, but one of the biggest and most validated is MCNP. The team in charge of it has accepted bug fix reports from researchers around the world regardless of whether they're allowed to have access to the files and data, export control be damned. Hell, the most important part is the cross section libraries (which cut out above 2 MeV), and you can access those on a public website.
I'm sure there are top secret codes, but it costs millions to build and validate codes and keep them up to date, and there's no profit in nuclear. In aerospace the modeling software is proprietary, but that's because it's how those companies make billion-dollar airplane deals.
nuclear_splines t1_iwid0jw wrote
Yeah, I wasn’t thinking of the code being proprietary, but the data. One of my friends is a nuclear engineer, and as an undergraduate student she had to pass a background check before the DoE would mail her a DVD containing high-accuracy data on measurements of nuclear material, because that’s not shared publicly. Not my background, so I don’t know precisely what the measurements were, but I imagine data on weapons grade materials is protected more thoroughly than the reactor tech she was working with.
Defoler t1_iwge3xy wrote
Huge financial models.
Nuclear models.
Environment models.
Things with millions upon millions of data points that you need to recalculate each turn.
blyatseeker t1_iwhpceq wrote
Each turn? Are they playing one match of civilization?
Defoler t1_iwhsogi wrote
Civ 7 with 1000 random PC faction players on an extra-ultra-max size map and barbarians on maximum.
That is still a bit tight for a supercomputer to run, but they are doing their best.
Broadband- t1_iwgcmvh wrote
Nuclear detonation modelling
0biwanCannoli t1_iwhby3f wrote
They’re trying to play Star Citizen.
johnp299 t1_iwhl0th wrote
Mostly porn deepfakes and HPC benchmarks.
Ok-disaster2022 t1_iwibhfx wrote
For some models, instead of attempting to derive a sexy analytical formulation, you take random numbers, assign them to certain properties of a given particle, and use other random numbers to decide how that particle acts. Do this billions of times and you can build a pretty reliable, detailed model of weather patterns or nuclear reactors or whatever - that's the basic idea behind Monte Carlo methods.
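A toy version of that idea (my own illustration, nowhere near a real production code like MCNP - the slab thickness, mean free path, and absorption probability below are made-up numbers): estimate how many particles make it through a 1D shield by following each one with random numbers.

```python
import random

def simulate_particle(slab_thickness, mean_free_path, absorb_prob):
    """Follow one particle until it escapes the slab or is absorbed."""
    x = 0.0
    direction = 1.0  # start heading into the slab
    while True:
        # random distance to the next collision (exponentially distributed)
        x += direction * random.expovariate(1.0 / mean_free_path)
        if x >= slab_thickness:
            return "transmitted"
        if x < 0.0:
            return "reflected"
        # at a collision the particle is either absorbed or scattered
        if random.random() < absorb_prob:
            return "absorbed"
        direction = random.choice([-1.0, 1.0])  # scatter forward or backward

def estimate(n_particles=1_000_000):
    counts = {"transmitted": 0, "reflected": 0, "absorbed": 0}
    for _ in range(n_particles):
        fate = simulate_particle(slab_thickness=5.0, mean_free_path=1.0, absorb_prob=0.3)
        counts[fate] += 1
    return {fate: n / n_particles for fate, n in counts.items()}

if __name__ == "__main__":
    print(estimate())
```

The real codes track energy, 3D geometry, and measured cross sections, and spread billions of these histories across thousands of nodes, but the core loop is the same: sample, tally, repeat.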
These supercomputers will rarely be used all at once for a single calculation. Instead, the different research groups may be given certain amounts of compute resources according to a set schedule. A big deal at DOE supercomputing centers is making sure there isn't idle time. It costs millions to power and cool the systems, and letting them sit idle is pretty costly. The same can be said for universities and such.
My_reddit_account_v3 t1_iwjmtbc wrote
My former employer would run simulations for new models of their products (e.g. identifying design flaws in aerodynamics). Every ounce of power reduced the lead time to get all our results for a given model / design iteration. I didn't understand anything that was actually going on there, but I know our lead times depended heavily on the "super power" 😅