FallingBruh t1_j9kqkju wrote
Anyone who's super into quantum computing and thinks it's the future, please watch Sabine Hossenfelder's videos on QC. They're insightful.
kalakau t1_j9mfcqd wrote
As a practicing physicist, I'd urge you and others to be aware that Sabine is intentionally contrarian in order to generate revenue. She's a populist capitalist, not a practicing physicist, and her content should be understood as entertainment, not necessarily as educational content, and certainly not as academic consensus. She often misrepresents (and in certain cases is entirely wrong about) research in fields where she has no expertise. There are lengthy threads in r/physics discussing this.
Hostilis_ t1_j9niuzt wrote
She absolutely is a practicing physicist lol. She's also a far better science communicator than NDT or Michio Kaku, but she gets vitriol like this because, unlike those two, she's a staunch critic of the particle physics community. And for good reason.
Betaparticlemale t1_j9md91l wrote
She just likes to shit on everything while conveniently leaving out her own favored unprovable theory. It’s also not entirely about quantum computation. Quantum sensing technology is important too.
LongLightning t1_ja3xh9m wrote
Sabine is a slightly controversial recommendation. She's quite well known in the physics community for coming across as having very definite opinions about how science should be done, and she presents those opinions in her videos in an extremely authoritative way. Because her audience is so broad, her opinions are sometimes taken as received wisdom when they are just opinions. When talking to other PhDs she tends to come off as, if I'm generous, incredibly frank; if I'm less generous, rude and dismissive. Being a contrarian, within bounds, benefits her career, so she has leaned into it. But the end result is something that is more entertaining than strictly educational. I would be more critical if she were lecturing college students like that. I understand she has to make money, but I wish it didn't come at the cost of education.
UniversalMomentum t1_j9l3r1e wrote
Quantum sounds useful for some stuff, but realistically silicon can do so much and will still improve. The limitations right now are clearly in programming, not really the chips.
People have this way of thinking MORE is always better/useful, but it's not. The easiest thing that gets the job done is the most useful. The simplest design that does the job is better than the complex design that does more than you need. Getting that through to most people is hard; getting it through to a bunch of future-tech fans is even harder.
The path of least resistance is the truly proven strategy, and that also means the path of least complexity. It's like how simplifying a math problem is better logic than leaving it as complex as possible, but applied to engineering and cost of operation.
techhouseliving t1_j9m1r58 wrote
I don't think you understand quantum.
DeepState_Secretary t1_j9m7cnh wrote
That’s not the point of quantum computers.
It's not about making faster classical computers, but rather that quantum computers could potentially solve problems and do things that classical computers cannot practically do, regardless of how good they are.
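To make "cannot practically do" concrete: for unstructured search over N items, a classical machine needs about N/2 oracle queries on average while Grover's algorithm needs about (π/4)·√N. A back-of-the-envelope sketch (the formulas are standard; the specific N values are just illustrative):

```python
import math

# Expected oracle queries to find one marked item among n,
# classically vs. with Grover's algorithm.
def classical_queries(n):
    return n / 2  # average-case linear scan

def grover_queries(n):
    return (math.pi / 4) * math.sqrt(n)  # quadratic speedup

for n in (10**6, 10**12):
    print(f"N={n:.0e}: classical ~{classical_queries(n):,.0f}, "
          f"Grover ~{grover_queries(n):,.0f}")
```

For a trillion items that's roughly 500 billion classical queries versus under a million quantum ones, and for problems like factoring (Shor's algorithm) the gap is exponential rather than quadratic.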
Fallacy_Spotted t1_j9mfab1 wrote
To be honest, better hardware has enabled worse software to an astounding degree. So much of it is a hot mess compared to the truly important stuff like BIOS, network switch, and compiler code.
SnapcasterWizard t1_j9nh1hr wrote
Hey, what do you mean I don't need to ship an entire browser-stack just so my chat application can render shit with javascript?!!?
Literature-South t1_j9lfxq2 wrote
Chips are now so small that we're running into errors caused by quantum tunneling. Silicon is really at its limit, and Moore's law has slowed considerably because of it.
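The tunneling problem is brutally exponential in barrier width, which is why shrinking gate dimensions eventually means leakage. A rough WKB-style estimate for an electron and a rectangular barrier (the 1 eV barrier height and the widths below are illustrative assumptions, not numbers for any specific process node):

```python
import math

# WKB estimate of electron tunneling through a rectangular barrier:
#   T ~ exp(-2 * kappa * d),  kappa = sqrt(2 * m * E_b) / hbar
M_E = 9.109e-31    # electron mass, kg
HBAR = 1.055e-34   # reduced Planck constant, J*s
EV = 1.602e-19     # joules per electronvolt

def tunneling_probability(barrier_ev, width_nm):
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (5.0, 2.0, 1.0):  # barrier widths in nm
    print(f"{d:.0f} nm barrier: T ~ {tunneling_probability(1.0, d):.1e}")
```

Going from a 5 nm to a 1 nm barrier raises the tunneling probability by many orders of magnitude, which is the basic reason gate leakage explodes as features shrink.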
Deadboy00 t1_j9lh8vl wrote
True. But engineers have come up with some clever ways to get around it and still offer performance gains.
Quantum computing is for problems that don’t have a clear solution. Classical computing isn’t going anywhere even as we look far into the future.
MINIMAN10001 t1_j9mwubg wrote
I mean, the whole point of "Moore's law is dead" was that Moore's law is, in fact, dead. It wasn't the end of scaling, but the end of the self-fulfilling prophecy that the industry targeted as the rate of scaling for decades.
It's not the end of transistor scaling, but the end of Moore's law itself; the golden age has come to a close, and odds are the respective companies have already been working for years on what they consider the solution going forward.
AMD is looking to stack compute with memory. Nvidia is looking into AI-based image scaling.
mannaman15 t1_j9n912q wrote
Happy cake day! Also our user names are closer than any other I’ve seen on here. 🍻
anally_ExpressUrself t1_j9nzyko wrote
What is AI based image scaling?