Mokebe890 t1_irao9bk wrote
Sure, I agree with you, but I'd also like to hear a reality check.
Will progress always go up? Are we really creating AGI, and is it not far away? Won't the energy crisis bury the dream of the Singularity?
175ParkAvenue t1_irb0gk7 wrote
Those are fair questions. We don't know for sure if this boom will continue all the way to AGI and the Singularity. But that's what it looks like.
In my opinion only something on the scale of a US - Russia nuclear war could delay the Singularity by a few decades.
Something like the invasion of Taiwan could delay it by a few years maybe.
Localized wars, pandemics or recessions probably will not delay it significantly.
dreamedio t1_ircoti1 wrote
Pretty sure people didn't foresee the delays in space exploration either
Professional-Song216 t1_ircs46e wrote
Yea but what problems does space exploration solve?
TheSingulatarian t1_irdauu4 wrote
Resource scarcity.
Professional-Song216 t1_irdb7y5 wrote
I can understand that but what resource are we immediately running out of now?
ttrrraway t1_ire22fp wrote
I think they're right in a way. If we had already terraformed Mars, for example, we could have an overabundance of food and water, and we'd be able to spread pollution across different planets, solving global warming and other problems.
Similarly, there are planets and asteroids with enormous amounts of gold and other metals.
So, yes, in a sense, becoming an interplanetary species could have solved lots of issues.
But, on the other hand, becoming an interplanetary species is a much harder and more expensive task. Creating super intelligent AI appears simpler at first glance, and will also take us to space eventually.
Professional-Song216 t1_ire2anc wrote
I see where you are coming from, and I agree with your last statement. I'm of the opinion that issues like energy, disease and intelligence should be solved here before we can reasonably think about exploring the stars.
langolier27 t1_ird6uzp wrote
Pretty much all of them.
Professional-Song216 t1_ird8lgx wrote
How? We can hardly go anywhere at the moment. Name 3 solid problems it would solve that can’t be solved otherwise.
langolier27 t1_irdbmnw wrote
If we became a multi-planetary species it would solve literally every problem. One of the main reasons for pursuing AGI is to help us advance with space exploration.
sideways t1_ircvo8p wrote
It's possible that war could accelerate things.
Ezekiel_W t1_irb0q5i wrote
Unless nuclear annihilation happens, we will achieve AGI this decade. I can't speak for the rest of the world, but here in the US we are among the biggest producers and exporters of energy in the world, so I doubt that will be a problem.
slobbowitz t1_irdgiqb wrote
To quote Morrissey, “Come, come, come, nuclear bomb”
dreamedio t1_ircp0kl wrote
This decade? Lol
Ezekiel_W t1_ircxdec wrote
I am honestly surprised anyone who does research into this subject would think otherwise. I am always open to hearing others' ideas on the subject, though.
Halperwire t1_ird0qea wrote
Yep Kurzweil says 2029 right?
ScaryPratchett t1_irb37ze wrote
I'd say we're at a quasi-singularity phase in that for the first time things are progressing faster than I anticipated -- I was pretty surprised when I put in a prompt and actually got what I asked for (though it is spotty if you don't game the algorithms correctly). Also, I'm of the opinion some of the skeptics are glossing over what proprietary techs are probably out there on the cutting edge.
arevealingrainbow t1_irdsjjj wrote
>Will progress always go up?
Likely no. Humanity will still be limited by what is physically possible. The exception is if we find a way to create new universes or travel to other ones; then there is likely no roof to our progress. Either way, this progress cap is so far in the future it is on absolutely nobody's radar.
>Are we really creating AGI and it’s not far away?
Hard to tell. I am in the camp that AGI will likely happen in the 2060s, because that is the scholarly consensus among machine learning experts. We will likely achieve many things with AI far earlier than we would have assumed possible, but still fall behind on actually achieving AGI.
>Won’t the energy crisis bury the dream of the Singularity?
Probably not. Humanity’s ability to create and output energy is increasing all the time. Considering increasing energy output and a stagnating global population, it likely won’t be an issue. Especially since models are likely to become much more energy efficient.
Lone-Pine t1_ire5szm wrote
> AGI will likely happen in the 2060’s because that is the scholarly consensus among machine learning experts.
They only run these polls every few years. I'm certain that if a poll of ML engineers/scientists were run today, the average would be in the 2040s. Most of the more vocal people in the industry (Sam Altman, Demis Hassabis) regularly predict very short timelines on Twitter.
Mokebe890 t1_irdt1uj wrote
Well, I mean more like 10-20 years ahead, not that far in the future though.
Could you provide some insight? I'm in the camp of 2030, or 2030-2040, but I'd like to see some papers arguing for dates as far out as 2060. Looking at current models, or following Altman, you could think it will come a lot sooner.
Good point, but take Europe for example: combine the war in Ukraine with the fact that almost every country here will face an energy crisis this winter, going as far as closing schools and other facilities.
arevealingrainbow t1_irdthee wrote
The energy crisis that Europe is facing is a temporary speedbump, like the oil crisis of the 70’s for the US. This will accelerate the transition to green energy. With this accelerating transition and Fusion research, I estimate that pretty much all of Humanity’s energy woes will be entirely eliminated by 2100.
I made my guess of 2060 as a rough average of when experts think we will create a superintelligence. The "singularity never" crowd is a minority that really skews that estimate later.
In 10-20 years, yeah progress will continue to accelerate.
AstronautOk1143 t1_irb4ghl wrote
I don't think we have much of an energy crisis. It could be easily solved once politicians stop listening to moronic environmentalists who just want to ban everything. Focus on nuclear and other sources wherever it makes sense. Progress will not always go up, but we have no idea where the limit lies, so we cannot say it will halt anytime soon; it's more likely to continue an upward trend than to halt. AGI, who the hell knows; however, we don't need AGI for current technology to severely impact society.
[deleted] t1_irca2uk wrote
[deleted]
AstronautOk1143 t1_ircosoc wrote
Nah, that's the reason we have an energy crisis. Environmentalists spotted the problem, but they are the worst at coming up with a solution. They are the whole reason Germany got rid of its nuclear power and became Russia's bitch. Environmentalists are idealistic and extremely bad problem solvers, incapable of finding middle ground or adopting any reasonable solution. They even complain about wind and solar, come on. We are super close to nuclear Armageddon because they pressured Germany's government so much and spineless politicians listened.
[deleted] t1_ircqwkb wrote
[deleted]