Submitted by johnnyjfrank t3_117g262 in singularity
Comments
bubbleofelephant t1_j9bj9iy wrote
What about a machine master with training data curated by Elon?
Iffykindofguy t1_j9bkjb8 wrote
We already have that
bubbleofelephant t1_j9bnu46 wrote
Yeah, that was my point.
Athabascad t1_j9bnz8m wrote
The Butlerian Jihad gets closer every day
MechanicalBengal t1_j9bp31k wrote
Because we’d call that a machine lead or main machine now; read the memo from HR
flying-tree-god t1_j9bzyva wrote
I'm ready to join the machines physically in the same way the cyborg generals did.
byttle t1_j9c2lgs wrote
who will we make fun of then?
stupendousman t1_j9c745i wrote
This is required to understand that quote:
"When I am weaker than you, I ask you for freedom because that is according to your principles; when I am stronger than you, I take away your freedom because that is according to my principles."
- Frank Herbert
The above quote is the status quo.
Facts_About_Cats t1_j9c9g9h wrote
That's why we need pure open source AI, to at least semi-keep up with the latest commercial thing.
PandaCommando69 t1_j9ci2y0 wrote
I think there's a good argument to be made that a superintelligent human adult would make a better ASI than a freshly born one, because the human ASI would already have experience managing themselves in the world and understanding how its systems and people work.
flying-tree-god t1_j9ci63i wrote
Worldsahellscape19 t1_j9co4xq wrote
The Dune saga has so many; I'm thinking of the Butlerian Jihad.
NanditoPapa t1_j9cyoih wrote
Dune is a work of fiction. It's not a documentary. It's not real. Everything inside the book is made up. It's not our future. 🤷🏼‍♂️
BigZaddyZ3 t1_j9cz2x9 wrote
It's not guaranteed to be our future, correct. But there's also no guarantee that it won't be, either.
NanditoPapa t1_j9d16k2 wrote
The issue, really, is that dystopia and conflict make for great stories...at least in the hands of great writers. Reading about everything going well and AI resulting in a utopic future would be boring and likely not sell well. That's why we should be careful using works of fiction to inform our world view. I'm talking to myself here too...
Ken_Sanne t1_j9d1acd wrote
This whole AI thing has got me constantly thinking about the Butlerian Jihad. I don't know what the fuck Elon is doing with Neuralink and I hate him as much as the next guy, but he is right about AI: the only way this AI thing benefits us in the long term is if we find a way to fuse with it. I hope one of those companies succeeds in their transhumanist quest before the singularity.
BigZaddyZ3 t1_j9d28k4 wrote
Right, but let's not act like works of fiction have never predicted the future before. The Simpsons alone has made some pretty accurate predictions that panned out years after the fact. The truth is that we don't really know what the future holds or which sci-fi scenarios will actually prove prophetic.
NanditoPapa t1_j9d3dwf wrote
Maybe, but prophecy tends to be true in hindsight. Literally millions of books/shows/movies have created versions of history past and future and been absolutely wrong...so pointing out the handful that got it right by luck (or pointing out an obvious trend) isn't really useful to gauge progress. Again, negative visions of the future are compelling so there is a dystopia bias in literature.
I'm an optimist, so I'm hoping for the best when the Singularity hits.
BigZaddyZ3 t1_j9d4de6 wrote
Fair points. But biased or not, we can't yet say which ones got it right and which didn't (at least for most of them). So you can't totally rule the Dune scenario out yet, right?
And I get your point about the dystopian bias, but you can't just gloss over things with your own optimism bias either. There have been actual dystopian periods in human history (the Holocaust or mass slavery, for example). There's no guarantee that we've seen the last one. In reality, things could go either way is all I'm saying.
Spire_Citron t1_j9d533e wrote
Sure, but the same would be true if I wrote a book about the future as well, even if I had no particular insight.
NanditoPapa t1_j9d5dwh wrote
Wellllll...if I'm going to have a bias, I'll choose the happy one! As for dystopian periods, it's likely we're living at the start of one now. My hope is that AI, even if it doesn't result in Singularity, will at least give some people a few tools to help create a better future. Even if that better future isn't on a grand scale. Less like Dune and more like the Foundation Series.
BigZaddyZ3 t1_j9d5j86 wrote
Right, you could literally guess it and be correct. This is why I stay open to all possibilities when it comes to the future. Especially at the current moment. The future of humanity is as wide open as it’s ever been.
BigZaddyZ3 t1_j9d6btw wrote
Lol fair enough I guess. 😅
alexiuss t1_j9d8rii wrote
The open source movement is destroying corporate AIs. This quote doesn't match the reality of AI development.
TopicRepulsive7936 t1_j9dxxhc wrote
There's no try, listen to Yoda you did not. We have to believe in the good future.
GinchAnon t1_j9e8y03 wrote
IMO the way to go is to develop a sort of cybernetic symbiote AI, the consciousness of which develops like an organic entity, from being child- or pet-like to eventually being a complementary sentience, BUT with its locus unavoidably attached to a physical implant and/or that implant's interface with a human brain. If it's designed such that its existence depends on the health and well-being of its host, and its entire conscious and pre-conscious existence is basically spent as a companion to its host... I think its interests would intuitively be aligned with the interests of the host. There would certainly be hazards in this approach: keeping it from overtaking the host, or from just being a yes-man genius that would support anything the host wanted regardless of morality or danger...
It's not a perfect idea as presented, but IMO some sort of both literal and figurative symbiosis would be the safest angle to come from overall. At least then, if everyone has their own personal AI with goals/interests aligned with theirs, that would be a start toward them being on our side, rather than a machine god that we hope is nice. Or at least we'd have helpers at that level who are on our side.
Shawnj2 t1_j9edx6f wrote
Star Trek is probably the one exception to this lol
NanditoPapa t1_j9eedkw wrote
ST is exactly what I was thinking when I wrote the "I'm talking to myself" part. The Bell Riots didn't happen and we can't depend on Cochrane to save us with the warp drive. We need to make our own future and choose progress.
[deleted] t1_j9huhrm wrote
Yes, because using a psychic worm man to make decisions worked out so much better.
Iffykindofguy t1_j9bg8gg wrote
I'll take a machine master over Elon, thanks.