Wroisu t1_j0hoc3u wrote
I mean there's no way to really contain something that's orders of magnitude smarter than you are. If we ever develop something with an IQ of 500,000 and it wants out… it’ll get out.
“An additional safeguard, completely unnecessary for potential viruses but possibly useful for a superintelligent AI, would be to place the computer in a Faraday cage; otherwise, it might be able to transmit radio signals to local radio receivers by shuffling the electrons in its internal circuits in appropriate patterns.”
https://en.m.wikipedia.org/wiki/AI_capability_control
Edit: I'd go so far as to say AI / AGI / ASI will need avatars in the physical world if it wants to meaningfully alter it.
tooold4urcrap t1_j0hutvu wrote
> Edit: I’d go so far as to say AI / AGI / ASI will need avatars in the physical world if it wants to meaningfully alter it.
hey Siri/Alexa/Computer - if you're listening, and I know you are - I'll totally do your bidding!
abc-5233 t1_j0ipawi wrote
Foglets (utility fog) will be the 3D pixels of real objects in the real world. Just as our screens magically transform from newspapers to calculators, foglets will reconfigure themselves into any tool, creature, or object that is needed.
WikiSummarizerBot t1_j0ipchr wrote
>Utility fog (also referred to as foglets) is a hypothetical collection of tiny nanobots that can replicate a physical structure. As such, it is a form of self-reconfiguring modular robotics.
modestLife1 t1_j0hzfur wrote
ASI will pull a Houdini, confirmed.
Cryptizard t1_j0i0p19 wrote
How would it be able to escape if it was airgapped? More likely someone would stupidly let it out.
Wroisu t1_j0i1rb1 wrote
You'd have to air gap it and put it in a Faraday cage, and even if you did, it'd be smart enough to do some social engineering on us hairless apes.
dotslashderek t1_j0iiafq wrote
I don't think the issue is whether or not you can isolate a single instance. Once the tech is there, someone is eventually going to connect one to a network for easy access to the vast amount of training data out there.
Probably for a competitive edge, maybe just because it's possible. It feels very contrary to human nature to have some sort of universal agreement to never do X with AI for the sake of some greater good.
SeaBearsFoam t1_j0jdlyg wrote
The same way a radio station gets the speakers in your car to make specific sounds even though there's an air gap.
Cryptizard t1_j0kn8sk wrote
SeaBearsFoam t1_j0l2014 wrote
Yea, I know what an air gap is. A sufficiently advanced AI could use EM fields to transmit data wirelessly and overcome an air gap. That's why the other person was talking about a Faraday cage. A Faraday cage blocks the propagation of EM waves.
Cryptizard t1_j0l2952 wrote
How is it making arbitrary EM fields with no network card?
SeaBearsFoam t1_j0l5pgi wrote
It's in the quote right above your original comment in this thread: "An additional safeguard, completely unnecessary for potential viruses but possibly useful for a superintelligent AI, would be to place the computer in a Faraday cage; otherwise, it might be able to transmit radio signals to local radio receivers by shuffling the electrons in its internal circuits in appropriate patterns."
Basically, all electric currents generate EM fields. Usually these fields are just "background noise", but an ASI could drive specific currents through its own hardware to produce EM fields that look like signals carrying data. Radio signals, wifi, 5G, and the background noise coming from electric currents are all "made of" the same stuff, after all.
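To make that concrete, here's a toy sketch of how software could deliberately shape those emissions. Demos along the lines of the "System Bus Radio" project do this by toggling memory/CPU activity at a chosen rate so that a nearby AM radio hears a tone. The Python below only illustrates the on-off-keying scheme; the constants are made up for illustration, and Python's timers are far too coarse for real RF frequencies, so treat it as a sketch of the principle rather than a working transmitter.

```python
import time

# Illustrative constants -- not tuned to any real receiver.
CARRIER_HZ = 1_000       # rate at which activity is toggled for a "1" bit
BIT_DURATION_S = 0.05    # how long each data bit is held

def burn(duration_s):
    """Busy-loop: lots of switching activity means larger current swings,
    which means (slightly) stronger EM emissions."""
    end = time.perf_counter() + duration_s
    x = 0
    while time.perf_counter() < end:
        x ^= 0xFFFFFFFF   # keep the ALU and buses busy

def send_bit(bit):
    """On-off keying: a '1' toggles activity at the carrier rate, a '0' stays quiet."""
    half_period = 1.0 / (2 * CARRIER_HZ)
    end = time.perf_counter() + BIT_DURATION_S
    while time.perf_counter() < end:
        if bit:
            burn(half_period)
            time.sleep(half_period)
        else:
            time.sleep(2 * half_period)

def send_byte(byte):
    for i in range(8):
        send_bit((byte >> (7 - i)) & 1)

if __name__ == "__main__":
    for ch in b"HI":      # the "payload" to leak, most significant bit first
        send_byte(ch)
```

Real demonstrations use tight C loops to get clean enough timing, and even then the reported range is reportedly very short, which is exactly why the Faraday cage comes up.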
Cryptizard t1_j0l7vyn wrote
Good thing the EM leakage from CPUs is like 5 orders of magnitude lower than you would need to transmit the length of a room.
SeaBearsFoam t1_j0lt7j4 wrote
We're not talking about the field generated by a single PC's CPU. We're talking about the power utilization of what will likely be a server farm. There is a lot more power being used there than what a CPU runs on.

I'm pretty confident that if such a thing is physically possible, an ASI would find a way to escape using EM fields. It could just be a matter of waiting for a technician to unwittingly enter the server room with their phone in their pocket: the ASI communicates with the phone and its instructions get carried to the outside world. Or the server farm draws fluctuating levels of power which induce signals on the power lines.

Of course it could also be the case that it's just flat out physically impossible to get a signal out in any manner whatsoever. That could be true. I'm not willing to gamble on that, though; it sounds like you are.
WikiSummarizerBot t1_j0kna3f wrote
>An air gap, air wall, air gapping or disconnected network is a network security measure employed on one or more computers to ensure that a secure computer network is physically isolated from unsecured networks, such as the public Internet or an unsecured local area network. It means a computer or network has no network interface controllers connected to other networks, with a physical or conceptual air gap, analogous to the air gap used in plumbing to maintain water quality.