Submitted by phriot t3_ykcjwo in singularity
Thorusss t1_iusxlia wrote
I mean, if you want it and look for it, you can buy a ridiculous number of objects with an additional microchip in them. Heated insoles: chip. Camera in glasses: chip. T-shirt that measures heartbeat: chip. Ring that measures temperature and movement: chip. Implant for paying: chip. Light-up shoes: chip. Jacket with speakers: chip.
I mean, many packages have a microchip in the small security sticker that most people never notice.
phriot OP t1_iusytqh wrote
From what I read, I took the intent to be not just that everything would have an RFID chip, but that computation would be "everywhere" rather than centralized in devices. I assume that Kurzweil and others saw this as a trend because they experienced the decentralization of computing in their lifetimes, from mainframes, to time sharing, to PCs, to the internet. Today, the ability to compute anywhere persists thanks to the rise of wireless internet access, but the actual computation happens in recognizable devices, if not in a very centralized data center somewhere.
The Internet of Things does, of course, exist, but I don't think it's yet at the point envisioned by these futurists 20 years ago. My question is: did they miss the mark (i.e., we'll keep computation centralized), or are we early (e.g., applications haven't yet caught up to our ability to infuse reality with chips)?
Edit (catching up with your edit): You do note a number of devices that "could" be smart today. In practice, they aren't, yet. I don't know anyone, personally, with a glucose-monitoring t-shirt, kinetic-energy-harvesting sneakers, or a palm-embedded NFC chip. The tech exists, but it hasn't spread within the timeframe written about in the books I reference.
sumane12 t1_iutqkbc wrote
OK, I see where you're coming from now, and I think it's a bit of both: computation is more centralized than they expected (let's be honest, computation will always be cheaper in a massive data center), and they were also early as a result of that. We are constantly seeing new streams of data in areas we didn't realize we wanted data from, so I believe this trend of increasing computation will continue, but the majority of our computation will always be done in massive data centers.
I do feel you are splitting hairs a little here. Sure, Kurzweil's and Kaku's predictions for 2022 aren't exactly how they thought it would be, but the level of computation and the number of smart devices we have compared to 30 years ago is mind-boggling.