2022 was the year AGI arrived (Just don't call it that)
Submitted by sideways t3_103hwns in singularity
ebolathrowawayy t1_j32twcm wrote
I just don't see it as novel if a customer asks you to build them a website with a data dashboard. I think the majority of the work is cobbling together small pieces in very slightly new ways, and that the value mostly comes from displaying domain data, connecting data to other data, or connecting users to other users.
If a majority of software work required novel problem solving, then I don't think very popular and widely used libraries and tools like React, Angular, Tableau, Squarespace, Unity, etc. would exist. Today's developer picks a couple of libraries, slaps together some premade components, writes a parser for the customer's data, and wires that data into those components. I really do think the majority of the work can be done by following Medium articles and Stack Overflow posts. Something like the sketch below is most of it.
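To be concrete, this is roughly the shape of that work. Just a sketch: I'm assuming React plus recharts, and the /api/metrics.csv endpoint and the month/signups column names are made up for illustration.

```tsx
// Sketch of a "typical" dashboard feature: premade chart components plus a
// small custom parser for the customer's data. Endpoint and column names
// are hypothetical.
import { useEffect, useState } from "react";
import { LineChart, Line, XAxis, YAxis, Tooltip } from "recharts";

type Row = { month: string; signups: number };

// The customer-specific part: reshape their CSV into whatever the
// premade chart component expects.
function parseCsv(text: string): Row[] {
  const [header, ...lines] = text.trim().split("\n");
  const cols = header.split(",");
  return lines.map((line) => {
    const cells = line.split(",");
    return {
      month: cells[cols.indexOf("month")],
      signups: Number(cells[cols.indexOf("signups")]),
    };
  });
}

export function SignupsDashboard() {
  const [rows, setRows] = useState<Row[]>([]);

  useEffect(() => {
    // Fetch the customer's data and run it through the parser.
    fetch("/api/metrics.csv")
      .then((res) => res.text())
      .then((text) => setRows(parseCsv(text)));
  }, []);

  // Premade components do the actual rendering work.
  return (
    <LineChart width={600} height={300} data={rows}>
      <XAxis dataKey="month" />
      <YAxis />
      <Tooltip />
      <Line type="monotone" dataKey="signups" stroke="#555" />
    </LineChart>
  );
}
```

The only part the customer is really paying for is the little parser; the chart, the hooks, and the fetch are all premade pieces and copy-paste patterns.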
Even gamedev, widely considered to be "hard", is really not that novel. It's composed of a bunch of small pieces of code that everyone uses, and most AAA games don't deviate from typical game design patterns. They innovate by pouring money into small details, like the horse ball physics in RDR2, or by hiring 1,000 voice actors, or by creating hundreds of "theme park" side quests that feel amazing, or by doubling the number of 3D assets relative to the last record-holding game. Those aren't actually novel things; they're money and time sinks, but they're not difficult to implement.
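And the "typical patterns" really are boilerplate. Here's the update/render loop that basically every engine or tutorial hands you in some form (sketch only; Entity and the canvas setup are placeholder names, not any particular engine's API):

```ts
// A bare-bones fixed-loop sketch: simulate every entity, then draw it,
// every frame. Names are placeholders for illustration.
interface Entity {
  update(dt: number): void;
  render(ctx: CanvasRenderingContext2D): void;
}

const entities: Entity[] = [];
const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

let last = performance.now();
function frame(now: number) {
  const dt = (now - last) / 1000; // seconds since the previous frame
  last = now;

  // The same two passes every game makes: simulate, then draw.
  for (const e of entities) e.update(dt);
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  for (const e of entities) e.render(ctx);

  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```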
If we're talking Netflix scale, then yeah, that's still novel and not easily done, but 90% of devs aren't doing that. The reason it's difficult is that there aren't many resources on how to go about doing it at scale or what the tradeoffs of different stacks are. If it were as deeply and widely documented as React apps are, it would be trivial for an LLM to do.
I think the novel software problems that are difficult to automate are the ones that advance the current SOTA: new ML algorithms, AI systems that solve previously intractable problems (protein folding), really anything that can't be easily googled. (Edit: for the near future. Once AGI/ASI arrives, all bets are off.)
I think a useful rule of thumb for whether or not something can be automated is this: if it's well-documented, it's automatable.
I'm not arguing just to argue and I'm sorry if I come across that way. We've had SW team conversations about this at work a few times and I think about it a lot.