Submitted by DonOfTheDarkNight t3_118emg7 in singularity
Kolinnor t1_j9gpzre wrote
He does have very good points, and he's very interesting, with brilliant ideas about alignment.
Overall, all the Lesswrong philosophy is a little too "meta" and confusing if you've not studied the jargon, so I'm a bit annoyed sometimes when I read it and realize, in the end, that they could have said the same ideas with less sophisticated phrasing.
That said, while I don't agree with the conclusions he reaches (and he reaches them with too much confidence for my taste), I've learned quite a number of things about alignment reading him. Definitely a must-read for singularity and alignment even if you don't agree with him.
FestiveHydra235 t1_j9kdn6r wrote
Thank you for pointing out the jargon. His blog is incredibly difficult to read. And it’s not difficult to read because he’s so smart, it’s difficult because it’s such convoluted writing
CellWithoutCulture t1_j9noid8 wrote
Yeah, the jargon and meta rambling is so annoying. It's like their first priority is to show off their brains, and their second priority is to align AGI. Now they are almost finished showing off their brains, so watch out AGI.
Sometimes they behave in a silly fashion. Greek philosophers had excellent logic and deduced all kinds of wrong things. These guys seem similar at times, trying to deduce everything with philosophy and mega brains.
IMO they are at their best when they write in short form and ground it in empirical data.
There is also a lesswrong podcast or two that will read out some of the longer stuff.