blackbootz  ·  2929 days ago  ·  post: Has anyone read "Rationality: From AI to Zombies" by Eliezer Yudkowsky

I don’t think it’s possible or preferable to stop people from sometimes acting irrationally in favor of “total rationality.” Not to redraw our rationalist’s boundaries, but I don’t think being “unemotional” is even the goal.

I think a large goal of rationality is awareness: awareness of the factors that lead to and shape decision making. Judgment calls have real-world consequences, as when a doctor evaluates a CT scan for cancer, and they can be influenced to an astonishing degree by factors that ought to be irrelevant. But not every judgment or decision need be evaluated in such a utility-maximizing way, if only because that evaluative process is tiring and incredibly time-consuming.

But that seems to be what you’re reading in Yudkowsky et al. I don’t read them that way, so there may be an irreconcilable difference here.

    This is, in essence, the problem with auto-didacts like Yudkowsky and Scott Alexander. They highlight what they believe to be important, and they ignore that which is difficult for them to understand.

How do you know what they find difficult to understand and then choose to ignore?

I appreciate you bringing this all up. It’s what I asked for, and your thoughts are valued.