We can put forward as a secondary hypothesis that this seems reasonable to Yudkowsky because it is, broadly speaking, his fetish. I wish I could avoid discussing this entirely, because it seems uncouth and ad hominem. Yudkowsky has co-authored, with lintamande, a 1.8-million-word collaborative glowfic called Planecrash (also titled Project Lawful). He says that "[w]hether an intelligence being "submissive" suffices to make it easy-to-steer is a primary focus of my 1.8M-word BDSM decision theory D&D fic". I am not, in general, happy when people express serious ideas via 1.8-million-word BDSM fanfics, because it makes it seem tawdry to say those ideas are either right or wrong.
Planecrash is set in Yudkowsky's rationalist utopia "dath ilan", whose main distinguishing feature is positive eugenics. Yudkowsky explains that in dath ilan, "except in very exceptional circumstances, if you're unhappy on average for any reason, it is your duty to the next generation not to have kids who might inherit this tendency", and that the dath ilani are more intelligent than us "if only because there's a norm against chronically unhappy people having kids".
Inside the fic itself, his self-insert character at one point reflects, in a passage explicitly framed as a moment of clarity about his own life, that "in dath ilan he would never have had his 144 children. He would have tried to be special and failed and been sad and then maybe gotten an ordinary +0.8sd job and either paid for a child out of that or decided he was too strange and unhappy to have one." In the same scene, Yudkowsky's narration describes Keltham as wanting "to prioritize having sex with his research harem as one of his top goals on his second day in another universe." It does not seem unreasonable to notice that the same person whose 1.8-million-word kink fic is built around an author-self-insert protagonist rewarded with a "research harem" and 144 children also believes, in his nonfiction, that the real-world solution to alignment is to spend a few decades breeding or engineering smarter humans. One could reasonably suspect that the fixation on this idea is not, in this context, an entirely dispassionate one.