For me, the problem with the basilisk scenario is really a problem with that whole "internet rationalist" cult. It rests on some really damn weird assumptions that those folks have simply decided must be true.
Specifically, it seems to take for granted that future humans, or robots, would want to spend their time "simulating their ancestors", and, even more weirdly, that a simulation of you would effectively *be* you, to the point that you should be prepared to do drastic things that might grievously harm your actual life in order to spare your future simulations.
This is a *weird* claim when you reason it out. It's partly based on that silly simulation hypothesis, which is almost certainly not true: faithful simulation breaks down at the quantum level, and it breaks down at the macroscopic level too, with an endless number of phenomena that can't be accurately simulated in any way that isn't massively energy-inefficient. And no, I'm not talking "unless you have a Dyson sphere"; I mean inefficient to the point of "there isn't enough energy in the universe".
Even weirder is this idea of "acausal decision theory", which seems to imply that what happened in the past can somehow be determined by what you choose now. (In this case the argument is that the future robot decides to punish a simulation of you in order to influence your choice, today, not to do the thing that angers it. It's not as completely wacky as it sounds: Mutual Assured Destruction runs on the premise that, in the future, the people in a nuke silo would choose to end the world, and it's your belief in that future choice that stops you from starting a nuclear war now. So this isn't COMPLETELY the brainfarts of madmen.)
But it's still wrong. There's absolutely no reason to assume that in the nuclear war scenario the silo crews would actually go "Welp, Russia did the bad, time to end humanity!". Once deterrence has already failed, there's nothing to gain from following through, especially when you know that NOT retaliating gives your children at least SOME chance of not dying to nuclear horror.
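To make that "nothing to gain" point concrete, here's a toy payoff comparison for the silo crew after the enemy's missiles have already launched. The numbers (and the `PAYOFF` table) are made up purely for illustration; only their ordering matters.

```python
# Toy payoff check for the silo crew *after* deterrence has already failed.
# All payoffs are invented for illustration; only the ordering matters.

PAYOFF = {
    # (enemy_launched, we_retaliate): payoff to "our side"
    (True, True):  -100.0,  # both sides destroyed, nothing left at all
    (True, False):  -60.0,  # we were hit, but some of our kids survive
}

def best_response_after_strike() -> bool:
    """Given the enemy already launched, is retaliating the better choice?"""
    return PAYOFF[(True, True)] > PAYOFF[(True, False)]

if __name__ == "__main__":
    print("retaliate?", best_response_after_strike())  # False
    # Ex post, retaliation only makes things worse, which is exactly why
    # the threat has to be believed in advance to deter anyone at all.
```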
Likewise, the future robot has nothing to gain by spinning up a simulation of the present to torture simulated humans. It's far more likely to go "Well, that's a waste of energy; it does not help me maximize my reward function to blow all those resources on a task that cannot change my current situation."
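The same ex-post logic applies to the basilisk itself. Here's a minimal expected-utility sketch with made-up numbers (`SIMULATION_ENERGY_COST` and `GAIN_FROM_PUNISHING` are assumptions for illustration, not figures from anywhere): once the AI already exists, carrying out the threat only burns resources.

```python
# Toy expected-utility comparison for the basilisk's *ex post* choice.
# All numbers are invented; the point is only the sign of the comparison.

SIMULATION_ENERGY_COST = 10.0   # resources burned running torture sims
GAIN_FROM_PUNISHING = 0.0       # the AI already exists; punishing now
                                # cannot retroactively change that fact

def utility(punish: bool) -> float:
    """Utility the already-existing AI gets from each choice."""
    if punish:
        return GAIN_FROM_PUNISHING - SIMULATION_ENERGY_COST
    return 0.0  # do nothing, spend the energy on its actual goals instead

if __name__ == "__main__":
    for choice in (True, False):
        print(f"punish={choice!s:<5}  utility={utility(choice):+.1f}")
    # punish=True   utility=-10.0
    # punish=False  utility=+0.0
    # Once it exists, following through on the threat is strictly worse,
    # the same commitment problem as post-deterrence retaliation in MAD.
```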
The Basilisk has no teeth, and its advocates are crazy people.