Case in point, I thought this particular answer was the silliest:
Are we close to building a machine that can meaningfully be described as sentient?
I'm an empirical kind of guy, and there is just no evidence of an artificial toehold in sentience. It is often forgotten that the idea of mind or brain as computational is merely an assumption, not a truth. When I point this out to "believers" in the computational theory of mind, some of their arguments are almost religious. They say, "What else could there be? Do you think mind is supernatural?" But accepting mind as a physical entity does not tell us what kind of physical entity it is. It could be a physical system that cannot be recreated by a computer.
Okay, the computational theory of mind is not "merely an assumption". It is built on evidence, like any good theory. And it's not "religious" to ask for an alternative theory if someone says a particular theory is crap. If this guy doesn't think that the brain receives input from the environment and performs information processing on that input, then what is his alternative hypothesis?
And I'm not sure what he's talking about in that last sentence, either. Any physical system can be simulated computationally; the fidelity of the simulation is limited only by the complexity of the system being modeled and the computational resources available. And if what we're interested in is the algorithm rather than the hardware that runs it, a simulation is all we need: we should be able to recreate the algorithms the brain executes on simulated hardware. In other words, no, a simulated rainstorm can't make you wet, but a simulated abacus can perform calculations just like a physical one, and a simulated chess player can kick your ass at chess. I don't know of a reasonable theoretical argument for why the function of the brain can't be emulated with a computer.
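The abacus point can be made concrete. Here's a toy sketch (the class and its layout are my own invention, purely for illustration): a software "abacus" whose rods are just a list of bead counts, yet whose additions are every bit as correct as those of the wooden original. The computation is real even though the beads aren't.

```python
# Toy illustration of substrate independence: a simulated abacus.
# The representation (one integer of 0-9 "beads" per rod) is a
# simplification invented for this example, not a model of any
# particular physical abacus.
class Abacus:
    def __init__(self, rods=8):
        # Each rod holds 0-9 beads; rod 0 is the ones column.
        self.rods = [0] * rods

    def set_value(self, n):
        # Lay out n across the rods, least significant digit first.
        for i in range(len(self.rods)):
            self.rods[i] = n % 10
            n //= 10

    def add(self, n):
        # Push n onto the abacus digit by digit, carrying to the
        # next rod whenever a rod overflows past 9 -- exactly what
        # a human operator does with physical beads.
        carry = 0
        for i in range(len(self.rods)):
            total = self.rods[i] + (n % 10) + carry
            self.rods[i] = total % 10
            carry = total // 10
            n //= 10

    def value(self):
        # Read the beads back off as an ordinary integer.
        return sum(d * 10**i for i, d in enumerate(self.rods))

a = Abacus()
a.set_value(1234)
a.add(4321)
print(a.value())  # -> 5555
```

No physical beads moved, but the sum is not a "simulated" sum in any deflationary sense: it's simply correct. That's the asymmetry with the rainstorm, and it's the whole bet of the computational theory of mind.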
A reasonable answer to the question would have been: "Probably not, although there are no theoretical roadblocks to prevent it as an eventuality."