The point about confirmation bias rings true to me. If you start by believing something without any real basis, it is a psychological fact that you will then seek out confirmatory facts and reject contrary ones, thereby reducing the possibility of ever correcting yourself. The question then becomes: does it matter? I'm not sure how "morality" fits into the discussion, though. The article sort of skirts that claim, even though it is the purported thesis of the essay.
I have no problem with people believing without evidence; the problem arises when they insist they are right without evidence. Many situations in life force us to choose without sufficient evidence to know which choice is best. We very often have to move forward without knowing for sure whether what we believe really is correct. What makes the difference is whether we are open-minded or whether we are just damn sure we are right, no matter what.
Unfortunately, this article does not clarify what would constitute sufficient evidence, or what test would determine whether the evidence for a belief is sufficient. If I get to decide what counts as sufficient evidence, then everything I believe has sufficient evidence to convince me, and therefore everything I believe is always morally right.
We all believe many things without any evidence (if by evidence you mean scientific evidence).
We believe that bills and coins have a value; that Belgium exists; that laws apply; that there are human rights; that Emmanuel Macron is Le Président de la République française…
And the only 'evidence' we have for these beliefs is that other people believe them too (and that there are documents written by those other people).
‘it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence’
It's giving 'Justified True Belief' a moral slant. I don't think these epistemological terms work well as moral theory, which I think ought to be concerned with the will, rules/principles, intentions/motives, and results/consequences.