ChatGPT is all the rage. It even drives some people into a rage. It does some remarkable things, some outrageous things, and some absurd things. Most things it does badly, such as telling you how to build rockets. One thing it does (and I'm not sure which category to put this in) is define religion. It does so implicitly, through its responses to various kinds of queries. In doing so, however, it reveals bias in its training data and bias in the constraints its developers have placed on it.