Spektre wrote: Your double speak is in are form today. "That is why most people who know the Theorem do not know the theorem". LOL, priceless.

You see what I mean about you just literally changing what people write?

"People who know the theorem often don't know or agree that it has the significance for epistemology that Bayesians claim," would have been a more accurate way of putting it.

Again, why must you straw-man someone's position? Do you not have a response to it?

The problem you have seems to stem from a lay-person's grasp of science. I suspected this in a previous thread when you made similar claims. The entire field of probability was not developed to claim that certain information does not exist. A simple example to illustrate: the roll of two dice. One will often describe the outcome of this roll as a probability; each of the 6 faces of a die has a probability of 1/6 of occurring. Bayes' theorem may be employed to tell me, for example, the probability that the total roll will equal a "7" given that the first die was "less than 3". From a black-box perspective, this is often all the information I need about the system.
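That conditional probability can be checked directly by enumeration; a minimal sketch in Python:

```python
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Condition on the event "first die is less than 3".
given = [o for o in outcomes if o[0] < 3]

# P(total = 7 | first die < 3) = favourable outcomes / conditioned outcomes.
p = Fraction(sum(1 for d1, d2 in given if d1 + d2 == 7), len(given))
print(p)  # 1/6
```

Whatever the first die shows, exactly one face of the second die completes a 7, so the conditional probability comes out to 1/6.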

It is in no way however meant to conclude that it is impossible to calculate the exact outcome of the roll without probability. By knowing the initial states of the dice and the external forces acting upon them, you can with 100% certainty express exactly what the outcome is. The information IS knowable absent probabilistic models, and certainly absent time-delayed information. I need not wait for the consequence of the dice roll to indeed know that this roll is inherently a "9", or a "4".

Probability in the Bayesian sense is literally a measure of uncertainty, Spektre. This is one of the principal differences between the Bayesian and Frequentist views.

Suppose we use cards instead of dice. The dealer pulls a card from a pack and places it face down on the table. He then asks you, 'What is the probability that the card thus drawn is the Ace of Spades?'

You answer, '1/52, of course.'

Now suppose we make it interesting. The dealer permits us to ask yes/no questions of him.

You ask, 'Is it from a red suit?'

He says, 'No.'

'OK, is it from Clubs?'

'No.'

And so on. After the first question, a rational person would hold that the probability that the card is the Ace of Spades is 1/26. After the second, 1/13. We got some new information about the system and had to update our probabilities accordingly.
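The elimination sequence above is easy to verify mechanically. A minimal sketch, tracking the surviving candidates after each "No" answer:

```python
from fractions import Fraction

# Start with 52 equally likely cards; the hypothesis is "Ace of Spades".
suits = ['Spades', 'Clubs', 'Hearts', 'Diamonds']
deck = [(rank, suit) for suit in suits for rank in range(1, 14)]
target = (1, 'Spades')  # rank 1 = Ace

def prob(candidates):
    """Probability that the face-down card is the target, given the survivors."""
    return Fraction(sum(1 for c in candidates if c == target), len(candidates))

print(prob(deck))  # 1/52

# 'No' to "Is it from a red suit?": eliminate Hearts and Diamonds.
deck = [c for c in deck if c[1] not in ('Hearts', 'Diamonds')]
print(prob(deck))  # 1/26

# 'No' to "Is it from Clubs?": eliminate Clubs.
deck = [c for c in deck if c[1] != 'Clubs']
print(prob(deck))  # 1/13
```

Each answer shrinks the space of candidates consistent with the evidence, and the rational probability assignment tracks that shrinkage exactly.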

If you deny this you are going to lose the game. The person who updates and acts on his new information will be right 1/13 of the time on average; the person who does not will be right 1/52 of the time. See the Monty Hall problem, for example.
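The Monty Hall case makes the same point empirically: the player who updates on the host's door-opening wins about twice as often. A quick simulation sketch:

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game; return the empirical win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)    # door hiding the car
        pick = rng.randrange(3)   # contestant's initial pick
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Updating on the host's information means switching doors.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # close to 1/3
print(monty_hall(switch=True))   # close to 2/3
```

The switcher converges on a 2/3 win rate, the stayer on 1/3: refusing to update has a measurable cost.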

Because we have an account of knowledge (knowledge is a probability distribution) and an account of knowledge acquisition (Bayes' Theorem), Bayesians argue that between these two things we in fact have an epistemology. It allows us to measure how degrees of plausibility change in light of new evidence. This is not a new discovery. It's used all the time in, e.g., cognitive science, where it is literally what is meant by 'rational' and 'calibrated.'

You can reject this account of epistemology if you want, but that won't change your probability of being right. That is determined by evidence.

But whatever, I don't need to make a case for Bayesian Epistemology here. Unless you're going to deny that we ought to change our beliefs in light of new evidence or something, your argument falls.