I'm still having difficulty with the concept or context of "utility" as it applies here (as well as EV). I've done some looking around, and actually found a paper written on gambling which involved these terms, but it was far too advanced in probability theory for a layman.

Does anyone care to try to flesh out the definitions of these terms? I feel there's a lot being said here that isn't particularly difficult to understand if you speak the lingo.

Expected value (equivalently, expectation) is the simpler concept. Basically, it's the average amount you'd make per game if you played the game over and over again forever. (If you'd lose money, it's negative.) In the long run, each possible outcome of the game happens as often as its probability says it should. If the probability of winning a single lottery game is one in a million, for example, then in the long run you'll win the jackpot in one game out of every million games you play, and in all the rest you'll lose the price of the ticket. Subtracting all those losses from the single win gives your net winnings over a million games, and dividing that by a million gives the expected value of a single game.
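For concreteness, here's that arithmetic as a tiny Python sketch. The ticket price and jackpot are made-up numbers for illustration, not figures from anywhere in this thread:

```python
# Hypothetical numbers: a $1 ticket, a $500,000 jackpot,
# and a one-in-a-million chance of winning.
p_win = 1 / 1_000_000
jackpot = 500_000
ticket = 1

# Each outcome's net result, weighted by how often it happens in the long run:
# one win (the jackpot minus the ticket you bought) per million games,
# and a lost ticket in every other game.
ev = p_win * (jackpot - ticket) + (1 - p_win) * (-ticket)
print(f"Expected value per game: ${ev:.2f}")  # $-0.50: a losing game
```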
So, if you need to decide between playing the game forever and not playing it at all, the choice is easy. If the expected value is positive, play: you'll make money. If it's negative, don't play: you'll lose money.
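And if you want to see the "over and over again forever" part happen, a quick simulation with the same hypothetical numbers shows the per-game average settling toward the expected value as the number of games grows:

```python
import random

# Same hypothetical numbers as the sketch above.
p_win, jackpot, ticket = 1 / 1_000_000, 500_000, 1

def play():
    # Net result of one game: rare jackpot, usual lost ticket.
    return (jackpot - ticket) if random.random() < p_win else -ticket

for n in (10_000, 1_000_000, 10_000_000):
    avg = sum(play() for _ in range(n)) / n
    print(f"average over {n:>10,} games: ${avg:.4f}")  # drifts toward -$0.50
```

Notice that a short run of a rare-jackpot game (say, 10,000 plays with no win) can sit far from the expected value, which is exactly why the "play it forever" caveat matters.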
If you're not going to play the game forever, but only a few times, then things get less clear. That's where utility comes in. But I don't like it too much. Maybe that's just because I don't understand it well enough. So I think I'll let someone else explain it. (Then, I'll complain about their explanation.)