22.11.2010, 00:04 TS | #1 (permalink)
Immortal
Registered: 03.05.2004
Location: Планета Шелезяка
Posts: 3,615
Having dug through this forum, I've concluded that these questions are not covered here in any detail. Can anyone help with a good translation?
Detailed CE analysis
From Stanford Wong's BJ21
Posted By: Karel Janecek on 3 Jun 00, 3:39 am

An interesting discussion developed here about the notion of Certainty Equivalent (CE). I decided to prepare some detailed analysis of this topic and hope to provide an enlightening explanation to the BJ community. While the text is somewhat long, I believe it is worth it for a serious blackjack player to understand CE in detail and gain further insight into the very interesting topic of risk taking.

CE is a very useful notion, which provides an intuitive quantitative understanding of accepting risk for greater return (the so-called price of risk). The definition of CE is relatively simple: it is the minimum profit (percentage or absolute) one would require to forfeit an advantageous but risky investment opportunity. Equivalently, it is the maximum profit one would be willing to pay for entering such an investment opportunity.

Suppose that you face an advantageous investment opportunity (random variable) X. Advantageous means that the expected value E[X] > 0, where E is the expectation (arithmetic average) operator. A rational investor will bet a fraction f to maximize the expected value of some utility function U of his/her resulting capital (bankroll): maximize E[U(C1)], where C1 is the capital after one game, C1 = 1 + f*X. We assume without loss of generality that the initial capital is 1 (one unit, or dollar, or million). A well-known example of a utility function is the Kelly (logarithmic) utility, where U(C) = log(C). See more comments on Kelly utility at the end.

CE can now be precisely defined as the risk-free profit which yields the same expected utility as the investment opportunity X for an optimally betting investor. Thus we want U(CE) = E[U(1 + f*X)], which yields (under certain technical assumptions) CE = invU(E[U(1 + f*X)]), where invU is the inverse function. For example, for logarithmic utility we have CE = exp{E[log(1 + f*X)]}, since exp is the inverse function of log. These calculations are different for each investment opportunity X.

It is possible to do the analysis for a simple +1 win, -1 loss game. It turns out that the optimal Kelly utility bet is f = 2*p - 1, and the certainty equivalent is CE = 2*p^p*(1-p)^(1-p), where p > 1/2 is the probability of winning.
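As a quick numeric sanity check of that closed form, here is a short Python sketch (the win probability p = 0.51 is an arbitrary illustration, not a number from the post): it computes CE directly as exp{E[log(1 + f*X)]} and compares it with 2*p^p*(1-p)^(1-p).

```python
import math

def kelly_ce(p):
    """CE of the +1/-1 game for a log-utility (Kelly) bettor with win prob p."""
    f = 2 * p - 1                                  # optimal Kelly bet from the text
    e_log = p * math.log(1 + f) + (1 - p) * math.log(1 - f)
    return math.exp(e_log)                         # CE = exp{E[log(1 + f*X)]}

p = 0.51                                           # arbitrary 1% edge example
print(kelly_ce(p))                                 # ~1.000200
print(2 * p**p * (1 - p)**(1 - p))                 # closed form: same value
```

A CE of about 1.0002 means the Kelly bettor would trade this opportunity for a guaranteed profit of roughly 0.02% per round.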
It is very useful, and sufficiently general and precise, to use a continuous-time approximation for discrete games. We can assume that the player plays continuously, i.e., makes a bet at every instant. To understand continuous time intuitively, imagine that instead of betting once per minute the player makes one million independent bets per minute, where each of the 1 million bets pays out one millionth. This means that the edge per bet is a million times smaller, and the risk (standard deviation) is 1000 times smaller (since 1000 is the square root of 1 million). Making 1 million such bets is for practical purposes equivalent to making one bet per minute with a million times higher edge and a thousand times higher risk. The negligible difference is that the player can adjust the bet size according to the current bankroll a million times per minute.

Under certain technical assumptions, it is possible to obtain a formula for CE:

CE(t) = exp{1/2*k*S^2*t}, (1)

where k is the famous Kelly fraction, S is the so-called Sharpe ratio (the ratio of edge to standard deviation), and t is the time horizon. In our example above, S is the edge of one bet out of the million divided by the standard deviation of that one bet; time t is then 1 million. In blackjack a famous measure is Don Schlesinger's SCORE, which is an analogue of S^2, and S corresponds to DI (the desirability index).

Under the same assumptions one can get a formula for the investor's capital C(t) at time t:

C(t) = exp{k*(1 - 1/2*k)*S^2*t + k*S*W(t)}, (2)

where W(t) is so-called Brownian motion: a random walk in the continuous-time setting.

It is interesting to compare the certainty equivalent CE(t) and the capital C(t). One way of doing so is to compare the ratio of CE(t) - 1 (certainty-equivalent percentage profit) and E[C(t)] - 1 (expected percentage profit). It can be proven that

E[C(t)] = exp{k*S^2*t}. (3)

(Observe that the random walk W(t) influences the expected value of the capital, even though it is a driftless random walk.) As has been pointed out by MathProf, the CE profit is approximately half of the expected profit. One can see that the exponent in formula (3) is twice as high as in formula (1). It can be proven that the CE profit is exactly half of the expected profit in the limit as the edge (and thus the Sharpe ratio S) or the time t converges to zero. In other words, for an optimally and continuously betting investor the "instantaneous" CE profit is half of the instantaneous expected profit. However, this is not true for longer time horizons. For example, the monthly CE profit will not be exactly half of the monthly expected profit; the ratio will depend on the value of S and the Kelly fraction k. It can be shown that the CE profit is always less than half of the expected profit, and that the fraction is increasing and converging to 0.5 as the Kelly fraction converges to zero.

While it is interesting to compare the CE profit to the expected (arithmetic average) profit, this may in my opinion be somewhat misleading. Above all, the player's objective is not to maximize the expected (average) profit. You may play a game with arbitrarily high expected profit and yet go broke for sure (with probability 1) in the long run. A more reasonable measure is the expected growth rate -- mathematically, the expected value of the logarithm of your capital. (This is also the geometric mean.) If the player has a positive expected growth rate, his capital will grow with certainty in the long run.

From formulas (1) and (2) it can be inferred, by taking the logarithm, that the CE growth rate equals

CEgr(t) = 1/2*k*S^2*t, (4)

and the expected (average) capital growth rate equals

Cgr(t) = k*(1 - 1/2*k)*S^2*t. (5)

(Observe that the continuous random walk W(t) does not influence the expected growth rate of the capital, while it did influence the expected value of C(t).) From formulas (4) and (5) one can immediately see that the ratio equals (1/2) / (1 - 1/2*k). This may be considered a very interesting result. The ratio does not depend on the time horizon, nor on the Sharpe ratio S (or the SCORE ratio S^2, for that matter). It depends only on the Kelly fraction k, and it increases with the Kelly fraction. For full Kelly the ratio is 1: the Kelly player maximizes his growth rate, and the growth rate is the only consideration for CE. The Kelly player does not care about additional fluctuations of the capital growth; the average growth rate is the only consideration. In this sense, full Kelly has no risk aversion. Despite the generally accepted view in blackjack, full Kelly is not a reasonable utility function. This can be demonstrated by some results.
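The growth-rate ratio is easy to verify numerically. Below is a Python sketch of formulas (4) and (5); the Sharpe ratio S and horizon t are arbitrary placeholder values, since they cancel out of the ratio exactly as claimed.

```python
S, t = 0.05, 10_000              # arbitrary Sharpe ratio per bet and time horizon

for k in (1.0, 0.5, 0.25, 0.1):  # Kelly fractions from full Kelly down
    ce_growth = 0.5 * k * S**2 * t               # formula (4): CE growth rate
    cap_growth = k * (1 - 0.5 * k) * S**2 * t    # formula (5): expected growth rate
    print(f"k = {k:<4}  ratio = {ce_growth / cap_growth:.4f}"
          f"  predicted (1/2)/(1 - k/2) = {0.5 / (1 - 0.5 * k):.4f}")
```

For full Kelly the ratio is 1, for half Kelly 2/3, and it falls toward 1/2 as k shrinks, independent of S and t.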
In our continuous-time setting, the probability that the capital of a player betting Kelly fraction k will ever drop below a fraction d of the current bankroll (for example, d = 0.2 means losing 80% of the bankroll) is

p(k, d) = d^(2/k - 1).

Plugging in full Kelly, k = 1, and for example d = 0.1, one can see that the probability of ever losing 90% (= (1 - 0.1)*100%) of the current bankroll is a whole 10%! The numbers drop very quickly for lower Kelly fractions. For example, for half Kelly the probability of losing 90% of the bankroll is just 0.1%, and it is a negligible 0.0000001% for a more reasonable fifth-Kelly bettor. A reasonable utility function probably starts somewhere around a fifth or an eighth of Kelly, maybe even a twentieth of Kelly. For this it is important to note that the ideal definition of bankroll is one's entire investment capital (not necessarily currently available in cash). The Kelly fraction should ideally be related to one's whole investment capital, not just some artificial bankroll. It is not entirely correct to split your investment capital into multiple bankrolls for different purposes and then use a higher Kelly fraction. However, most people do this, probably for psychological reasons.
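The quoted drawdown formula is simple enough to tabulate; here is a Python sketch (the grid of Kelly fractions and drawdown levels is chosen for illustration):

```python
# P(ever dropping to a fraction d of the current bankroll | Kelly fraction k)
for k in (1.0, 0.5, 0.2, 0.125):          # full, half, fifth, eighth Kelly
    for d in (0.5, 0.2, 0.1):             # i.e., losing 50%, 80%, 90%
        print(f"k = {k:<5}  drop to {d:.0%} of bankroll: p = {d**(2 / k - 1):.3g}")
```

This reproduces the figures in the text: 10% for full Kelly at d = 0.1, 0.1% for half Kelly, and 1e-9 for fifth Kelly.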
__________________
Do not cast pearls before swine. (Jesus Christ)
22.11.2010, 02:34 TS | #2 (permalink)
Immortal
Registered: 03.05.2004
Location: Планета Шелезяка
Posts: 3,615
Brett Harris
"There is a short answer and a rather long answer. The short answer is for any advantage player, at a given time, will be betting with a given spread, say $B_min to $B_max, with intermediate amounts for counts in between. There are also the given rules, and it depends on whether they leave at certain negative counts - these will all affect the result. But lets break things up: [assume they true count Hi-Lo in a 6-deck game] 1/ Divide everything by $B_min - so that a $20 to $240 spread (assuming play-all), starting to rise at TC=+2 (say $40) up to $240 at TC=+8 [a bad spread] becomes: Bet 1 unit at +1 or less Bet 2 units at +2 . take your pick < +8 Bet 12 units at +8 or greater. [ok so far...right] This is known as a 1-12 unit spread (or 1-12u for short). 2/ Say the EV per 100 rounds is about $25 (I said is a bad spread), and the SD per 100 rounds is about $750. Firstly convert this to per round, remembering that the EV is divided by 100, but the SD is divided by 10 (ie VAR is divided by 100). So EV/r = $0.25 and SD/r = $75.00. Then doing the division gives: 'ev' = 0.0125u ; 'sd' = 3.75u . The quantities 'ev' and 'sd' are the unit EV and unit SD per round. If you think the 'ev' looks small you are right, but since the bets range between 1-12u (mostly at the low end) the 'sd' is reasonable. In general if B_min is just called $B ($20 in this case), and keeping things per round (the 100 round thing is just confusing) then we simply have: EV = $B * ev SD = $B * sd VAR = $B * $B * (sd)^2 3/ Probability theory for large N, says that you have about one 33% chance of being 1 standard deviation below expectation after SQRT(N)*SD. Now this works for either EV(ev) or SD(sd) as the $B cancels out, so an unlucky player(33%) could be zero (or less!) after N hands if 0 = N * EV - SQRT(N)*SD or N*N*EV*EV = N*SD*SD . Canceling the N and rearranging gives N = (SD/EV)^2 = VAR/(EV)^2 where we get the (famous?) value N0 = (sd/ev)^2 = (SD/EV)^2 which is called the long run index. 4/ You will have to take my word on this, but Kelly Theory says that for a game using unit 1-M spread, with a given 'ev' and 'sd' has what is called an Equivalent Kelly Bankroll (ekb) given by 'ekb' = sd^2/ev = var/ev. It is no coincidence that if we put back the $B, we get the dollar EKB EKB = (SD^2/EV) providing the EV is per round, not 100. So for the example above we get N0 = [(3.75)/(0.0125)]^2 = 90000 rounds (lousy) ekb = (3.75*3.75)/(0.0125) = 1125 and EKB = $20 * 1125 = $22500 (which means if this is your bankroll you have a risk of ruin of 13%. Not a good return on investment. Also notice (it drops out from the math) that EKB/N0 = $0.25 = EV - not a coincidence! Therefore your win rate per hand is simply your Equivalent Kelly Bankroll divided by your long run index. Optimal Betting Theory says that simply by reducing your high bet down to TC=+4, you can halve N0, lower your EKB and increase your win rate at the same time. But there is another trick - for a given 'ekb', you can choose your unit bet $B so that you can match your Kelly Bankroll. Now what happens if you optimise your betting spread (minimise N0), and adjust your $B such that your EKB is $10000??? You guessed it - your win rate EV/r = $10000/N0 ! Then if you wanted to have a win rate per 100 rounds you get EV(100) = ($10000/N0)*100 which just happens to be the SCORE!! I originally came up with N0 with my work on Optimal Spread Theory. 
Any N0 of more than around 20000 is probably unplayable, unless you Wong, which really can lower it; good single-deck games should be less than 10000. Don felt that players relate better to a win rate than to long-run chances of being ahead, so he invented SCORE, which is really just (1/N0) multiplied by $1,000,000. The main difficulty I have with it is that players may consider it just a win rate, which can be calculated for any non-optimal spread. That is, to really 'KNOW' a game you need to know the ev and sd (or equivalently the N0 and ekb), not just the N0 that you get from SCORE. In other words, you should know the $B Don has used to convert the unit 'ekb' to $10000. He may well do this in his book; you need to check. I believe QFIT has programs like CVCX to compute optimal betting spreads -- doing a Google search is a good idea. If all this has been a bit messy, you can always go to [link] and find my two articles under 'Blackjack Betting for Card Counters'. The first 'How-To' article explains the stuff I said above; the second one is really only for the mathematically inclined who may have access to simulator data. Cheers, Dr. Brett Harris."
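Finally, a sketch of the bankroll-matching trick from the post above: choose the unit bet $B so that the dollar EKB equals an actual $10,000 bankroll (the standardized bankroll SCORE assumes), and the win rate falls out as bankroll/N0. The ekb and N0 figures are carried over from the worked example.

```python
bankroll = 10_000.0               # the standardized SCORE bankroll
ekb, N0 = 1125.0, 90_000.0        # unit EKB and long run index from the example

B = bankroll / ekb                # unit bet sizing the game to this bankroll: ~$8.89
win_per_round = bankroll / N0     # EV per round: ~$0.111
SCORE = 100 * win_per_round       # EV per 100 rounds: ~$11.11

print(B, win_per_round, SCORE)    # note SCORE == 1_000_000 / N0
```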
__________________
Do not cast pearls before swine. (Jesus Christ)