What's your simplest take on Bayes' law?
The concept behind any idea should be understood before you use it.
Try a coin flip (H or T) as a starting point before going to Pick 3, 4, 5, etc.
Bayes' law is basically centered around: prior >> conditional probability of
a parameter >> next prediction.
Prior > may be past data (the hard part is locating the stage of the conditional probability).
Conditional probability > may be the current probability of a parameter/member.
Say I flip a coin 10 times with these outcomes: HHTHTHTTTT. What will the
next flip be? Can't tell (equal chance, you may say).
Generally, a coin has two sides, so the probability of either outcome is 1/2. Bayes'
law is about calculating the conditional probability of each member (H or T). So from the
PRIOR data HHTHTHTTTT, we can calculate a prior for H and for T:
H(4)/10 and T(6)/10
So the prior at the current stage is H = 4/10, T = 6/10.
Now, flipping a coin is an independent event, so the current prior will be
4/10 x 1/2 (H) and 6/10 x 1/2 (T). Flip 5 more times and, with more data, the priors will change.
It's all about calculating the different probabilities of each member of the set.
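The counting steps above can be sketched in a few lines of Python (an illustrative sketch of the arithmetic, not a claim about the "right" Bayesian update):

```python
from collections import Counter

# Estimate each side's empirical prior from the observed flips HHTHTHTTTT.
flips = "HHTHTHTTTT"
counts = Counter(flips)          # Counter({'T': 6, 'H': 4})
n = len(flips)

prior = {side: counts[side] / n for side in "HT"}
print(prior)                     # {'H': 0.4, 'T': 0.6}

# The post then multiplies each empirical prior by the theoretical 1/2:
current = {side: prior[side] * 0.5 for side in "HT"}
print(current)                   # {'H': 0.2, 'T': 0.3}
```

Flip 5 more times, append them to `flips`, and rerun: the priors shift with the new data, just as described above.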
P3 has 1,000 sets. Can you calculate conditional probabilities for each? Or you can use
the sum parameter as a reduction method. P3 sums run from 0 to 27, and you can use the flip
analogue to construct conditional probabilities for each sum.
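For reference, the theoretical distribution of the sum over all 1,000 Pick 3 sets can be enumerated directly (a sketch; sums range 0 to 27 and are not equally likely):

```python
from collections import Counter

# Count how many of the 1,000 straight sets (000-999) produce each sum.
sums = Counter(a + b + c
               for a in range(10)
               for b in range(10)
               for c in range(10))

print(len(sums))        # 28 distinct sums, 0 through 27
print(sums[0])          # 1  (only 000 sums to 0)
print(sums[13])         # 75 (the middle sums, 13 and 14, are the most common)

for s in (0, 7, 13, 27):
    print(s, sums[s] / 1000)
```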
Sum parameter > as a rough uniform guess, P = 1/28 (sums run 0 to 27, 28 values, though in reality they are far from equally likely).
Take a block of, say, 30 draws and calculate each sum's probability. Add the next draw (overlapping blocks),
calculate the next P for each member > select the top-prior sum > see the chart for sums and picks.
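The rolling-block procedure above can be sketched like this. The draws here are simulated with `random` purely as a stand-in for real draw history, and the window size and step are assumptions for illustration:

```python
from collections import Counter
import random

# Simulated Pick 3 history standing in for real draws (seeded for reproducibility).
random.seed(1)
draws = [(random.randrange(10), random.randrange(10), random.randrange(10))
         for _ in range(60)]

WINDOW = 30   # block of 30 draws, as suggested above
STEP = 10     # slide forward so consecutive blocks overlap

for start in range(0, len(draws) - WINDOW + 1, STEP):
    block = draws[start:start + WINDOW]
    freq = Counter(sum(d) for d in block)    # probability weight of each sum in the block
    top = freq.most_common(3)                # top-prior sums for this block
    print(f"draws {start}-{start + WINDOW - 1}: top sums {top}")
```

Each pass re-counts the sums over the newest block, so the "prior" sums drift as fresh draws enter the window.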
Good luck