The first chapter of Gotelli and Ellison discusses probability. Probability is defined as the expected frequency with which events occur. However, this itself depends on the scale of the investigation - the more you understand about a process, the less "stochasticity" there is. They make an interesting point on p. 11 - random/stochastic events are, in reality, processes that we are unable or unwilling to measure. Under this view, everything can be measured if you look closely enough - yet this doesn't seem to agree with quantum physics, where the closer you get to a particle, the more random its behavior and very existence seem to be. Hmmm....
Complex events = sequences of simple events (logical OR; for mutually exclusive events, probabilities are additive)
Shared events = multiple simultaneous occurrences of simple events (logical AND; for independent events, probabilities are multiplicative)
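The two rules above can be sketched with a fair six-sided die (my own example, not from the book), using Python's `fractions` module to keep the probabilities exact:

```python
from fractions import Fraction

# Simple event: any one face of a fair six-sided die.
p = Fraction(1, 6)

# Complex event (OR): rolling a 1 OR a 2 on a single roll.
# The simple events are mutually exclusive, so probabilities add.
p_one_or_two = p + p
assert p_one_or_two == Fraction(1, 3)

# Shared event (AND): a 1 on the first roll AND a 2 on the second.
# The rolls are independent, so probabilities multiply.
p_one_then_two = p * p
assert p_one_then_two == Fraction(1, 36)

print(p_one_or_two, p_one_then_two)  # 1/3 1/36
```

The mutually-exclusive and independence caveats matter: P(1 OR "odd") is not 1/6 + 1/2, and P(two rolls matching) is not found by multiplying unconditional probabilities blindly.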
This chapter also discusses conditional probabilities, e.g. P(A|B), and Bayes' theorem, which provides a way of calculating conditional probabilities. Essentially, conditional probability and Bayesian analysis involve narrowing down the possible outcomes of an event based on prior knowledge. So, if there are 10 possible outcomes of A, but only 3 of them can co-occur with outcome X of B, then observing X limits you to those 3 possibilities for A. This seems to me a legitimate method of narrowing down possible outcomes - I don't fully understand the controversy re: Bayesian analysis (given, of course, that the data you're introducing into the analysis are relevant and complete).
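That "narrowing down" idea can be made concrete by brute-force enumeration (a hypothetical two-dice example of mine, not the book's): conditioning on B just means discarding every outcome inconsistent with what we observed, then counting within what remains.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (A, B) pairs for two fair dice.
outcomes = list(product(range(1, 7), range(1, 7)))

# Conditioning on B = 6: throw away every outcome where B isn't 6.
given_b6 = [(a, b) for a, b in outcomes if b == 6]

# P(A + B >= 10 | B = 6): count within the narrowed-down set.
favorable = [(a, b) for a, b in given_b6 if a + b >= 10]
p_cond = Fraction(len(favorable), len(given_b6))
print(p_cond)  # A must be 4, 5, or 6 -> 3 of 6 outcomes -> 1/2

# Sanity check against the defining formula P(E|B) = P(E and B) / P(B):
p_joint = Fraction(len(favorable), len(outcomes))
p_b = Fraction(len(given_b6), len(outcomes))
assert p_cond == p_joint / p_b
```

The final assertion is exactly the conditional-probability identity that Bayes' theorem is built from; the enumeration and the formula necessarily agree.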