All articles

  1. Some Interaction Information Examples

    Interaction information between three random variables has a much less immediate interpretation than other information quantities, which makes it trickier to work with. An interaction information term \(I(X;Y;Z)\) between \(X\), \(Y\), and \(Z\) can be either negative:

    If we are trying to specify \(X\) with …
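    The sign behavior the teaser alludes to can be checked numerically. Below is a minimal sketch using the convention \(I(X;Y;Z) = I(X;Y) - I(X;Y\mid Z)\) (sign conventions for interaction information vary between authors); the joint distributions, variable names, and the XOR/copy examples are illustrative choices, not taken from the article.

    ```python
    import math

    def H(joint, axes):
        """Entropy in bits of the marginal over `axes` of a joint pmf,
        where `joint` maps outcome tuples (x, y, z) to probabilities."""
        marg = {}
        for outcome, p in joint.items():
            key = tuple(outcome[a] for a in axes)
            marg[key] = marg.get(key, 0.0) + p
        return -sum(p * math.log2(p) for p in marg.values() if p > 0)

    def interaction_information(joint):
        # I(X;Y;Z) = I(X;Y) - I(X;Y|Z), expanded into joint entropies:
        # H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
        return (H(joint, (0,)) + H(joint, (1,)) + H(joint, (2,))
                - H(joint, (0, 1)) - H(joint, (0, 2)) - H(joint, (1, 2))
                + H(joint, (0, 1, 2)))

    # Synergy: X, Y independent fair bits, Z = X XOR Y.
    # Any two variables are independent, but any two determine the third.
    xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
    print(interaction_information(xor))   # -1.0 (negative)

    # Redundancy: one fair bit copied three times.
    copy = {(b, b, b): 0.5 for b in (0, 1)}
    print(interaction_information(copy))  # 1.0 (positive)
    ```

    The XOR triple is the standard example of a strictly negative term: \(I(X;Y) = 0\) while \(I(X;Y\mid Z) = 1\) bit.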

  2. Point KL-Divergence is not Very Negative Very Often

    If \(X\sim P\), then for any distribution \(Q\) it is unlikely that \(Q\) ascribes much greater density to \(X\)'s outcome than \(P\) does. In fact, if \(P\) and \(Q\) have PDFs \(f_P\) and \(f_Q\), then:

    \begin{align} \mathbb{P}(f_P(X)\leq c f_Q(X)) &= \int \mathbf{1}_{\{x …

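    The bound being set up (integrating the indicator against \(f_P\) and using \(f_P \leq c f_Q\) on that event gives at most \(c \int f_Q = c\)) is easy to sanity-check by Monte Carlo. The choice of \(P = \mathcal{N}(0,1)\), \(Q = \mathcal{N}(3,1)\), and \(c = 0.1\) below is an arbitrary illustration, not from the article:

    ```python
    import math
    import random

    def normal_pdf(x, mu, sigma):
        """Density of N(mu, sigma^2) at x."""
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    random.seed(0)
    c = 0.1
    n = 200_000

    # Sample X ~ P = N(0, 1) and count how often Q = N(3, 1) assigns
    # at least 1/c times more density to the outcome than P does.
    hits = 0
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        if normal_pdf(x, 0.0, 1.0) <= c * normal_pdf(x, 3.0, 1.0):
            hits += 1

    freq = hits / n
    print(freq, "<=", c)  # empirical frequency stays below the bound c
    ```

    For these particular distributions the event reduces to \(X\) landing far in \(P\)'s right tail, so the empirical frequency is well under the worst-case bound \(c\).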
  3. Tails of Probability Distributions are Nowhere Dense

    Lately I have been thinking about Kullback-Leibler divergence between probability distributions, also called the KL discriminant or relative entropy. It is often useful to think of this quantity as a distance, even though it doesn't behave much like one: it's not symmetric, it doesn't satisfy the triangle inequality, and even …
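    The two failures named in the teaser are easy to exhibit concretely. A minimal sketch with Bernoulli distributions (the specific parameters 0.1, 0.3, 0.5 are illustrative choices, not from the article):

    ```python
    import math

    def kl_bernoulli(p, q):
        """D(P || Q) in nats for P = Bernoulli(p), Q = Bernoulli(q)."""
        return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

    # Not symmetric: D(P || Q) != D(Q || P).
    d_pq = kl_bernoulli(0.1, 0.5)   # ~0.368 nats
    d_qp = kl_bernoulli(0.5, 0.1)   # ~0.511 nats

    # No triangle inequality: the "detour" through Bernoulli(0.3)
    # sums to less than the direct divergence.
    direct = kl_bernoulli(0.1, 0.5)
    via = kl_bernoulli(0.1, 0.3) + kl_bernoulli(0.3, 0.5)
    print(d_pq, d_qp)       # the two directions disagree
    print(direct, via)      # direct exceeds the two-step sum
    ```

    The triangle-inequality violation here runs in the opposite direction from a metric: the direct divergence is *larger* than the sum through an intermediate point.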

  4. Thoughts on Demosaicing for X-Trans Sensors

    A lot of new Fujifilm cameras use their own brand of 'X-Trans' sensors, which have a non-Bayer color mosaic. Fujifilm claims that the less regular arrangement makes it less likely for edges in the scene to produce moiré patterns with the mosaic. They say this justifies not including an optical …