1. ## Source Encoders as Channels

It is well known that a rate-distortion-optimal source encoder's output generally doesn't match its source's distribution. This can make some analyses a pain in the neck. For example, say you want to investigate the relationship between a signal that appears in a source and that signal's appearance in an encoding …

2. ## Some Useful Multivariate Gaussian Information Quantities

Many information quantities for multivariate normal distributions have nice closed forms. Their essential parts usually reduce to logarithms of ratios of principal-minor determinants, so they combine nicely. Here's a list of them; all of them appear somewhere in Adaptive Wireless Communications by Bliss and Govindasamy.
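As one illustration of the determinant-ratio pattern (a standard fact for the real-valued case, not a quote from the book): for a Gaussian vector partitioned as $X=(X_A,X_B)$ with covariance $\Sigma$ and principal blocks $\Sigma_{AA},\Sigma_{BB}$,

\begin{align}
h(X_A) &= \tfrac{1}{2}\log\det(2\pi e\,\Sigma_{AA}),\\
I(X_A;X_B) &= h(X_A)+h(X_B)-h(X) = \tfrac{1}{2}\log\frac{\det\Sigma_{AA}\,\det\Sigma_{BB}}{\det\Sigma},
\end{align}

so the mutual information is exactly a log of a quotient of minor determinants.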

## Notation

• $[n]=\{1,\dots,n\}$

Say $M$ is an $n\times n$ complex matrix and we would like to learn its determinant. We observe another matrix $\hat{M}$ whose components are each within some factor of $M$'s, in other words $\hat{M}_{ij}=c_{ij}\cdot M_{ij}$ and each $c_{ij}\in (1- …

5. ## Some Interaction Information Examples

Interaction information between three random variables has a much less immediate interpretation than other information quantities, which makes it trickier to work with. An interaction information term $I(X;Y;Z)$ between $X$, $Y$, and $Z$ can be negative:

If we are trying to specify $X$ with …
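To make the sign concrete, here is a minimal sketch (my own example, not from the post) computing the interaction information of the XOR triple under the inclusion–exclusion convention $I(X;Y;Z)=H(X)+H(Y)+H(Z)-H(X,Y)-H(X,Z)-H(Y,Z)+H(X,Y,Z)$; note the opposite sign convention also appears in the literature:

```python
from itertools import product
import math

# Joint distribution over (X, Y, Z): X, Y fair independent bits, Z = X XOR Y.
p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(axes):
    """Entropy (in bits) of the marginal over the given coordinate indices."""
    marg = {}
    for xyz, pr in p.items():
        key = tuple(xyz[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in marg.values())

# Inclusion–exclusion form of the interaction information.
ii = (H((0,)) + H((1,)) + H((2,))
      - H((0, 1)) - H((0, 2)) - H((1, 2))
      + H((0, 1, 2)))
print(ii)  # -1.0 under this convention
```

Each pair of variables is independent (pairwise mutual informations are zero) yet any one variable is determined by the other two, which is exactly what the negative value records.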

6. ## Porting PostmarketOS to the Motorola Photon Q

PostmarketOS aims to keep old cell phones useful past their shelf life. Right now their efforts are focused on getting Alpine Linux to run on as many phones as possible. Most of their codebase is a tool called pmbootstrap that automates parts of porting Alpine. Porting is …

7. ## Point KL-Divergence is not Very Negative Very Often

If $X\sim P$ then for any distribution $Q$ it is unlikely that $Q$ ascribes much greater density to $X$'s outcome than $P$ does. In fact if $P,Q$ have PDFs $f_P, f_Q$, then:

\begin{align} \mathbb{P}(f_P(X)\leq c f_Q(X)) &= \int \mathbf{1}_{\{x …
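A quick numerical sanity check of the bound the computation above heads toward, $\mathbb{P}(f_P(X)\le c\,f_Q(X))\le c$ — a sketch with two unit-variance Gaussians of my own choosing, not distributions from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu):
    """Density of N(mu, 1) at x."""
    return np.exp(-(x - mu) ** 2 / 2) / np.sqrt(2 * np.pi)

c = 0.1
x = rng.normal(0.0, 1.0, size=200_000)          # X ~ P = N(0, 1)
frac = np.mean(normal_pdf(x, 0.0) <= c * normal_pdf(x, 2.0))  # Q = N(2, 1)
print(frac)  # comfortably below the bound c = 0.1
```

The event $f_P(X)\le c f_Q(X)$ here reduces to $X$ landing deep in $P$'s right tail, so the empirical frequency comes out well under $c$, as the integral bound predicts.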
8. ## Tails of Probability Distributions are Nowhere Dense

Lately I have been thinking about Kullback–Leibler divergence between probability distributions, also called the KL discriminant or relative entropy. It is often useful to think about this quantity as a distance, even though it doesn't behave much like one: it's not symmetric, it doesn't satisfy the triangle inequality, and even …
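The asymmetry is easy to see numerically; a tiny sketch with two Bernoulli distributions (my example, not one from the post):

```python
import math

def kl_bernoulli(p, q):
    """KL divergence D(Bern(p) || Bern(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

d_pq = kl_bernoulli(0.1, 0.5)   # ~0.368 nats
d_qp = kl_bernoulli(0.5, 0.1)   # ~0.511 nats
print(d_pq, d_qp)               # not equal: KL is asymmetric
```

Swapping the arguments changes which distribution's tails get penalized, which is why the two directions disagree.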

9. ## Thoughts on Demosaicing for X-Trans Sensors

A lot of new Fujifilm cameras use the company's own 'X-Trans' sensors, which have a non-Bayer color mosaic. Fujifilm claims that the less regular arrangement makes it less likely for edges in the scene to produce moiré patterns against the mosaic. They say this justifies not including an optical …

10. ## Update: Extracting Information from Noisy Observations

This is meant as a follow-up for the last post's motivation. In summary we have a bipartite graph $G=(U,V,E)$ with bi-adjacency matrix $A$ and with $|U|\approx rk,$ $|V| \approx ck,$ $|E|\approx rck,$ $\mathsf{deg}(u)\approx c,$ $\mathsf{deg}(v)\approx r,$ and a sequence …
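A small sketch of a graph with those statistics, using placeholder parameter values of my own: each left vertex picks $c$ distinct right neighbors, so $|E| = rk\cdot c = rck$ exactly and the average right-side degree is $rck/ck = r$:

```python
import numpy as np

rng = np.random.default_rng(0)
r, c, k = 3, 4, 5                      # placeholder values, not from the post
nU, nV = r * k, c * k                  # |U| ≈ rk, |V| ≈ ck
A = np.zeros((nU, nV), dtype=int)      # bi-adjacency matrix of G = (U, V, E)
for u in range(nU):
    # each u in U picks c distinct neighbors in V, so deg(u) = c
    A[u, rng.choice(nV, size=c, replace=False)] = 1

print(A.sum())                # |E| = rck = 60
print(A.sum(axis=0).mean())   # mean deg(v) = rck / ck = r = 3.0
```

Individual right-side degrees only concentrate around $r$ for larger parameters; with these tiny values only the average is exact.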
