uniform test of algorithmic randomness over a general ... - CiteSeerX


22 PETER GÁCS

I(x : y) as a deficiency of randomness of the pair (x, y) in terms of the expression d_µ, with respect to m × m:

I(x : y) = H(x) + H(y) − H(x, y) = d_{m×m}(x, y).

Taking m as a kind of “neutral” probability, even if it is not quite such, allows us to view I(x : y) as a “deficiency of independence”. Is it also true that I(x : y) =⁺ d_{m×m}(x)? This would allow us to deduce, as Levin did, “information conservation” laws from randomness conservation laws.¹
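The identity above can be unfolded in a short derivation (a sketch, not from the text: it assumes the coding theorem H(x) =⁺ −log m(x) and the discrete characterization d_µ(x) =⁺ −log µ(x) − H(x | µ), and it absorbs the condition m × m into the =⁺):

```latex
% Sketch: unfolding I(x:y) = d_{m x m}(x,y), assuming the coding theorem
% H(x) =+ -log m(x) and d_mu(x) =+ -log mu(x) - H(x | mu).
\begin{align*}
d_{\mathbf m\times\mathbf m}(x,y)
  &\stackrel{+}{=} -\log\bigl(\mathbf m(x)\,\mathbf m(y)\bigr)
     - H(x,y \mid \mathbf m\times\mathbf m)\\
  &\stackrel{+}{=} H(x) + H(y) - H(x,y) \;=\; I(x:y).
\end{align*}
```

The second step, dropping the condition m × m from the entropy term, is exactly the delicate point: over the compactified space this conditional equality fails, as argued below.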

The expression d_{m×m}(x) must be understood again in the sense of compactification, as in Section 5. There seem to be two reasonable ways to compactify the space N × N: we either compactify it directly, by adding a symbol ∞, or we form the product (N ∪ {∞}) × (N ∪ {∞}). With either of them, preserving Theorem 3, we would have to check whether H(x, y | m × m) =⁺ H(x, y). But, knowing the function m(x) × m(y), we know the function x ↦ m(x) × m(0) =∗ m(x), hence also the function (x, y) ↦ m(x, y) = m(⟨x, y⟩), where ⟨x, y⟩ is any fixed computable pairing function. Using this knowledge, it is possible to develop an argument similar to the proof of Theorem 7, showing that H(x, y | m × m) =⁺ H(x, y) does not hold.

Question 1. Is there a neutral measure M with the property that I(x : y) = d_{M×M}(x, y)? Is this true maybe for all neutral measures M? If not, how far apart are the expressions d_{M×M}(x, y) and I(x : y) from each other?

The Addition Theorem (Proposition 6.3) can be generalized to the algorithmic entropy H_µ(x) introduced in (4.1) (a somewhat similar generalization appeared in [23]). The generalization, defining H_{µ,ν} = H_{µ×ν}, is

H_{µ,ν}(x, y) =⁺ H_µ(x | ν) + H_ν(y | x, H_µ(x | ν), µ). (6.4)
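For comparison (a restatement of a standard fact, not a claim about this section), the classical Addition Theorem of Proposition 6.3 reads:

```latex
% Classical Addition Theorem (Levin, Gacs), for comparison with (6.4):
H(x, y) \stackrel{+}{=} H(x) + H\bigl(y \mid x, H(x)\bigr).
```

Formula (6.4) has the same shape, with the entropies relativized to µ and ν and with the measures themselves appearing as extra conditions.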

Before proving the general addition theorem, we establish a few useful facts.

Proposition 6.4. We have

H_µ(x | ν) <⁺ −log ν^y 2^{−H_{µ,ν}(x,y)}.

Proof. The function f(x, µ, ν) that is the right-hand side is upper semicomputable by definition, and obeys µ^x 2^{−f(x,µ,ν)} ≤ 1. Therefore the inequality follows from the minimum property of H_µ(x). □
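The appeal to the minimum property can be unfolded as follows (a sketch; it uses only the fact, up to the multiplicative constants absorbed by <⁺, that 2^{−H_{µ×ν}} integrates to at most 1 against µ × ν):

```latex
% Why f(x,mu,nu) = -log nu^y 2^{-H_{mu,nu}(x,y)} obeys mu^x 2^{-f} <= 1:
\mu^x\, 2^{-f(x,\mu,\nu)}
  = \mu^x \nu^y\, 2^{-H_{\mu,\nu}(x,y)}
  = (\mu\times\nu)^{x,y}\, 2^{-H_{\mu\times\nu}(x,y)}
  \le 1.
```

Since f is also upper semicomputable, the minimality of H_µ(· | ν) among such functions gives H_µ(x | ν) <⁺ f(x, µ, ν).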

Let us generalize the minimum property of H_µ(x).

Proposition 6.5. Let (x, y, ν) ↦ f_ν(x, y) be a nonnegative lower semicomputable function with F_ν(x) = log ν^y f_ν(x, y). Then for all x with F_ν(x) > −∞ we have

H_ν(y | x, ⌊F_ν(x)⌋) <⁺ −log f_ν(x, y) + F_ν(x).

Proof. Let us construct a lower semicomputable function (x, y, m, ν) ↦ g_ν(x, y, m) for integers m with the property that ν^y g_ν(x, y, m) ≤ 2^{−m}, and for all x with F_ν(x) ≤ −m we have g_ν(x, y, m) = f_ν(x, y). Such a g can be constructed by watching the approximation of f grow and cutting it off as soon as it would give F_ν(x) > −m. Now (x, y, m, ν) ↦ 2^m g_ν(x, y, m) is a uniform conditional test of y, and hence it is <∗ 2^{−H_ν(y|x,m)}. To finish the proof, substitute −⌊F_ν(x)⌋ for m and rearrange. □
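The cutoff construction in the proof can be illustrated by a small, purely finitary sketch (illustrative only, not code from the paper; the function `cutoff`, the dict representation of ν, and the list of approximation stages are all assumptions made for the example): given nondecreasing approximation stages of f, we keep raising g while the ν-average of the current stage stays at most 2^{−m}, and freeze g as soon as the next stage would push the average above the threshold.

```python
# Finitary illustration of the cutoff construction in Proposition 6.5:
# raise g along the approximation stages of f, but stop as soon as the
# next stage would give a nu-average above 2^(-m) (i.e. F_nu > -m).

def cutoff(stages, nu, m):
    """Return the last stage whose nu-average is at most 2**(-m).

    stages: list of dicts y -> f_t(y), nondecreasing in t;
    nu: dict y -> nu({y}), a probability measure on a finite set;
    m: integer threshold exponent.
    """
    g = {y: 0.0 for y in nu}            # stage 0: identically zero
    for f_t in stages:
        avg = sum(nu[y] * f_t[y] for y in nu)
        if avg > 2.0 ** (-m):           # would give F_nu > -m: cut off here
            break
        g = f_t                         # safe to raise g to this stage
    return g

# Example: uniform nu on two points, three growing stages, threshold 2^(-1).
nu = {"a": 0.5, "b": 0.5}
stages = [{"a": 0.1, "b": 0.1}, {"a": 0.2, "b": 0.3}, {"a": 0.5, "b": 0.6}]
print(cutoff(stages, nu, 1))            # last stage with average <= 0.5
```

In the proof, m is then chosen as −⌊F_ν(x)⌋, so that for the relevant x the cutoff never fires and g agrees with f.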

¹ We cannot use the test t_µ for this, since, as can easily be shown, it does not obey randomness conservation.
