The celebrated Riemann zeta function \zeta(s) has the following form for all s \in \mathbb{C} with real part greater than 1:
\begin{equation} \zeta(s) = \sum_{n \geq 1} n^{-s} \end{equation}
Actually, \zeta(s) can be analytically continued to all of \mathbb{C} \setminus \{1\}, with a unique pole at s = 1.
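(As a quick numerical sanity check of the defining series, one can truncate the sum; this is only a sketch, and it of course says nothing about the analytic continuation.)

```python
import math

def zeta_truncated(s, terms=100_000):
    """Partial sum of the Dirichlet series for zeta(s); only meaningful for Re(s) > 1."""
    return sum(n ** (-s) for n in range(1, terms + 1))

# Sanity check against the known value zeta(2) = pi^2 / 6
print(zeta_truncated(2.0))   # ~1.64492
print(math.pi ** 2 / 6)      # 1.6449340668...
```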
To define Chaitin's \Omega, we first have to fix some universal prefix-free partial recursive function U : \{0,1\}^* \rightarrow \{0,1\}^*. This simply means that we fix some computer whose legal programs, encoded as binary strings, carry their own "end of file" marker: no legal program is a prefix of another legal program. In addition, this computer is universal, i.e., it is able to simulate every other computer. Denoting by Dom(U) the set of legal programs of U, Chaitin's number is
\begin{equation} \Omega = \sum_{p \in Dom(U)} 2^{-|p|} \end{equation}
where |p| denotes the length of the program p. This number can be interpreted as a probability. Indeed, if we flip a fair coin infinitely many times, then 2^{-|p|} is the probability that the prefix of length |p| of the generated infinite binary sequence equals p. Since Dom(U) is prefix-free, these events are disjoint, so \Omega is the probability that some legal program is a prefix of the generated sequence, i.e., the probability that U halts when fed the generated infinite binary sequence. It can be shown that the binary expansion of \Omega is a Martin-Löf random sequence.
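To make this probabilistic reading concrete, here is a minimal sketch with a hypothetical prefix-free set of "halting programs" (the real Dom(U) is only computably enumerable, so the true \Omega can only ever be approximated from below); we simply evaluate the Kraft sum.

```python
from fractions import Fraction

# Hypothetical prefix-free toy domain -- NOT the domain of an actual universal machine.
toy_domain = ["0", "10", "110", "1110"]

def is_prefix_free(programs):
    """No program is a proper prefix of another."""
    return not any(p != q and q.startswith(p) for p in programs for q in programs)

def omega(programs):
    """Kraft sum: probability that a fair-coin sequence starts with some program."""
    return sum(Fraction(1, 2 ** len(p)) for p in programs)

assert is_prefix_free(toy_domain)
print(omega(toy_domain))  # 15/16; prefix-freeness guarantees the sum is at most 1
```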
In [1], Tadaki generalizes Chaitin's \Omega and defines, for all 0 < D < 1,
\begin{equation} \Omega(D) = \sum_{p \in Dom(U)} 2^{-\frac{|p|}{D}} \end{equation}
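Continuing the toy sketch (same hypothetical domain as above), Tadaki's generalization just replaces the exponent |p| by |p|/D; smaller D damps long programs more aggressively.

```python
toy_domain = ["0", "10", "110", "1110"]  # same hypothetical toy domain as above

def omega_D(programs, D):
    """Tadaki's Omega(D): sum of 2^{-|p|/D} over a (toy) domain."""
    return sum(2.0 ** (-len(p) / D) for p in programs)

for D in (1.0, 0.5, 0.25):
    print(D, omega_D(toy_domain, D))  # D = 1 recovers the ordinary Kraft sum
```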
Now, I guess everyone has already caught it, but let's state the (not so) funny part bluntly. For all real s > 1:
\begin{align} \zeta(s) &= \sum_{n \geq 1} \left( 2^{\log n}\right)^{-s} \\ \Omega(s^{-1}) &= \sum_{p \in Dom(U)} \left( 2^{|p|} \right)^{-s} \end{align}
Of course, these two quantities are not equal: \Omega(s^{-1}) only involves terms n^{-s} with n a power of 2, each weighted by the number of legal programs of the corresponding length. We can make the two expressions look even more similar by introducing the function m(l) = |\{ p \in Dom(U) : |p| = l\}| counting the number of legal programs of length l, with the convention that m(\log n) = 0 when n is not a power of 2.
\begin{align} \zeta(s) &= \sum_{n \geq 1} n^{-s} \\ \Omega(s^{-1}) &= \sum_{n \geq 1} m(\log n) \cdot n^{-s} \end{align}
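One can check the second display on the toy domain by grouping programs by length (using the convention above that m(\log n) vanishes when n is not a power of 2); the two ways of summing coincide by construction.

```python
from collections import Counter

toy_domain = ["0", "10", "110", "1110"]  # same hypothetical toy domain as above
s = 2.0

# Sum over programs: (2^{|p|})^{-s}
per_program = sum((2 ** len(p)) ** (-s) for p in toy_domain)

# Group by length: m(l) = #{p in Dom : |p| = l}, then sum m(log n) * n^{-s} over n = 2^l
m = Counter(len(p) for p in toy_domain)
per_length = sum(count * (2 ** l) ** (-s) for l, count in m.items())

print(per_program, per_length)  # equal, since the grouping is just a rearrangement
```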
The definition of \Omega(s^{-1}) really looks like a form of \zeta-regularization of the counting function m(\log n). Actually, we could even define, for any universal (not necessarily prefix-free) function C, with m_C(\log n) counting the number of legal programs of C of length \log n,
\begin{equation} \Omega_C(s^{-1}) = \sum_{n \geq 1} m_C(\log n) \cdot n^{-s} \end{equation}
provided that the real part of s is sufficiently large. In particular, because the domain of C is not necessarily prefix-free, the Kraft inequality no longer applies and the series may diverge as s \rightarrow 1.
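For instance (a deliberately extreme, hypothetical choice, ignoring whether such a C can really be universal), if every binary string were a legal program of C, then m_C(l) = 2^l and
\begin{equation} \Omega_C(s^{-1}) = \sum_{l \geq 0} 2^l \cdot 2^{-ls} = \sum_{l \geq 0} 2^{-l(s-1)} = \frac{1}{1 - 2^{1-s}} \end{equation}
which converges exactly when the real part of s is greater than 1 and blows up as s \rightarrow 1, whereas in the prefix-free case the Kraft inequality keeps the sum bounded by 1 even at s = 1.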
Question. Does such a regularization give more insight into random strings?
pb
EDIT: It turns out that Tadaki has already proposed an analogy [2] between \Omega(s^{-1}) and partition functions in statistical physics. Since \zeta naturally arises in statistical physics, the above similarity between \Omega(s^{-1}) and \zeta(s) is not a surprise.
EDIT: Truly, there is something going on here. I found this paper [3] by Calude and Stay, introducing zeta numbers for Turing machines.
[1] Tadaki, Kohtaro. A generalization of Chaitin's halting probability \Omega and halting self-similar sets. Hokkaido Mathematical Journal 31 (2002), no. 1, 219--253. doi:10.14492/hokmj/1350911778. http://projecteuclid.org/euclid.hokmj/1350911778
[2] Tadaki, Kohtaro. A statistical mechanical interpretation of algorithmic information theory. http://arxiv.org/abs/0801.4194
[3] Cristian S. Calude, Michael A. Stay, Natural halting probabilities, partial randomness, and zeta functions, Information and Computation, Volume 204, Issue 11, November 2006, Pages 1718-1739, ISSN 0890-5401, http://dx.doi.org/10.1016/j.ic.2006.07.003.