Defining concentration in terms of standard deviation leads to the Heisenberg uncertainty inequality.
In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function $f$ and its Fourier transform $g$ such that

$$g(y) \approx \int_{-\infty}^{\infty} \exp(-2\pi i x y)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^{\infty} \exp(2\pi i x y)\, g(y)\, dy,$$

where the "$\approx$" indicates convergence in $L^2$.

Note that entropy and variance behave very differently under rearrangement. Any two equimeasurable probability density functions have the same Shannon entropy, and in fact the same Rényi entropy of every order. The same is not true of variance: any probability density function has a radially decreasing equimeasurable rearrangement whose variance is (up to translation) less than that of any other rearrangement of the function, and there exist rearrangements of arbitrarily high variance, all having the same entropy.
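The relation between the two entropies can be illustrated numerically. The sketch below assumes the sharp Hirschman–Beckner bound $H(|f|^2) + H(|g|^2) \ge \log(e/2)$ for the Fourier convention used above (the constant is not stated in the text here and is supplied as an assumption); it approximates the continuous transform by direct quadrature and checks the bound for a Gaussian, where equality holds, and for a first Hermite function, where the inequality is strict. The grid size and extent are illustrative choices.

```python
# Numerical sketch (not from the original text): with the convention
#   g(y) = ∫ exp(-2πixy) f(x) dx,
# the differential Shannon entropies of |f|^2 and |g|^2 are assumed to satisfy
#   H(|f|^2) + H(|g|^2) >= log(e/2),
# with equality for Gaussians.
import numpy as np

x = np.linspace(-6.0, 6.0, 1201)   # shared grid for x and y
dx = x[1] - x[0]

# Quadrature matrix approximating the continuous Fourier transform:
# g(y_j) ≈ Σ_k exp(-2πi y_j x_k) f(x_k) dx
F = np.exp(-2j * np.pi * np.outer(x, x)) * dx

def entropy(p, dx):
    """Differential Shannon entropy -∫ p ln p of a density sampled on the grid."""
    q = p[p > 0]                    # drop exact zeros to avoid 0 * log 0
    return -np.sum(q * np.log(q)) * dx

def entropy_sum(f):
    """H(|f|^2) + H(|g|^2) for f normalized to unit L^2 norm."""
    f = f / np.sqrt(np.sum(np.abs(f) ** 2) * dx)
    g = F @ f
    return entropy(np.abs(f) ** 2, dx) + entropy(np.abs(g) ** 2, dx)

bound = np.log(np.e / 2)                  # ≈ 0.3069
gaussian = np.exp(-np.pi * x ** 2)        # self-dual Gaussian: equality case
hermite = x * np.exp(-np.pi * x ** 2)     # eigenfunction of the transform: strict inequality

print(entropy_sum(gaussian), bound)       # nearly equal
print(entropy_sum(hermite))               # strictly above the bound
```

Direct quadrature is used instead of the FFT so the code matches the continuous-transform convention above without the FFT's normalization and frequency-ordering bookkeeping. Note that rescaling the Gaussian leaves the entropy sum unchanged (the two entropies shift by $\pm\log a$ under dilation), mirroring the rearrangement discussion: entropy is insensitive to changes that can make the variance arbitrarily large.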