Bremermann's limit

Bremermann's limit, named after Hans-Joachim Bremermann, is a theoretical limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe. It is derived from Einstein's mass–energy equivalence and the Heisenberg uncertainty principle, and is c²/h ≈ 1.3563925 × 10⁵⁰ bits per second per kilogram.[1][2]
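
Making the arithmetic behind the quoted figure explicit (a minimal worked substitution, taking E = mc² for one kilogram of mass as in the derivation from the two principles named above):

```latex
\[
  R_{\max} \;=\; \frac{mc^{2}}{h}
  \;=\; \frac{(1\,\mathrm{kg})\,\bigl(2.998\times 10^{8}\,\mathrm{m/s}\bigr)^{2}}
             {6.626\times 10^{-34}\,\mathrm{J\,s}}
  \;\approx\; 1.356\times 10^{50}\ \text{bits per second.}
\]
```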

This value establishes an upper bound on the computational resources of any physical brute-force adversary, and is therefore relevant when designing cryptographic algorithms: it can be used to determine the minimum size of encryption keys or hash values required so that an algorithm could never be cracked by exhaustive search. For example, a computer with the mass of the entire Earth operating at Bremermann's limit could perform approximately 10⁷⁵ mathematical computations per second. If one assumes that a cryptographic key can be tested with only one operation, then a typical 128-bit key could be cracked in under 10⁻³⁶ seconds. However, a 256-bit key (which is already in use in some systems) would take about two minutes to crack. Using a 512-bit key would increase the cracking time to approaching 10⁷² years, without increasing the time for encryption by more than a constant factor (depending on the encryption algorithms used).
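
The figures above can be reproduced with a short order-of-magnitude calculation (a minimal sketch, assuming an Earth mass of about 5.97 × 10²⁴ kg and, as in the text, exactly one operation per candidate key):

```python
# Sketch: brute-force search times for an Earth-mass computer running at
# Bremermann's limit, assuming one operation per candidate key tested.

BREMERMANN_LIMIT = 1.3563925e50   # operations per second per kilogram
EARTH_MASS_KG = 5.972e24          # approximate mass of the Earth (assumed here)
SECONDS_PER_YEAR = 3.156e7        # roughly one year in seconds

ops_per_second = BREMERMANN_LIMIT * EARTH_MASS_KG   # ≈ 8e74, i.e. roughly 10^75

for key_bits in (128, 256, 512):
    keyspace = 2 ** key_bits                 # number of candidate keys
    seconds = keyspace / ops_per_second      # exhaustive-search time
    years = seconds / SECONDS_PER_YEAR
    print(f"{key_bits}-bit key: {seconds:.1e} s  (~{years:.1e} years)")

# Roughly: 128-bit ≈ 4e-37 s (under 10^-36 s), 256-bit ≈ 1.4e2 s (about two
# minutes), 512-bit ≈ 1.7e79 s ≈ 5e71 years (approaching 10^72 years).
```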

The limit has been further analysed in later literature as the maximum rate at which a system with a given energy spread can evolve into a state orthogonal to, and hence distinguishable from, its initial state.[3][4] In particular, Margolus and Levitin have shown that a quantum system with average energy E takes at least time h/(4E) to evolve into an orthogonal state.[5] This is one of the quantum speed limit theorems. However, it has been shown that chaining multiple computations, or access to quantum memory, in principle allows computational algorithms that require an arbitrarily small amount of energy and time per elementary computation step.[6][7]
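
Written out (a brief restatement of the Margolus–Levitin bound cited above; the one-kilogram substitution is illustrative, not a figure from the cited papers):

```latex
\[
  t_{\perp} \;\ge\; \frac{h}{4E},
  \qquad\text{so the rate of orthogonal transitions is at most}\quad
  \frac{4E}{h}.
\]
% With E = mc^2 for m = 1 kg, 4E/h is of order 10^50 transitions per second,
% the same order of magnitude as Bremermann's c^2/h figure quoted above.
```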

References

  1. ^ Bremermann, H. J. (1962). "Optimization through evolution and recombination". In Yovits, M. C.; et al. (eds.). Self-Organizing Systems 1962. Washington, D.C.: Spartan Books. pp. 93–106.
  2. ^ Bremermann, H. J. (1965). "Quantum noise and information". 5th Berkeley Symposium on Mathematical Statistics and Probability. Berkeley, California: University of California Press.
  3. ^ Aharonov, Y.; Bohm, D. (1961). "Time in the Quantum Theory and the Uncertainty Relation for Time and Energy" (PDF). Physical Review. 122 (5): 1649–1658. Bibcode:1961PhRv..122.1649A. doi:10.1103/PhysRev.122.1649. Archived from the original (PDF) on 2016-03-04. Retrieved 2013-05-23.
  4. ^ Lloyd, Seth (2000). "Ultimate physical limits to computation". Nature. 406 (6799): 1047–1054. arXiv:quant-ph/9908043. Bibcode:2000Natur.406.1047L. doi:10.1038/35023282. PMID 10984064. S2CID 75923.
  5. ^ Margolus, N.; Levitin, L. B. (September 1998). "The maximum speed of dynamical evolution". Physica D: Nonlinear Phenomena. 120 (1–2): 188–195. arXiv:quant-ph/9710043. Bibcode:1998PhyD..120..188M. doi:10.1016/S0167-2789(98)00054-2. S2CID 468290.
  6. ^ Jordan, Stephen P. (2017). "Fast quantum computation at arbitrarily low energy". Phys. Rev. A. 95 (3): 032305. arXiv:1701.01175. Bibcode:2017PhRvA..95c2305J. doi:10.1103/PhysRevA.95.032305. S2CID 118953874.
  7. ^ Sinitsyn, Nikolai A. (2018). "Is there a quantum limit on speed of computation?". Physics Letters A. 382 (7): 477–481. arXiv:1701.05550. Bibcode:2018PhLA..382..477S. doi:10.1016/j.physleta.2017.12.042. S2CID 55887738.