Friday, May 31, 2024

Quantum entanglement explained by the solution of General Relativity discovered by Miguel Alcubierre

Index

1. Summary: Quantum entanglement, explained through a comprehensive approach combining the Alcubierre metric, the Maldacena conjecture and Barbour's parallel universes

2. Possible ways in which our Universe (that of ordinary matter) can "ask for help" from the Universe of the quantum vacuum

3. Alternative explanation of the double-slit experiment

4. Alternative explanation of nuclear fusion

5. Alternative explanation of Quantum Electrodynamics

6. Einstein-Rosen wormhole view

7. Vision of the infinite worlds of Everett

8. Implications for String Theory and Supersymmetry

9. Julian Barbour's view of the Big Bang and quantum entanglement

10. Vision of the Megauniverse that respects CPT symmetry

11. Big Bang from a black hole created in a previous Universe, a black hole that resolves the so-called "Information Paradox" through the "Holographic Principle"

12. Vision of superconductivity and Cooper pairs

13. Reversal of the state of a photon (experiment performed in Austria)

14. Joint action of the dark energy and dark matter of the Anti-universe to cause distortions of space-time in our Universe. Application of the Maldacena Conjecture

15. Artificial nuclear fusion vs. solar fusion and quantum tunneling

16. Alternative to the accepted nucleosynthesis of the Big Bang

17. Levitation explained

18. Alternative explanation of how to extract energy from the vacuum

19. Magnetic resonance, entanglement and the state of consciousness in our brain

20. Journeys to the past

21. Black holes grow not only by the matter they absorb, but above all by the dark energy of the vacuum


ANNEXES

Annex 1. Schrödinger’s wave equation, applied to the case of a quantum tunnel

Annex 2. Wave-particle duality, Louis de Broglie's postulate and wavelengths of an electron and a baseball

Annex 3. Heisenberg's uncertainty principle for energy-time

Annex 4. Double-slit experiment

Annex 5. Feynman path integral

Annex 6. Types of virtual particles

Annex 7. Loop quantum gravity

Annex 8. Vacuum energy. The greatest discrepancy in the history of science

Annex 9. Einstein's Equation of General Relativity

Annex 10. Dirac equation

Annex 11. Alcubierre metric

Annex 12. Key steps of evolution

Annex 13. Quantum mechanics in biological processes

Annex 14. Quantum tunneling to achieve nuclear fusion in the Sun

Annex 15. Is our consciousness quantum?

Annex 16. String Theory and Supersymmetry

Annex 17. Primordial black holes, MACHOs and WIMPs

Annex 18. Maldacena Conjecture / AdS/CFT Correspondence: the equivalence between a string theory or supergravity defined on a certain class of anti-de Sitter space and a conformal field theory defined on its boundary, with one dimension fewer.

Annex 19. A photon has gone back in time.

Annex 20. Cosmological Model of the Big Bang Lambda-CDM (Cold Dark Matter) 

 

1. Summary: Quantum entanglement, explained through an approach that integrates the Alcubierre Metric, the Maldacena Conjecture and Barbour's Parallel Universes

Starting point: a few key equations

Einstein's General Relativity field equations:

$$G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}$$
Matter tells space-time how to curve, space-time tells matter how to move (John Wheeler)

Schrödinger wave equation for a free particle:

$$i\hbar \frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m} \nabla^2 \psi$$
In a state of entanglement between two particles, when one of them collapses to a definite state, the other instantaneously takes the complementary state.

The change is instantaneous, even though both particles are thousands of light-years apart. 
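The complementary-outcome behavior described above can be illustrated with a minimal toy simulation (my own sketch, not from the text; it only reproduces the perfect anticorrelation of same-basis measurements, not the full quantum statistics):

```python
import random

def measure_singlet_pair(n_trials=10000):
    """Toy model: each measurement of particle A yields a random
    definite state (0 or 1); particle B is then found in the
    complementary state, however far away it is. Illustration only:
    real entanglement statistics need the full quantum formalism."""
    results = []
    for _ in range(n_trials):
        a = random.choice([0, 1])   # A collapses to a random definite state
        b = 1 - a                   # B instantly shows the complementary state
        results.append((a, b))
    return results

pairs = measure_singlet_pair()
# Every pair is complementary: the two outcomes always disagree
assert all(a != b for a, b in pairs)
```

Note that each individual outcome is still random; only the correlation between the two particles is fixed, which is why no usable signal travels between them.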

Heisenberg's principle of indeterminacy (energy-time form):

$$\Delta E \cdot \Delta t \ge \frac{\hbar}{2}$$
It is permissible to draw large amounts of energy from the vacuum if done during infinitesimal times, without violating this principle.
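As a numeric illustration of this trade-off (my own sketch, using standard constants), here is the longest time the vacuum can "borrow" a given energy, applied to a virtual electron-positron pair:

```python
# How long may the vacuum "borrow" an energy dE without violating
# the energy-time uncertainty relation dE * dt >= hbar / 2 ?
HBAR = 1.054571817e-34      # reduced Planck constant, J*s
M_E = 9.1093837015e-31      # electron mass, kg
C = 2.99792458e8            # speed of light, m/s

def max_borrow_time(delta_e_joules):
    """Longest time dt compatible with dE * dt >= hbar / 2."""
    return HBAR / (2.0 * delta_e_joules)

# Energy needed to create a virtual electron-positron pair: 2 * m_e * c^2
pair_energy = 2.0 * M_E * C**2
dt = max_borrow_time(pair_energy)
print(f"pair energy ~ {pair_energy:.3e} J, allowed lifetime ~ {dt:.3e} s")
# The larger the borrowed energy, the shorter the allowed time.
```

The result is on the order of 10^-22 seconds: enormous energies are indeed permissible, but only for vanishingly short intervals.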

Dirac equation:

$$\left(i\gamma^\mu \partial_\mu - \frac{mc}{\hbar}\right)\psi = 0$$
It is the version of the Schrödinger wave equation that considers relativistic effects.

Predicts the existence of antimatter.

Feynman path integral:

$$\langle x_f | e^{-iHt/\hbar} | x_i \rangle = \int \mathcal{D}[x(t)]\, e^{iS[x]/\hbar}$$
https://naukas.com/2018/06/13/ultimo-articulo-hawking-la-naukas-iii-propuesta-ausencia-frontera/

It affirms that our reality is the sum of all possible realities.

 

My Vision

That instantaneous communication that occurs when two particles are entangled can only be possible because "something" is greatly distorting space-time.

Alcubierre found some solutions of the Theory of Relativity that, through enormous distortions of space-time, allow travel WITH space, not through space, without violating the maximum limit of speed allowed, the speed of light. 

For me, the fundamental conclusion of this paper is the following: through the Alcubierre Metric, drawing the enormous amount of negative energy we need from the vacuum, we curve space-time enough to allow the instantaneous action that takes place between entangled particles separated by enormous distances.

We take those large amounts of energy out of the vacuum (they are negative energies and masses) during infinitesimal times, which allows us not to violate Heisenberg's Uncertainty Principle.

The energies that, in Alcubierre's metric, we need to take out of the vacuum to travel NOT through space but WITH space:

Energy density needed to arrive at the Alcubierre Metric (as measured by Eulerian observers; $v_s$ is the bubble speed, $f$ the shape function, and $\rho^2 = y^2 + z^2$):

$$T^{\mu\nu} u_\mu u_\nu = -\frac{c^4}{8\pi G}\,\frac{v_s^2\,\rho^2}{4\,r_s^2}\left(\frac{df}{dr_s}\right)^2$$
This energy density, taken directly from the equations of General Relativity, is negative and therefore requires "exotic matter" to cause the deformations of space-time sought.
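For reference, the warp-bubble line element that Alcubierre proposed can be written (in units with $c = 1$; $v_s$ is the bubble speed, $f(r_s)$ the shape function centered on the bubble, and this is my transcription of the standard form) as:

$$ds^2 = -dt^2 + \bigl(dx - v_s\, f(r_s)\, dt\bigr)^2 + dy^2 + dz^2$$

Inside the bubble $f \approx 1$ and the contents ride along with space itself; far away $f \approx 0$ and space-time is flat.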

In summary, quantum entanglement is produced by drawing from the vacuum the negative energy density needed by the Alcubierre Metric.

When something needs to be transmitted to the rest of the entangled particles, the Alcubierre energy of the vacuum in the dark Universe is used; this causes a great deformation of space-time in our Universe, allowing the instantaneous transmission of information between the entangled particles, regardless of the distance between them.

The energy we draw from the vacuum is "dark energy."

This energy that we draw from the vacuum creates an enormous gravity, but repulsive in sign.

The vacuum has created a gravitational repulsion.

This gravity has created a huge warping of space-time, according to Einstein's equations.

In the state of quantum entanglement, we enter the conditions of the Alcubierre metric:

     -There is the possibility of instantaneous displacements in space (without violating the speed of light as a universal limit)

     -There is the possibility of crossing the potential barriers of quantum tunnels

The next question is: what is that world of the vacuum like, which contains that anti-gravitational energy?

My view is that we are facing a black hole that has been created by the collapse of a previous Universe.

But it is NOT an infinitesimal black hole; it obeys the rules of the Theory of Loop Quantum Gravity and, therefore, has a finite radius.

What other qualities would that predecessor black hole of our Universe have?

First, I think it would be more accurate if we call that black hole "the Universe before" our Big Bang.

Another possibility would be that this "dark Universe" was created in the instants immediately prior to the Big Bang of the particle Universe that we know (that would mean there really were two Big Bangs: one of the dark Universe and another of the ordinary matter we know).

In any case, the idea of a Big Crunch of a previous Universe has very attractive connotations:

    -Following Julian Barbour’s theory, in that previous Universe time may have gone backwards.

    -We are talking about thermodynamic time, that is, the time related to entropy

    -According to Julian Barbour, in a universe where time runs backwards, entropy (i.e., disorder) decreases and, therefore, complexity increases.

    -In Julian Barbour's parallel universe, the direction of time is dominated by gravity, not thermodynamics.

    -In short, that Universe prior to our Big Bang would have reached phases / states of great complexity.

On the other hand, Quantum Mechanics tells us that information (that is, quantum states) cannot disappear.

We know, from the Holographic Principle, that the information inside a black hole can be reflected on the event horizon (remember that we are talking about a black hole of finite, NOT infinitesimal, radius).

In short, we would be facing a black hole that contains all the quantum information of the previous Universe. 

Thus, following Barbour, it would be a mirror Universe to ours, where thermodynamic time ran backwards, complexity reached maximum levels, and the information of everything that happened in it is recorded on its event horizon.

What could this previous Universe be made of? Antimatter would be a good candidate, since it would allow CPT symmetry to be fulfilled.

Candidate particles that exist in that Universe: the SUSY particles of Supersymmetry, antimatter neutrinos, and the Majorana particle (which, as we know, is matter and antimatter at the same time).

In any case, to cause these gravitational effects in our Universe, they must necessarily be very heavy particles.

Summary of this alternative Big Bang:

In a previous Universe, all matter, energy and information accumulate in a black hole with NO infinitesimal radius (according to loop quantum gravity).

In that black hole all the information / complexity of the previous Universe accumulates → the information has NOT been lost, it can NOT be lost, according to the laws of Quantum Mechanics.

Starting point of our Big Bang → we start from a primordial black hole, which is created before the Big Bang and not shortly after (according to the theory of the double Big Bang).

We also start from all the information / complexity of the previous Universe, recorded in the finite Schwarzschild radius of the black hole that will be the father of our Big Bang.

Therefore, before the Big Bang of our conventional matter Universe there was a Universe with the following characteristics:

-Maximum energy, in the form of dark energy

-Maximum information → all quantum states / lived experiences of the previous Universe are recorded in the finite Schwarzschild radius

Possibility: Evolution may have taken key steps by tapping into information recorded at the event horizon of the dark/earlier Universe

This previous Universe still exists.

The communication between this dark Universe and ours takes place through gravity, although to be more precise we should say that it is through antigravity.

That antigravity, created in the dark Universe but transmitted to our Universe of ordinary matter, is what originates the quantum entanglement effects that we see.

Last but not least, if we found a relationship between "whatever calls dark energy into action" and space-time, we would be facing the initial equation of our Big Bang, of all the Physics we know.

A perfect hypothesis to explain this first equation: the Maldacena Conjecture, which relates General Relativity and Quantum Mechanics through a duality between an anti-de Sitter space-time and a conformal field theory on its boundary.

Based on this duality, we could explore two working hypotheses:

-Quantum entanglement produces distortions of space-time

-Distortions of space-time produce quantum entanglement

Possible cause-effect dynamics of this duality:

-A quantum mind and/or a Universe with a high degree of complexity, capable of originating states of entanglement, produces distortions of space-time.

-These distortions of space-time would be capable of producing entanglements in the new Big Bang of ordinary matter.

-That complexity may come from a previous Universe, collapsed into a black hole of finite radius and, therefore, not infinitesimal.

-That complexity/solutions would have been etched into the event horizon of that black hole.

Final conclusion: the information recorded in the black hole of a previous Universe is the origin of the space-time present in our Universe, which governs all existing interactions between ordinary matter. 

 

2. Possible ways in which our Universe (that of ordinary matter) can "ask for help" from the Universe of the quantum vacuum

 The vacuum is full of quantum fluctuations.

These fluctuations create a particle and its antiparticle, for a time short enough that Heisenberg's Indeterminacy Principle is not violated.

Entangled particles emit virtual photons (quantum electrodynamics)

The virtual particles are there, accompanying the real particle → virtual particles appear and disappear continuously, accompanying the entangled particle (Feynman diagrams).

According to the Theory of Supersymmetry, SUSY, for every particle in our Universe there is another supersymmetric particle, much heavier than it.

My conclusion is that these supersymmetric particles of the dark Universe are what produce the great distortions of space-time in our Universe that give rise to the instantaneous action between entangled particles.

Another alternative explanation to supersymmetric particles would be that this space-time distortion action in our Universe is caused by antimatter neutrinos that separated from the ordinary matter Universe at the dawn of the Big Bang (that antimatter in a separate Universe would be necessary to explain CPT symmetry).

At a certain moment, when the measurement occurs on an entangled particle, its virtual particles cause the Alcubierre Metric to come into action → they draw an enormous negative energy from the vacuum, deform space-time radically, and send instantaneous information to the rest of the entangled particles so that they collapse into the state complementary to the one measured in the original particle.

That information is sent instantaneously, regardless of the distance at which the other entangled particles are → another version: particle A is instantaneously where particle B is; both are in the same space and collapse in coordination.

Same explanation for quantum tunneling → we draw energy from the vacuum, via Alcubierre metric conditions, to jump the potential barrier.

 

3. Alternative explanation of the double-slit experiment:

-The particle, entering the conditions of the Alcubierre Metric, is in both slits at the same time

-The particle, entering those conditions, is instantaneously before and after the grid of the double slit

 

4. Alternative explanation of nuclear fusion

Schrödinger's wave equation, for the case of a quantum tunnel, has as its solution an exponentially decaying wave function inside the barrier; the resulting transmission probability is approximately $T \approx e^{-2\kappa L}$, with $\kappa = \sqrt{2m(V-E)}/\hbar$:

http://hyperphysics.phy-astr.gsu.edu/hbasees/quantum/barr.html
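As a hedged numerical sketch (my own illustration, not from the source), the tunneling probability through a rectangular barrier of height V and width L, for a particle of energy E < V, is approximately T ≈ exp(−2κL) with κ = √(2m(V−E))/ħ:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_P = 1.67262192e-27     # proton mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def transmission(V_eV, E_eV, width_m, mass=M_P):
    """Approximate tunneling probability T ~ exp(-2*kappa*L) through a
    rectangular barrier of height V for a particle of energy E < V."""
    kappa = math.sqrt(2.0 * mass * (V_eV - E_eV) * EV) / HBAR
    return math.exp(-2.0 * kappa * width_m)

# Toy numbers (illustrative only, NOT the real Coulomb barrier of a
# nucleus): a 1 MeV barrier, a 0.5 MeV proton, femtometre-scale widths.
t = transmission(1.0e6, 0.5e6, 5.0e-15)
print(f"transmission probability ~ {t:.3e}")
# Thinner or lower barriers give a dramatically higher probability.
assert transmission(1.0e6, 0.5e6, 1.0e-15) > t
```

The exponential sensitivity to barrier width and height is the point: classically the proton never enters, while quantum mechanically the probability is small but strictly nonzero.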

 

If we manage to get the proton trying to enter the nucleus into entangled conditions, the following will happen: Alcubierre metric → instantaneous change of position by curvature of space-time.

The proton can enter the nucleus by two methods:

-It crosses the barrier of the electromagnetic force in the same way the tunnel effect works

-It does not pass through the space defined by the barrier → it is at point A (outside the barrier) and, because it is in Alcubierre metric conditions, the proton instantly appears inside the nucleus, at point B

 

5. Alternative explanation of Quantum Electrodynamics

In the accepted theory, two electrons repel each other by exchanging virtual photons.

http://hyperphysics.phy-astr.gsu.edu/hbasees/Particles/expar.html

 

My Vision

Let's imagine that the virtual photons are there (as we know, the vacuum is not empty, but full of virtual particles)

Electron A is changing the quantum state of the virtual photons it encounters in its path.

Virtual photons cause the conditions of the Alcubierre Metric to appear → the virtual photons altered by the passage of electron A arrive instantaneously where electron B is → the virtual photons do NOT move from the range of the electrons; they communicate their quantum states to each other through the torsion of space-time originated by the Alcubierre Metric.

More direct/revolutionary explanation:

All electrons in the Universe are entangled: sometimes the entanglement collapses, and under certain conditions it holds. The virtual photons of electron A are there, accompanying the real electron.

At a certain moment, the virtual particles involved cause the Alcubierre Metric to come into action → they draw an enormous negative energy from the vacuum, deform space-time, and send information to the rest of the virtual particles of the electrons that may be in the path of electron A.

The entangled particles (the virtual photons) of electron B receive the information instantaneously and repulsion occurs.

All electrons in the Universe (whose properties are identical) are likely able to enter entangled conditions.

 

6. Einstein-Rosen wormhole view

In the original explanation, at the center of a black hole the singularity can become a bridge leading from the center of the black hole to another place in the Universe.

In the mathematics provided by Rosen, the singularity does not reach zero size or infinite density.

Kip Thorne (friend of and advisor to Carl Sagan for his film "Contact") worked out the mathematical solutions of the equations of General Relativity that make a black hole traversable.

https://www.konradlorenz.edu.co/blog/que-son-los-agujeros-de-gusano/

 

To keep the ends of the wormhole open and keep it stable, it is necessary to provide an enormous amount of "exotic energy".

 

My Vision

The entangled particles are linked, via the Alcubierre Metric, by quantum wormholes that are among us and that are NOT the same as those in the distant galaxies of the Universe.

The black holes / wormholes responsible for the Alcubierre Metric are among us and "live" in the dark Universe.

At the quantum scale, there is a vacuum with negative energy that keeps viable and stable wormholes open.

 

7. Vision of the infinite worlds of Everett

 

https://es.wikipedia.org/wiki/Universos_paralelos

 

According to Wallace, these worlds are emergent structures, that is, "they are not directly defined in the language of microphysics, but that does not mean that they are somehow independent of the underlying physics."

 

My Vision

Those emergent worlds / structures inhabit the dark Universe.

The dark Universe that causes the Alcubierre Metric is formed by structures / large amounts of virtual particles.

These virtual particle structures could be the infinite Everett worlds / universes → the photon going to the double slit creates, below, in the vacuum, an Everett world that defines the outcome of the experiment.

 

8. Implications for String Theory and Supersymmetry

Supersymmetry, a prediction of string theory, says that all the subatomic particles we know, such as electrons, photons and gravitons (do they exist?), must have a much heavier equivalent, called "S particles".

 

http://www.nocierreslosojos.com/teoria-cuerdas/

The predicted S particles are so incredibly heavy that, to this day, particle accelerators have not detected them.

 

My Vision

The dark Universe is formed by S particles that create the Alcubierre Metric → the S particles are responsible for the distortion of space-time that gives rise to entanglement conditions.

Particles, in our world, become entangled.

Underneath, in the dark Universe the corresponding S particles achieve entanglement.

What SUSY (Supersymmetry) postulates is that each particle of the Standard Model corresponds to a supersymmetric partner with the opposite spin type.

That is, each fermion (leptons and quarks), which has half-integer spin, corresponds to a boson (integer spin), and each boson (integer spin) corresponds to a fermion (half-integer spin).

Therefore, the number of particles predicted by SUSY would be double that of the Standard Model.

As we have said before, S particles are incredibly heavy and would be the ones in the dark Universe.

 

9. Julian Barbour's view of the Big Bang and quantum entanglement

Bases of the Theory

According to Julian Barbour, the direction of time is governed by gravity and NOT by thermodynamics.

https://www.abc.es/ciencia/abci-bang-pudo-fabricar-futuros-diferentes-202103070858_noticia.html

 

The Big Bang would simply be the state with the lowest level of chaos and entropy (it would come from a previous Bounce).

From there, two Universes are created: one in which we live, in which time moves forward (chaos and entropy increase), and another mirror Universe where time moves backwards, that is, where chaos decreases, complexity increases and entropy decreases → in this Universe it is not entropy that dominates but the enormous forces of gravity.

 

My Vision

This second Universe is the dark Universe, which distorts space-time, creating the conditions of the Alcubierre Metric → it is the one that produces the entanglements and tunneling effects in our Universe.

Conclusion from SUSY particles and Barbour's Big Bang: the dark Universe is made up of S particles (SUSY) and walks towards complexity → it seeks complexity and helps complexity whenever our Universe "asks for it".

Simpler explanation: two Big Bangs are created, one for ordinary matter and one for dark matter.

The Big Bang of dark matter creates a primordial black hole.

In the Big Bang of ordinary matter time goes forward (order decreases), and in that of dark energy / dark matter time goes backwards (complexity increases).

 

10. Vision of the Megauniverse that respects CPT symmetry

Bases of the Theory

Our Universe would have an antimatter companion on the other side of the Big Bang:

https://www.epe.es/es/tendencias-21/20220907/universo-tendria-companero-antimateria-lado-75131498

That second universe would be like a twin of ours, but like a mirror image: everything it contains is inverted with respect to ours. Even time, instead of moving towards the future, does so towards the past (although for the purposes of that universe we are the ones who go the other way around).

If that model were confirmed, it would mean that the universe we know and have studied is only one part of a much more complex entity: a mega universe formed, on the one hand, by our universe and, on the other, by an inverse universe.

The authors of this research, Latham Boyle, Kieran Finn and Neil Turok, from the Perimeter Institute for Theoretical Physics in Canada, have reached this conclusion by delving into the weaknesses of the current Cosmological Model.

One of these weaknesses is a small unresolved contradiction: if our Universe is continuously expanding, it would theoretically be violating a fundamental symmetry of nature, called CPT symmetry (for the initials of Charge, Parity and Time).

That symmetry indicates that if the charges, parity, and time of a particle interaction are reversed, that interaction will always behave in the same way (it will be symmetric).

The researchers believe that this is not the case in the Universe we see around us, in which time moves forward as space expands, and in which there is more matter than antimatter. However, that symmetry is fulfilled if this Anti universe also exists.


In the complex mega universe, CPT symmetry would be fulfilled because in one of its manifestations (the Anti universe) not only does time pass in a direction opposite to ours, but it is also dominated by antimatter. The mirror image of both twin universes compensates for possible mismatches.

These authors consider that this symmetric model of a complex mega universe, composed of two mirror universes, is not only consistent with the known history of cosmic expansion, but also provides a direct explanation for dark matter.

On the one hand, that complex mega universe, composed of two opposite universes, can expand and fill with particles without the need for a long period of rapid expansion known as inflation (which perhaps we have mistakenly attributed to our universe), so it would not violate the basic symmetry of nature (CPT).

On the other hand, this complex universe also solves the mystery of dark matter: it would be nothing more than a new type of neutrino, not yet observed, which can only exist in the "other" universe.

 

My Vision

This Anti universe, composed of very heavy particles, neutrinos and/or S particles, is the one that distorts our space-time through the Alcubierre Metric.

This mirror Anti universe interacts with our Universe through gravity and would explain the paradox of the Hubble constant, which indicates that the Universe is expanding faster than calculated in the current Cosmological Model.

 

11. Big Bang from a black hole created in a previous Universe, a black hole that resolves the so-called "Information Paradox" through the "Holographic Principle"

Bases of the Theory

The holographic principle: the most beautiful advance towards quantum gravity:

https://estudiarfisica.com/2015/06/26/el-principio-holografico-el-mas-bello-avance-hacia-la-gravedad-cuantica/

The thermodynamics of black holes arose because it was necessary to assign entropy to them within a proper theoretical framework: if they did not possess it, it would be possible to get rid of the entire entropy of the Universe by throwing things into a black hole.

It arises by combining black holes, determinism, and quantum gravity. 

Hawking said that what black holes radiated was completely random.

This interpretation of Hawking was NOT respectful of the determinism of the wave function of Quantum Mechanics.

The wave function is deterministic → if you know the wave function of a system, you can deterministically calculate its evolution following the Schrödinger equation, estimating what that function will be some time later.

Moreover, the wave function propagates all the information in the quantum system, so it is NOT conceivable that it is destructible.

In Statistical Mechanics, when we talk about the entropy of a system, we are counting the amount of information we have about that object / system. 

Bekenstein → the entropy of black holes is an entropy due to quantum entanglement.

Particles outside the black hole and those inside became entangled when the black hole formed.

Bekenstein's formula:

$$S = \frac{k_B\, c^3 A}{4\, G \hbar} = \frac{k_B\, A}{4\, l_P^2}$$

S: entropy of the black hole

A: area of the black hole's event horizon


Proposal of the Holographic Principle: what we see happening on the horizon of a black hole is a perfect representation of what happens inside → we would be seeing the interior of the hole without having to enter it.

Restriction: a certain physical volume of our Universe can NOT contain more information than can be encoded at its boundary

What would be the minimum unit of information, the cosmic bit?

The natural candidate is the Planck area, $l_P^2 = G\hbar/c^3 \approx 2.6 \times 10^{-70}\ \mathrm{m}^2$.

With this reasoning, the proposal was that the horizon of the black hole contained one bit of information for each small enclosure of size equal to the Planck area on its surface. A black hole storing three million bits of quantum information would have to have an area of three million Planck areas, which are tiny.

If we go to a very simple case, such as a black hole of one centimeter radius (which is roughly what the Earth's would measure if it were compressed), the information it could store would be on the order of $4\pi r^2 / l_P^2 \approx 10^{66}$ bits.

To calculate the area of the hole we have used the formula for the area of a sphere, $A = 4\pi r^2$.

This is outrageous. A normal computer stores no more than $10^{13}$ bits, a practically negligible amount by comparison. The Earth itself, in principle, requires fewer bits of information to describe than that amount.
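The one-centimeter example can be checked numerically (my own sketch, adopting the text's proposal of one bit per Planck area; the exact coefficient in Bekenstein's formula would change the count only by a small factor):

```python
import math

G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8         # speed of light, m/s
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_EARTH = 5.972e24       # Earth mass, kg

# Planck length and Planck area
l_planck = math.sqrt(HBAR * G / C**3)
a_planck = l_planck**2

# Schwarzschild radius of the Earth, r_s = 2GM/c^2 (~9 mm, i.e. ~1 cm)
r_s = 2.0 * G * M_EARTH / C**2

# Holographic bit count for a 1 cm black hole: one bit per Planck area
area = 4.0 * math.pi * 0.01**2          # area of a sphere of radius 1 cm
bits = area / a_planck
print(f"Earth r_s ~ {r_s*1000:.1f} mm, holographic capacity ~ {bits:.2e} bits")
```

This confirms both claims: the Earth compressed to a black hole would indeed be about a centimeter across, and its horizon could hold on the order of 10^66 bits, against roughly 10^13 for an ordinary computer.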

Let's take another example raised by Susskind himself: filling the entire observable universe with books. If we count each character in a book as one bit, a book holds approximately 6000 bits per cubic centimeter. The volume of the observable universe, on the other hand, is $4 \times 10^{80}$ cubic meters. That enormous number of books could be accommodated as bits on the boundary of a black hole of just 7 kilometers.

Consequently, the information in the universe is very poorly concentrated compared to what it might be.

The proposal of 't Hooft and Susskind is known as the holographic principle because it treats the interior of the black hole as if it were a hologram encoded on its surface, in the same way that in science fiction films such as Star Wars the three-dimensional images they use to communicate are encoded in the plane that generates them.

Within the proposal is included the axiom that, in the universe, a volume delimited by a certain area cannot contain more information than a black hole with that area would have and that, analogously, a given amount of information cannot be compressed further than a black hole would compress it. If there were something in the cosmos with a higher density of information than a black hole, the proposal would have to be revised, at least by reducing the size of the bits so that more fit in less space.

Finally, thanks to the holographic principle, the paradoxes of the black hole ceased to exist. The information is not lost because it is constantly recorded at the boundary of the black hole, and the two observers come to the same conclusions.

Leonard Susskind threw himself into the adventure explaining this proposal in his article "The universe as a hologram."

Holographic determinism:

From the above it is obvious that the information paradox disappears. Wave functions propagate and evolve deterministically, encoded on the horizon of the black hole for the external observer, while the internal observer will see things enter and later come out, ejected at some point as thermal radiation, but without having lost their identity.

 

My Vision

In a previous Universe, all matter, energy, and information accumulate in a black hole with NO infinitesimal radius (according to loop quantum gravity).

In that black hole all the information / complexity of the previous Universe accumulates → the information has NOT been lost, it can NOT be lost, according to the laws of Quantum Mechanics.

Starting point of our Big Bang → we start from a primordial black hole, which is created before the Big Bang and not shortly after (according to the theory of the double Big Bang).

We also start from all the information / complexity of the previous Universe, recorded in the finite Schwarzschild radius of the black hole that will be the father of our Big Bang.

As a summary, before the Big Bang of our Universe there were:

-Maximum energy, in the form of dark energy

-Maximum information → all quantum states/lived experiences of the previous Universe are recorded in the finite Schwarzschild radius

What happens when our Big Bang begins?

-All elementary particles are created

-Nucleosynthesis / nuclear fusions occur without reaching millions of degrees → protons pass from outside the hydrogen nucleus to the interior by quantum tunneling

How does this phenomenon of creation occur in our Big Bang? The information in the black hole knows how to create a new Universe, and the energy carries it out with the powerful tool of the enormous distortion of space-time that takes place if we enter the conditions of the Alcubierre Metric.

Who is the real actor determining how the evolution of the newly created Big Bang will unfold? The information / experience accumulated in the black hole at the end of time of the previous Universe.

In the previous Universe, the arrow of thermodynamic time ran backwards (Barbour's parallel universe) → that universe was moving towards minimum entropy, maximum complexity.

Finally: these distortions of space-time must have triggered gravitational waves that should be observable.

 

12. Vision of superconductivity and Cooper pairs


https://gefesrsef.wordpress.com/2016/12/11/las-propiedades-emergentes-y-su-papel-en-la-superconductividad/

 

Basis of the theory

Instead of the classical explanation based on changes in the ion lattice → the electrons in the pair take energy from the vacuum and enter the entanglement phase; the two electrons respond to the same wave equation → each of them knows at each moment what the other does, and their transit through the lattice of positive ions is much easier than what they would have found separately.

 

My Vision

"Electrons forming Cooper pairs seek help in the dark Universe" → they enter the conditions of the Alcubierre metric.

In addition, the dark Universe gives them the keys to transit more easily through the lattice of positive ions.

And this is because the dark Universe is made to allow transits towards complexity (Barbour's Big Bang), to seek more complex solutions.

Finally, this request for help, this entanglement, causes all the Cooper pairs of the superconductor to be entangled → the formation of Cooper pairs takes place simultaneously throughout the conductor, so electrical conduction becomes the "march of a small army of Cooper pairs that no one can stop".

 

13.Reversal of the state of a photon (experiment done at the University of Austria)

Bases of the Theory

If we run time backwards, we could know all the states that this photon has had throughout its history → we could reach the moment it was created, near the Big Bang.

To make the photon go backwards, we use "universal rewinding" protocols.

 

My Vision

We help ourselves from the dark Universe, where time goes backwards, order increases and entropy decreases.

The state of the photon will go backwards if we manage to enter distortions of space-time produced by the Alcubierre metric.

If we manage to reverse time, that is, the states of a photon, could we achieve the same with a cell? → we would revert its adult cell state to that of a pluripotent cell.

Same dynamic for evolutionary leaps? → instead of struggle / competition in evolutionary processes → cooperation, which is achieved by "asking for help" from the dark Universe, the primordial hole, the quantum vacuum, which is governed by a backward tempo, by the transition to greater complexity, by the creation of more complex structures.

 

14.Joint action dark energy – dark matter of the Anti-universe to cause distortions of space – time in our Universe. Application of the Maldacena Conjecture

Let's not forget that mass is nothing more than condensed energy.

Dark energy, when it wants to create a constant source of antigravitational repulsion, creates a primordial black hole of negative matter; but when that dark Universe is required punctually, so that its energy, for example, helps ordinary matter to enter entangled conditions or to pass through a quantum tunnel → it acts directly as dark energy and does NOT use the primordial black hole.

Thus, the required action would be more flexible and punctual: I go wherever there is entanglement, a quantum tunnel or a double slit, I perform the punctual help / the punctual action and, when my mission ends (when the entanglement is undone), I, dark energy, disappear and go to another place where my services are required.

Thus, there would be point jets of dark energy, which would produce, for example, these tunnels.

The world would thus be simpler → dark energy, directly, is what produces the distortions of space-time.

Summary: dark energy creates these distortions of space-time so that entangled particles move instantaneously from one point in the Universe to others

The equation of General Relativity relates matter and space-time.

An equation would have to be found to relate dark energy and space-time.

Perfect hypothesis to explain this first equation that would originate all the Physics we know: the Maldacena conjecture, which relates General Relativity and Quantum Mechanics through a duality between an anti-de Sitter space and conformal field theories.

Based on this partnership, we could explore two ways of working:

-Quantum entanglement produces distortions of space-time

-Distortions of space-time produce quantum entanglement

Possible cause-effect dynamics of this duality:

-A quantum mind and/or a Universe with a high degree of complexity, capable of originating states of entanglement, produces distortions of space-time

These distortions of space-time would be capable of producing entanglements in the new Big Bang of ordinary matter.

-That complexity may come from a previous Universe, collapsed into a black hole of finite radius and, therefore, not infinitesimal.

That complexity/solutions would have been etched into the event horizon of that black hole.

Conclusion: the information recorded in the black hole of a previous Universe is the origin of the space-time present in our Universe, which governs all existing interactions between ordinary matter.

 

15.Artificial nuclear fusion vs solar fusion and quantum tunneling

Current theory

Nuclear fusion in the Sun is carried out only by overcoming electromagnetic repulsion with millions of degrees of temperature and immense pressure.

Astrophysicists know that this is not enough and that in the final stages only protons manage to enter the nucleus of hydrogen atoms via tunneling.

Currently, nuclear fusion experiments being carried out around the world are based on the first concept cited.

 

My Vision

We should try to cause tunneling so we don't have to reach those extreme temperatures.

How to cause this quantum tunneling? via torsion of space - time / Alcubierre metric

For there to be torsion of space-time, we should create in the plasma the conditions of entanglement → we must "call dark energy".

Dark energy helps the proton outside the nucleus, via Alcubierre metric conditions, pass the potential barrier and enter the hydrogen nucleus.

Summary: the nuclear fusion that occurs in the Sun and other stars takes place because the protons that want to enter the nucleus "ask for help" from a primordial black hole that was formed long before the Sun.

 

16.Alternative to  the accepted nucleogenesis of the Big Bang

Human beings, science, could create quiet Big Bangs by resorting, for the different stages of nucleogenesis of the elements, to "aids" of dark energy / primordial black holes / distortions of space-time

 

17.Levitation explained

First explanation: if we are in conditions of superconductivity, levitation will appear:

Meissner effect

The Meissner effect, also called the Meissner-Ochsenfeld effect, consists of the total disappearance of the magnetic field flux inside a superconducting material below its critical temperature. It was discovered by Walter Meissner and Robert Ochsenfeld in 1933 by measuring the flux distribution on the outside of samples of lead and tin cooled below their critical temperature in the presence of a magnetic field.

Meissner and Ochsenfeld found that the magnetic field is completely canceled inside the superconducting material and that magnetic field lines are ejected from the interior of the material, so that it behaves like a perfect diamagnetic material. The Meissner effect is one of the defining properties of superconductivity, and its discovery served to deduce that the appearance of superconductivity is a phase transition to a different state.

The expulsion of the magnetic field of the superconducting material allows the formation of curious effects, such as the levitation of a magnet on a superconducting material at low temperature shown in the figure.

https://es.wikipedia.org/wiki/Efecto_Meissner

 

My Vision

We overcome the force of gravity by resorting to dark energy, which is antigravity.

This antigravity is caused by distortions of space-time originating in the dark Universe.

The dark universe communicates with ours by making entanglements between particles and providing anti-gravitational energy to lift things, such as the dust particles in the room in the film "Interstellar".

 

18.Alternative explanation of how to extract energy from vacuum

The option would be to go directly to the vacuum, by a method similar to the Casimir effect, but taking advantage of the antigravitational effect generated by the Alcubierre metric to remove from the vacuum one particle of each pair generated in these fluctuations and bring it into the space-time of our Universe.

More powerful option: harness dark energy / space-time distortion to pass electrons of any chemical element from a stable orbit to outer orbits → the electron would move from orbit 1 (space A) to orbit 3 (space B) by a distortion of space-time produced by our "call" to dark energy.

Then, as always, that electron, being unstable in an excited orbit, would return to the original orbit, emitting electromagnetic energy (photons)... that would have come out of the vacuum.

Final note. If we do this for many electrons, from different atoms of the same element, we can get all the electrons to be synchronized and to drop all at once to their stable orbits → we would be able to reproduce the laser effect (much more energy, generated by synchronized electromagnetic waves)... and everything would have come out of the vacuum.

 

19.Magnetic resonance, entanglement and state of consciousness in our brain

According to a BBC report on the work of David López, Doctor of Neurosciences, and his team, the signals returned by the body of a patient having an MRI are stronger when the patient is awake than when the patient falls asleep.

The "quantum brain," the bold theory that may help solve the mystery of how human consciousness arises:

https://www.bbc.com/mundo/noticias-64065872

David affirms that during the Resonance the protons are entangled "because there is a function that is mediating that entanglement, and that function that acts as mediator is consciousness".

He continues: "We can't measure it directly, but we measure protons."

The scientist explained to BBC Mundo that "quantum gravity is a purely theoretical world that has not yet been explained experimentally, which wants to unite two theories that a priori are not compatible (quantum mechanics and the theory of relativity). For this they have created the figure of the graviton, whose nature is not known but which would be the bridge between the two theories."

 

My Vision

This experiment indicates a key fact: consciousness influences the spin state of the protons of the nuclei of the water molecules of our body, which means that there is a direct and demonstrable mind-body connection, through the result of the Resonance.

 It is not necessary to resort to the existence of the graviton to explain this relationship between consciousness and entanglement.

If this relationship exists, and following the basic argument of this document, this will mean that consciousness is the one that "calls" the dark Universe, which creates the distortion of space-time that causes the entanglement of the protons of the water molecules of the body, whose decay originates the signals detected in the Resonances.

More enigmatic conclusion: is it consciousness that controls the dark energy of the Universe of the quantum vacuum?

According to this approach, we should think of an equation of the Big Bang of our Universe in the following terms:

Left side of the equation: consciousness = right side: strong distortions of space-time

Next equation: the General Theory of Relativity → space-time tells matter how to move (to paraphrase John Wheeler)

Both Universes will be communicated through gravitational waves (as proposed by physicist Kip Thorne in Interstellar)

The effects on our Universe would be antigravitational forces created by the dark Universe.

How would this mind-cell connection of our body be realized?

 

My Vision

Consciousness, under certain conditions, creates a state of entanglement of all neuropeptides in our body.

For this, it has been possible to activate the dark Universe, which through the distortion of space-time, is ultimately responsible for this entanglement.

Consciousness reaches a state of entanglement only with certain thoughts, for example, when we are in a state of inspiration, intense meditation, maximum urgency, etc.

 

20.Journeys to the past

My vision

Quantum states, according to Quantum Mechanics, are not lost.

Each quantum state is a world that has existed.

The worlds originated by each of these key thoughts are still there, in the brain, in the memory.

Under this prism, an "effect" world (a concrete thought) can perfectly exist before the world that causes it.

This would mean you can travel through time: through an "effect" thought I can investigate the causes that made it possible → I see the effect and investigate the possible causes → I think → Alcubierre effect → instantaneous travel between different quantum states that have occurred → travel back in time.

We do not travel to the past as people but as quantum states that have occurred (and therefore have been recorded in our brain) in our past.

 

21.Black holes grow not only by the matter they absorb, but above all by the dark energy of the vacuum  

Black holes grow by the dark energy they pull out of the vacuum.

They discover the ultimate source of dark energy, black holes:

https://ecoosfera.com/sci-innovacion/energia-oscura-fuente-agujeros-negros/?utm_content=cmp-true

Until now it was believed that black holes grew constantly thanks to the immense amounts of matter they devour due to their gravitational fields, but the observations of Farrah and Croker suggest that it is actually dark energy that generates this growth. According to their conclusions, black holes contain vacuum energy and are coupled to the expansion of the Universe, increasing in mass as the cosmos expands. According to this new research, the measured amount of dark energy in the cosmos can be explained by the vacuum energy accumulated in black holes.

"This measurement, which explains why the Universe is accelerating now, offers a beautiful insight into the real force of Einstein's gravity."

 

My Vision

Impressive discovery, made from observations of supermassive black holes living within a group of elliptical galaxies in the early Universe.

It only remains for me to add that there is dark energy in all its splendor in those primordial black holes.


ANNEXES

Annex 1. Schrödinger’s wave equation, applied to the case of a quantum tunnel.

Blog: "The tunnel effect in detail"

https://tunelcuantico.home.blog/2019/02/16/el-efecto-tunel-a-detalle/

Schrödinger’s wave equation (time-independent, one dimension):

−(ħ²/2m) d²ψ/dx² + V(x) ψ = E ψ

Schrödinger wave equation for each of the three regions (barrier of height V₀ between x = 0 and x = L, with E < V₀):

Region I (x < 0, V = 0):   d²ψ/dx² + k²ψ = 0,  with k² = 2mE/ħ²

Region II (0 < x < L, V = V₀):   d²ψ/dx² − κ²ψ = 0,  with κ² = 2m(V₀ − E)/ħ²

Region III (x > L, V = 0):   d²ψ/dx² + k²ψ = 0

Solutions of the wave functions in regions I and III:

ψ_I(x) = A e^{ikx} + B e^{−ikx}    ψ_III(x) = F e^{ikx}

It is an oscillatory expression: they are waves.

Solution in Region II:

ψ_II(x) = C e^{κx} + D e^{−κx}

It is not an oscillatory expression.
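From the Region II solution one gets, for a wide barrier, the familiar estimate T ≈ e^(−2κL) for the tunneling probability. A minimal sketch in Python, with barrier values assumed purely for illustration:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
EV = 1.602176634e-19    # 1 eV in joules
M_E = 9.1093837015e-31  # electron mass, kg

def transmission(E_eV, V0_eV, L_m, m_kg=M_E):
    """Wide-barrier estimate T ~ exp(-2*kappa*L) for E < V0,
    with kappa = sqrt(2m(V0 - E))/hbar, as in Region II above."""
    kappa = math.sqrt(2 * m_kg * (V0_eV - E_eV) * EV) / HBAR
    return math.exp(-2 * kappa * L_m)

# Illustrative case: 1 eV electron, 10 eV barrier, 0.5 nm wide
T = transmission(1.0, 10.0, 0.5e-9)   # small but non-zero probability
```

Because the width appears in the exponent, doubling the barrier width squares the suppression, which is why tunneling rates are so sensitive to distance.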


Annex 2. Wave-particle duality, Louis de Broglie's postulate and wavelengths of an electron and a baseball

See post "Wave-particle duality.”

https://es.wikipedia.org/wiki/Dualidad_onda_corp%C3%BAsculo

Wave-particle duality is a quantum phenomenon, well proven empirically, by which many particles can exhibit typical wave behaviors in some experiments while appearing as compact, localized particles in other experiments. This dual behavior is typical of quantum-mechanical objects, which can present very localized interactions and, as waves, exhibit the phenomenon of interference.

According to classical physics there are clear differences between wave and particle. A particle has a definite position in space and has mass while a wave extends in space characterized by having a definite velocity and zero mass.

Wave-particle duality is currently considered to be a "concept of quantum mechanics according to which there are no fundamental differences between particles and waves: particles can behave like waves and vice versa."  (Stephen Hawking, 2001)

This is a fact proven experimentally on multiple occasions. It was introduced by Louis-Victor de Broglie, a French physicist of the early twentieth century. In 1924, in his doctoral thesis, inspired by experiments on electron diffraction, he proposed the existence of matter waves, that is to say, that all matter had a wave associated with it. This revolutionary idea, based on the analogy that radiation had an associated particle (a property already demonstrated at the time), did not arouse great interest, despite the correctness of its proposals, since there was no experimental evidence for it yet. However, Einstein recognized its importance, and five years later, in 1929, de Broglie received the Nobel Prize in Physics for his work.

His work said that the wavelength of the wave associated with matter was:

λ = h / p = h / (m·v)

Wavelengths of the electron and a baseball:

http://hyperphysics.phy-astr.gsu.edu/hbasees/debrog.html


 

If you explore the wavelength values of ordinary macroscopic objects like baseballs, you will find that their de Broglie wavelengths are ridiculously small. Comparing the powers of ten shows that the wavelengths of ordinary objects are much smaller than a nucleus. The implication is that for ordinary objects no evidence of their wave nature will ever be seen, and for all practical purposes they can be regarded as particles.
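The comparison can be reproduced in a few lines; a sketch assuming illustrative values (10⁶ m/s for the electron, a 0.145 kg baseball at 40 m/s), not the exact inputs of the hyperphysics page:

```python
H = 6.62607015e-34  # Planck constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    """De Broglie relation: lambda = h / (m * v)."""
    return H / (mass_kg * speed_m_s)

# Electron at an assumed speed of 1e6 m/s -> ~7e-10 m, atomic scale
lam_electron = de_broglie_wavelength(9.1093837015e-31, 1.0e6)

# Baseball, 0.145 kg at 40 m/s -> ~1e-34 m, far smaller than a nucleus (~1e-15 m)
lam_baseball = de_broglie_wavelength(0.145, 40.0)
```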

 

Annex 3. Heisenberg's uncertainty principle for energy-time

See post: "Heisenberg's indeterminacy relation":

https://es.wikipedia.org/wiki/Relaci%C3%B3n_de_indeterminaci%C3%B3n_de_Heisenberg

ΔE · Δt ≥ ħ/2

This form is used in quantum mechanics to explore the consequences of the formation of virtual particles, used to study the intermediate states of an interaction. This form of the indeterminacy principle is also used to study the concept of vacuum energy.
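As a worked example of this form of the principle, the relation ΔE·Δt ≥ ħ/2 bounds how long a vacuum fluctuation can "borrow" a given energy; a sketch, assuming the fluctuation is an electron-positron pair:

```python
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
C = 2.99792458e8         # speed of light, m/s

# Energy borrowed to create an electron-positron pair: 2 * m_e * c^2
delta_E = 2 * M_E * C**2          # ~1.6e-13 J (about 1 MeV)

# Maximum lifetime allowed by Delta_E * Delta_t >= hbar / 2
delta_t = HBAR / (2 * delta_E)    # ~3e-22 s
```

The heavier the virtual pair, the larger ΔE and the shorter the time the fluctuation can exist.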

 

Annex 4. Double-slit experiment

View PDF: Sanchez-Jesus Double slit experiment interpretation:

 https://www.gsjournal.net/Science-Journals/Research%20Papers-Mechanics%20/%20Electrodynamics/Download/748

Interpretation of the double-slit experiment

For this interpretation, we will consider particles as the vehicles (transmitters) of energy, but these particles are undetectable in themselves (they cannot interact directly with anything). The interactions are not caused by the particle itself but by the force carriers it emits (usually virtual photons). When a detector detects an electron, it doesn't really "touch" it, or even "see" it, but interacts with its electromagnetic field (with its virtual photons). The electron is continuously emitting a cloud of virtual photons surrounding it. The distribution of these photons is defined by a wave function. The detector does not interact with the electron itself, but with its virtual photon cloud.

When the electron passes through one slit, its entire photon cloud passes through the two slits, so it interferes with itself and is distributed according to the interference pattern. For one of these photons to interact with the screen, the energy of the electron must be used at that point – if that energy is not used there would simply be no interaction. If, once an interaction is made by a photon with the screen, another photon tried to interact with the screen again, there would be no interaction, since there would be no more energy to allow that interaction to occur.

So, every time we send an electron, the screen will show only one point, but the total pattern will be defined by the possibility of interaction (that is, by the distribution of the photon cloud once it has passed through the two slits, that is, by the interference pattern). This means that the electron, the undetectable particle (but transmitter of energy), could pass only through one slit – or maybe not, we don't really care – but its chances of being detected actually go through both slits (in the form of a cloud of photons that interferes with itself).


Annex 5. Feynman road integral

How our reality can be the sum of all possible realities:

https://culturacientifica.com/2023/04/04/integral-de-caminos/

The equation, though adorning the pages of thousands of physics publications, is more of a philosophy than a rigorous recipe. It suggests that our reality is a kind of mixture, a sum, of all imaginable possibilities. But it doesn't tell researchers exactly how to carry out the sum. So the scientific community has spent decades developing an arsenal of approximation strategies to construct and compute the integral for different quantum systems.

Approximations work well enough that intrepid physicists like Loll now search for the ultimate path integral: one that combines every conceivable form of space and time and produces a universe shaped like ours as a net result. But in this quest to prove that reality is in fact the sum of all possible realities, they face profound confusion about what possibilities should go into the sum.

 

Annex 6. Types of virtual particles

See entry: "Virtual Particle"

https://es.wikipedia.org/wiki/Part%C3%ADcula_virtual

Virtual bosons

In the Standard Model, the fundamental forces are transmitted by the gauge bosons. When these bosons transmit forces they are virtual particles, and they are created in a vacuum. Even in the most perfect vacuum, whether it is the one created in a laboratory, intergalactic space, or the interatomic vacuum, gauge bosons with an extremely brief existence are continuously created. Quantum mechanics predicts that vacuum energy can never become zero. The lowest possible energy of the vacuum is called zero-point energy, and it is precisely this low (though not zero) energy that is attributed to virtual particles. This model of the vacuum is called a quantum vacuum.

The transmission of forces between the different charges of each interaction is described by quantum field theory, which describes how virtual gauge bosons are transmitted through the polarized vacuum between the real charges. Some of these bosons also occur as real particles in different phenomena.

But a question still to be resolved is whether all the massless gauge bosons that exist, including those listed above as real, are after all virtual. These particles move at the speed of light, and therefore, according to Albert Einstein's theory of relativity, the time it takes them to propagate between any two points in the universe is instantaneous from the point of view of the particles. So, the emission and absorption being instantaneous, would they be virtual?

Virtual particle-antiparticle pairs

Not only gauge bosons arise in the quantum vacuum, but also particle-antiparticle pairs, such as electron-positron pairs or up quark-anti-up quark pairs, etc.

A particle with its antiparticle must always be created, thus preserving the leptonic or baryonic number  (two quantum numbers) of the universe. The particles that arise in this way are virtual because as soon as they appear, they have so little energy that they instantly annihilate each other.

These virtual pairs are used as an explanatory scheme to justify that the zero-point energy of the vacuum is not strictly zero.  In addition, Hawking radiation can receive an intuitive explanation in terms of the creation of these virtual particle-antiparticle pairs.

 

Annex 7. Loop quantum gravity

What is Loop Quantum Gravity?:

https://www.curiosamente.com/videos/que-es-la-gravedad-cuantica-de-bucles

For Quantum Loop Theory, formulated by scientists such as John Baez, Carlo Rovelli and Lee Smolin, space is not continuous, that is, it cannot be divided infinitely: there is a minimum unit of distance. Space-time is "grainy". Think about your TV screen or mobile phone. You can see how a point of light moves from side to side seemingly continuously. If you get close enough you can notice that the screen is divided into tens of thousands of squares that form the image. These squares are called "pixels": they are the minimum unit of the image: they cannot be subdivided further. And a moving point of light can be on this pixel, or on the adjacent pixel, but it can't move "half a pixel."

The proposal of Loop Quantum Gravity is that space is also like this: pixelized. Or, more properly, "quantized", in the same way that energy can only be transferred in packets called "quanta". Not only matter and energy, but space itself has an atomic structure. The minimum distance is called the "Planck length"; it is many orders of magnitude smaller than an electron, and nothing can move over smaller distances.
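The "minimum distance" mentioned above is the Planck length, l_P = √(ħG/c³). A quick numerical check of the scale (the comparison with the classical electron radius is my own choice of yardstick, for illustration):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 / (kg * s^2)
C = 2.99792458e8        # speed of light, m/s

# Planck length: the "pixel size" of space in the text's analogy
l_planck = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m

# Classical electron radius, used here only as a sense of scale
r_e = 2.8179403262e-15                  # m
ratio = r_e / l_planck                  # the electron radius spans ~1e20 Planck lengths
```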

How is it structured?

The idea is that space-time is structured in networks of tiny curls or loops connected to each other. These networks are called "spin networks", and are studied by a branch of mathematics called "graph theory", which deals with calculating the possible ways in which the vertices and edges of the network are related. A spin lattice represents the quantum state of a gravitational field. And it is not fixed: it is in constant flux.

A purely speculative hypothesis says that subatomic particles could be "knots" or "braids" within the spin lattice: one such braid could be an electron, another a positron, others an electron neutrino or an anti-neutrino. And the warping of space-time that manifests as gravity on planetary or galactic scales starts here, on the smallest possible scale. The universe would be an impressively complicated spin web.

The old idea of space and time as a stage where things happen no longer applies. A spin network is not in time and space: it is space-time itself.

 

Annex 8. Vacuum energy. The greatest discordance in the history of science

Create the vacuum?:

http://www.javierdelucas.es/vaciomedir.htm

Astronomical measurements based on the motion of the solar system and especially of distant galaxies have resulted in a maximum value for the cosmological constant:

|Λ| < 10⁻⁵⁶ cm⁻²

This maximum value implies that the energy density of the vacuum has to be less than 10⁻⁹ erg/cm³. Next, let's see what the theoretical estimates tell us. If we try to express the vacuum energy in Planck units, which constitute the fundamental system of units in quantum mechanics, we obtain:

E_Planck = (ħc⁵/G)^(1/2) ≈ 10¹⁹ GeV

Then we have that the energy density of the vacuum would be:

ρ_vac ≈ (10¹⁹ GeV)⁴ = 10⁷⁶ GeV⁴ ≈ 10¹¹⁴ erg/cm³

This is an immense amount of energy! The discrepancy is therefore 123 orders of magnitude. This value is of an inconceivable magnitude for the human brain. That is why it is said that this theoretical estimate constitutes the largest discordance between theory and experiment in the history of science. 
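The 123-order discrepancy can be checked numerically; a minimal sketch in CGS units, taking one Planck energy per Planck volume as the naive theoretical density:

```python
import math

HBAR = 1.054571817e-27   # reduced Planck constant, erg * s
C = 2.99792458e10        # speed of light, cm / s
G = 6.67430e-8           # gravitational constant, cm^3 / (g * s^2)
GEV = 1.602176634e-3     # 1 GeV in erg

# Planck energy E_P = sqrt(hbar * c^5 / G), the ~10^19 GeV quoted above
E_planck = math.sqrt(HBAR * C**5 / G)       # erg
E_planck_gev = E_planck / GEV               # ~1.22e19 GeV

# Naive vacuum density: one Planck energy per Planck volume -> ~10^114 erg/cm^3
l_planck = math.sqrt(HBAR * G / C**3)       # Planck length, cm
rho_theory = E_planck / l_planck**3         # erg / cm^3

# Observed upper bound quoted in the text
rho_observed = 1e-9                         # erg / cm^3
orders = math.log10(rho_theory / rho_observed)   # ~123 orders of magnitude
```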

The calculation of the vacuum energy of the QED

QED (Quantum Electrodynamics) is the simplest but at the same time most successful theory that allows us to apply the principles of quantum mechanics and special relativity to electromagnetism. To calculate the vacuum energy in QED we must quantize the electromagnetic field. When quantizing we obtain the expression:

ρ_vac = E/V = (1/V) Σ_k ½ħω_k = [ħ/(2π²c³)] ∫₀^ω_max ω³ dω = [ħ/(8π²c³)] ω_max⁴

This expression leads us to the famous analogy between the electromagnetic field and a quantum harmonic oscillator. In this way the zero-point energy will be the sum of the zero-point energy of each harmonic oscillator.

ω_max is a parameter called the cut-off frequency: roughly speaking, the value beyond which the contribution of high-frequency modes is considered negligible. The value to use for ω_max is under discussion, and the estimate of ρ_vac depends on the chosen value. A reasonable value for ω_max would be one at which electromagnetism ceases to exist as such and unifies with the weak force, that is, the energy at which electroweak symmetry is restored, which is of the order of 100 GeV. With this value we get:

ρ_vac = (100 GeV)⁴ ≈ 10⁴⁶ erg/cm³ (55 orders of magnitude greater than the experimental value).

The calculation of electroweak vacuum energy

In electroweak theory, the energy acquired by particles and quantum fields when symmetry is broken is proportional to the vacuum expectation value of the Higgs field. The Higgs potential is:

V(φ) = V₀ − μ²φ² + gφ⁴

Where g is the Higgs self-coupling constant. This potential is minimal for

φ² = μ²/(2g), and therefore V(φ) = V₀ − μ⁴/(4g)

Considering that V(φ) vanishes for φ = 0, we have:

ρ_vac = −μ⁴/(4g) = −g·v⁴ = −(105 GeV)⁴ ≈ −10⁴³ erg/cm³ (52 orders of magnitude greater than the experimental value)

The calculation of the vacuum energy of QCD

QCD (Quantum Chromodynamics) is the quantum theory that is used when we take into account the strong nuclear force, that is, when we study the interior of the atomic nucleus. In QCD there is a characteristic energy scale called Λ_QCD, which is the scale at which chiral symmetry is restored and the quark-gluon condensate of the quantum vacuum disappears; for this reason the vacuum energy in QCD is usually taken to be a prefactor times a power of Λ_QCD. The estimate then gives:

ρ_vac = (10⁻³ or 10⁻²)⁴ ≈ 10³⁵ or 10³⁶ erg/cm³ (44 or 45 orders of magnitude greater than the experimental value)

The calculation of the cosmological constant according to General Relativity

If we take gravity into account, the problem becomes even more difficult, some will say almost impossible to solve. The gravitational field "creates" particles in a way equivalent to an accelerated reference frame. The Unruh effect is based on this phenomenon, so that a detector accelerating in empty space will detect particles continuously. However, there is good news: experiments tell us that when gravity is weak, for example on Earth, the calculations of our quantum theories are correct, and therefore we can disregard the contributions of gravity to vacuum energy.

Possible solutions to the problem

As we have seen, the contributions of known fields to vacuum energy are enormous, many orders of magnitude above the experimentally observed value. Listed below are 4 possible solutions to what is considered by many to be the biggest problem in physics:

1º) The existence of new fields and particles that cancel out the enormous excess of estimated energy

Many physicists think that there must be new particles and new quantum fields above the explored range of energies that would contribute to the energy of the vacuum with opposite sign and that could cancel out the immense energy density that our theories predict. Supersymmetry is one of the favorite candidates, however, because supersymmetry is broken at low energies this cancellation would be far from accurate, so the problem persists. The problem is that it is very difficult for a theoretical model to produce an adjustment as immensely accurate as that required. The adjustment would have to cancel out the excess to an accuracy of at least 56 decimal places!

2º) Make a modification of our quantum theories

No one knows how to do this, and they have had unprecedented experimental success.

3º) Make a modification of general relativity

This has the same drawback as the previous one.

4º) Consider that the vacuum does not have any energy density

This solution seems impossible; however, it deserves to be taken into consideration: there is no quantum experiment that can measure this energy, since we always measure energy differences. In addition, all experiments considered to be due to vacuum energy (Casimir effect, Lamb shift of the hydrogen atom, etc.) can be explained as fluctuations of the material objects of the experiment (see for example Schwinger's Source Theory). To consider that the vacuum is the state with 0 energy and 0 momentum would solve at a stroke the problem of the cosmological constant, whose value is almost zero. Of course, the possible implications of imposing such a condition on current theories should be studied.

If this were correct, the vacuum would be the first known physical entity that has no energy or momentum and therefore could be "created" in infinite quantity without a net contribution of energy.

 

Annex 9. Einstein's Equation of General Relativity

The Vacuum Catastrophe:

https://es.resonancescience.org/blog/la-catastrofe-del-vacio-2

In technical terms, Einstein's field equations are a set of equations (specifically, nonlinear partial differential equations) that can be expressed in a synthesized way as a single equation:

R_μν − ½ R g_μν + Λ g_μν = (8πG/c⁴) T_μν

where the first subscript μ (mu in Greek) represents the coordinates of spacetime and the second subscript ν (nu in Greek) represents the coordinates of momentum (i.e. the change of space-time coordinates – in simple terms, position – with respect to time). G is the gravitational constant, c is the speed of light, R_μν is called the Ricci curvature tensor, g_μν is called the metric tensor, R is the scalar curvature, and T_μν is called the stress-energy tensor. This equation includes the constant Λ, known as the cosmological constant, to account for an additional source of energy. Λ represents an additional force of expansion (dark energy). The figure a little further down shows the terms of the above equation and their meaning.

The existence of dark energy and dark matter was deduced so that Einstein's field equations could correctly predict the expansion of the universe and the rotation speed of galaxies. On this view, dark energy is the source of an expansive force in the universe (this is what explains the Hubble constant in mainstream theories), while dark matter provides the additional gravity needed to stabilize galaxies and galaxy clusters, since there is not enough ordinary mass to hold them together given the accelerated expansion of the Universe. This extra gravity would also explain the rotational speed of galaxies.


Broadly speaking, the left-hand side of the equation expresses the geometric deformation of spacetime produced by the energy-mass content on the right-hand side. This warping of space also explains the gravitational waves detected by LIGO in 2015, emanating from the merger of two black holes.

As physicist John Wheeler put it, "space-time tells matter how to move; matter tells space-time how to curve."

 

Annex 10. Dirac equation

https://significado.com/ecuacion-de-dirac/

It applies the quantization rules to a four-component vector wave function.

Quantization rules lead to operations with derivatives that normally act on a scalar wave function, but since the constants α and β are 4×4 matrices, the differential operators act on a four-component vector Ψ, which was later called a spinor.

If a system of units is chosen in which ħ = c = 1, the Dirac equation is written as follows:

iγ^μ ∂_μ ψ − m ψ = 0

In the above equation a sum over the index μ from 0 to 3 is implied and, of course, "i" is the imaginary unit, since it is an equation in a complex variable.

This equation is usually compacted further by writing the symbol ∂ crossed by a slash (Feynman's slash notation) for γ^μ ∂_μ, so it reads:

(i∂̸ − m)ψ = 0
That expression is what has come to be known as the "equation of love".

The solutions of the Dirac equation and electron spin

The Dirac equation is an equation of eigenvalues corresponding to the possible energies. There are two solutions with positive energy, one for each spin state, and two solutions with negative energy, also for each of the two possible spin states.

It is noteworthy that spin, in the Dirac equation, appears naturally, as a result of its possible solutions and as a direct consequence of taking into account the relativistic energy.

Thus, for the first time in physics, it was realized that spin, an intrinsic property of the electron and other elementary particles, is a consequence of relativity. This property of the electron had in fact been observed before Dirac formulated his equation, thanks to the famous Stern–Gerlach experiment of 1922.
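The appearance of spin and of the paired positive and negative energies can be checked numerically. The sketch below builds the gamma matrices in the standard Dirac representation, verifies the Clifford algebra, and diagonalizes the free-particle Hamiltonian for an arbitrary, purely illustrative mass and momentum:

```python
import numpy as np

# Dirac representation of the gamma matrices (hbar = c = 1),
# built from the 2x2 Pauli matrices.
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def block(a, b, c, d):
    return np.block([[a, b], [c, d]])

g0 = block(I2, 0 * I2, 0 * I2, -I2)
gammas = [g0] + [block(0 * I2, s, -s, 0 * I2) for s in (sx, sy, sz)]
eta = np.diag([1, -1, -1, -1])

# Clifford algebra: {gamma_mu, gamma_nu} = 2 * eta_mu_nu * I
for mu in range(4):
    for nu in range(4):
        anti = gammas[mu] @ gammas[nu] + gammas[nu] @ gammas[mu]
        assert np.allclose(anti, 2 * eta[mu, nu] * np.eye(4))

# Free Dirac Hamiltonian H = alpha.p + beta*m (momentum along z;
# the values of m and p are hypothetical, chosen only for illustration).
m, p = 1.0, 0.5
alpha = [g0 @ g for g in gammas[1:]]
H = p * alpha[2] + m * g0
E = np.sort(np.linalg.eigvalsh(H))
print(E)  # two negative and two positive eigenvalues, +-sqrt(p^2 + m^2)
```

The two-fold degeneracy of each energy is exactly the pair of spin states the text describes, and the negative pair is the origin of the Dirac Sea picture.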

The Dirac equation predicts the existence of antimatter.

Dirac was incredibly brilliant at having obtained his equation, ingeniously applying mathematics, and it is also remarkable the way he interpreted his solutions.

At first, it was not clear to Dirac whether there were electrons with negative kinetic energy. He then theorized the following:

The vacuum (the absence of electrons) is not such but is filled with electrons with negative energy in their two spin states.

What happens is that scientists have no way of seeing these electrons, just as fish in the sea normally go unseen; hence the name Dirac Sea.

Now, if a photon can deliver enough energy to one of the electrons in that sea, then it will be visible, appearing out of nowhere.

But the vacant space left in the Dirac Sea behaves as a positively charged hole, that is, a particle with the same mass as the electron and an equal but positive charge, called a positron.

Shortly after Dirac's prediction, in 1932, Carl D. Anderson detected the positron experimentally.

 

Annex 11. Alcubierre metric

Alcubierre Metric:

https://es.wikipedia.org/wiki/M%C3%A9trica_de_Alcubierre


Graph of the Alcubierre drive, showing the opposed contracted and expanded regions of space-time relative to the central sector containing the flat deformation bubble.

 

The Alcubierre metric is a speculative idea based on a mathematical model that would make travel at speeds greater than c (the speed of light), that is, superluminal travel, possible. Alcubierre proposed the metric that bears his name as a solution to Einstein's equations within the framework of the General Theory of Relativity.

It was published in the scientific journal Classical and Quantum Gravity in 1994 by the Mexican physicist Miguel Alcubierre.[1]


General idea (the warp drive)

One of the most striking conclusions of Alcubierre's metric is the possibility of faster-than-light travel by creating a flat deformation bubble within which the ship would remain stationary: behind the ship, space-time would be expanded, pushing the point of departure many light-years back, while in front of the ship space-time would be contracted, bringing the destination point much closer, all without space and time inside the flat bubble containing the ship being noticeably modified.

In such a case the ship would, by analogy, "surf" on a kind of space-time wave inside the "flat deformation bubble", which is flat because it remains stable between the two distortions (the one in front and the one behind) caused in space-time (a local distortion of space-time would be created).

There would be enormous tidal forces in the  peripheral region of the supposed bubble due to the curvatures caused in space-time; however, such forces would be negligible inside the bubble given the flat character that space-time would have there (see graph).

No physical law foreseen by the theory of relativity would be violated, since within the "deformation bubble" nothing would exceed the speed of light; the ship would not move within the bubble but would be carried by it, and inside the bubble it would never travel faster than a beam of light.

The ship and its presumed crew would be spared the devastating effects of acceleration (with its enormous g-forces), of deceleration, and of relativistic effects such as Lorentz contraction and time dilation at high speeds. Alcubierre showed that even while the spacecraft accelerates, it travels along a free-fall geodesic.

However, the fact that the warp bubble allows superluminal travel rests on the possibility that space-time itself, in which light travels, can exceed the speed of light. The Theory of Relativity forbids objects from travelling faster than light through space-time, but it is unknown at what maximum speed space-time itself can move; it is hypothesized that almost at the initial instant of the Big Bang our universe underwent superluminal exponential expansion (see Inflationary universe), and it is also believed that some very distant quasars recede at transluminal speeds.
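The idea that space itself can recede superluminally can be illustrated with Hubble's law; the sketch below (assuming a round value of 70 km/s/Mpc for the Hubble constant) estimates the distance beyond which recession speeds exceed c:

```python
# Recession speed under Hubble's law, v = H0 * d. Beyond the Hubble
# distance c/H0, galaxies recede faster than light: space itself
# expands, while no object moves *through* space at v > c.
H0 = 70e3 / 3.0857e22   # Hubble constant ~70 km/s/Mpc, converted to 1/s
c = 2.99792458e8        # speed of light, m/s
ly = 9.4607e15          # metres per light-year

hubble_distance_gly = c / H0 / ly / 1e9
print(hubble_distance_gly)  # ~14 billion light-years
```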

Here another analogy helps: there is a maximum speed at which an object can walk on the ground, but what happens if the floor itself is mobile (a conveyor belt, say) and moves faster than the walker? This involves a change in the coordinate system used as a reference to measure velocity. If the coordinate system moves in the direction of displacement with respect to a second reference frame (which would have to be external to space-time itself), the object could increase its velocity indefinitely with respect to that second frame. What the analogy asks is whether it would be possible to "ride on a ray of light".

To create a device such as the deformation bubble that produces the warp drive, Alcubierre explains, it would be necessary to operate with matter of negative density, or exotic matter, creating with it a bubble of negative energy that would encompass the ship (see Dirac, negative energy). According to Alcubierre, the amount of negative energy would be proportional to the speed of propagation of the warp bubble, and the distribution of negative energy would be concentrated in a toroidal region perpendicular to the direction in which the flat bubble moves (see illustration).

In this way, since the energy density would be negative, one could travel faster than light thanks to the effect caused by exotic matter. The existence of exotic matter is not ruled out; indeed, the Casimir effect seems to support it. However, producing enough exotic matter and conserving it to perform a feat such as superluminal travel poses the same currently unsolvable problems as keeping a wormhole stable. On the other hand, in General Relativity one normally specifies a feasible distribution of matter and energy first and then finds the associated space-time geometry; while it is possible to work Einstein's equations the other way round, first specifying a metric and then finding the energy-momentum tensor associated with it (which is what Alcubierre did), this practice means the solution may violate several energy conditions and require exotic matter.

Robert J. Low proved in 1999 that, within the context of general relativity and in the absence of exotic matter, it is not possible to construct a warp bubble.[2] A coherent theory of quantum gravity may help resolve these questions.

In addition, Alexey Bobrick and Gianni Martire claim that, in principle, a class of subluminal, spherically symmetric warp-drive space-times can be constructed from physical principles currently known to mankind, such as positive energy.[3]

Form of the metric

The Alcubierre metric can be written (in units with c = 1):

ds² = −dt² + [dx − v_s f(r_s) dt]² + dy² + dz²

where v_s(t) is the speed of the bubble's centre along x and f(r_s) is a shape function equal to 1 inside the bubble and 0 far outside it. The energy density measured by Eulerian observers for this metric is:

T^μν u_μ u_ν = −(1/8π) · (v_s² (y² + z²) / 4 r_s²) · (df/dr_s)²

Thus the energy density is negative, and exotic matter is therefore required to cause these deformations of space-time.[4]
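A small numerical sketch of the "top-hat" shape function from Alcubierre's 1994 paper shows why the required energy density cannot be made positive; the bubble radius and sharpness values below are arbitrary illustrative choices:

```python
import math

# Alcubierre's shape function:
#   f(r) = [tanh(sigma*(r+R)) - tanh(sigma*(r-R))] / (2*tanh(sigma*R))
# which is ~1 inside the bubble (r < R) and ~0 far outside it.
sigma, R = 8.0, 1.0  # bubble wall sharpness and radius (arbitrary units)

def f(r):
    return (math.tanh(sigma * (r + R)) -
            math.tanh(sigma * (r - R))) / (2 * math.tanh(sigma * R))

def dfdr(r, h=1e-6):
    # numerical derivative of the shape function
    return (f(r + h) - f(r - h)) / (2 * h)

# The Eulerian energy density is proportional to -(df/dr)^2, so it is
# never positive anywhere: exotic (negative-energy) matter is unavoidable.
densities = [-(dfdr(0.1 * k)) ** 2 for k in range(1, 30)]
print(f(0.0), f(3.0))  # ~1 inside the bubble, ~0 far outside
```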

Other denominations

The system proposed by Alcubierre for cosmic travel is called in English "Warp Drive" (the same name used in the Star Trek series); it has been rendered in Spanish variously as deformation drive, driven distortion, torsion drive, warp impulse, curvature drive and even factorial impulse motor. All these names convey the basic principle of this hypothetical method of "superluminal" travel: instead of accelerating an object (say, the ship) to c or close to c, the "fabric" of space-time itself would be warped or curved so that the destination approaches without the ship moving in the usual sense of the term. Rather than moving the ship, in these hypotheses, space-time is moved (curved).


References

    1. Alcubierre, M. (1994). "The warp drive: hyper-fast travel within general relativity". Classical and Quantum Gravity 11 (5): L73-L77.
    2. Low, Robert J. (1999). "Speed Limits in General Relativity". Classical and Quantum Gravity 16: 543-549. See also the eprint version at arXiv. Retrieved June 30, 2005.
    3. Bobrick, Alexey; Martire, Gianni (April 20, 2021). "Introducing physical warp drives". Classical and Quantum Gravity 38 (10): 105009. Bibcode:2021CQGra..38J5009B. ISSN 0264-9381. S2CID 231924903. arXiv:2102.06824. doi:10.1088/1361-6382/abdf6e.
    4. The existence of exotic matter is not theoretically ruled out; the Casimir effect and the accelerating Universe both lend support to the proposed existence of such matter. However, generating enough exotic matter and sustaining it to perform feats such as faster-than-light travel (and also to keep open the "throat" of a wormhole) is thought to be impractical. Low has argued that within the context of general relativity, it is impossible to construct a warp drive in the absence of exotic matter.



 

Annex 12. Key steps of evolution

The 10 most relevant evolutionary steps:

https://tallcute.wordpress.com/2010/07/05/los-10-saltos-evolutivos-mas-complejos/

The evolution of species throughout their history has allowed the appearance of impressive qualities to living beings. In this post I would like to review what I believe are the 10 most relevant changes that have occurred in the history of life on Earth since the first living beings appeared. Evidently these steps were all very gradual and it is difficult to limit them to "one step". The list is ordered in chronological order of appearance starting from the first replicating beings whose specific characteristics we can only speculate today:

1-Fidelity in DNA copying

A bacterium today makes roughly one error per 10^10 bases copied. This balance between mutation and fidelity allows adaptation while preventing the rapid accumulation of large errors that would end the species. The main architect of this evolutionary marvel is DNA polymerase, which alone can faithfully copy several thousand DNA bases before making a mistake. The more advanced versions that later appeared in eukaryotes also have proofreading mechanisms to minimize errors. This mechanism is so essential to life that no living being lacks it; only some viruses, such as HIV, do without it, relying instead on the host's near-perfect cellular machinery.
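As a rough illustration of this fidelity, assuming a per-base error rate of about 1e-10 (the order of magnitude behind the figure above) and a hypothetical bacterial genome of five million base pairs:

```python
# Expected copying errors per replication, for illustrative values:
# a hypothetical 5e6-base-pair genome and a post-proofreading error
# rate of ~1e-10 per base copied.
genome_size = 5e6    # base pairs (illustrative)
error_rate = 1e-10   # errors per base copied

errors_per_replication = genome_size * error_rate
generations_per_error = 1 / errors_per_replication
print(errors_per_replication, generations_per_error)
# ~0.0005 errors per division, i.e. roughly one mutation every 2000 divisions
```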

2-The flagellum


Bacterial flagellum

From waiting for food to arrive to going out after it: this is one of the main changes brought about by the flagellum. Although bacteria had previously developed small filaments (cilia) that allowed some movement, these were totally at the mercy of the forces governing Brownian motion: imagine being inside a pool full of marbles propelling themselves at full speed in all directions. The flagellum also improved the ability to colonize new and distant environments and to escape adverse conditions. You can watch a video about the evolution of the flagellum here, where its appearance is postulated from an organelle originally used for attachment.

2-The photoreceptor

And there was light. The ability to detect light initially meant access to food (the synthesis of many organic compounds is catalyzed by light) and a guide for movement (a defined up and down). However, this small advance would sow the seed for two future mechanisms of great relevance: photosynthesis and vision. Photoreceptors are based on pigments capable of being excited by light and transmitting this excited state to some protein.

3-Photosynthesis

Who needs food when you can make it? This is perhaps the most impressive evolutionary leap: the ability to produce organic compounds from much more abundant inorganics. These reactions require great energy that living beings obtain from heat, degradation of other organic / inorganic compounds or light. You can read more about photosynthesis in this other post I wrote. Photosynthesis could not be possible without photoreceptors that also probably coevolved with the improvement of the flagellum. None of these "houses of cards" would have endured without fidelity in DNA copying.

4-The Krebs cycle and oxidative respiration

Photosynthesis brought with it a new era of problems, or opportunities, depending on how you look at it. The main waste product of photosynthesis is oxygen, a molecule that now seems innocuous to us but whose appearance was like suddenly living in a sea of arsenic. Oxygen can oxidize DNA and proteins, and it interfered with many of the reactions the bacteria of the time needed. The emergence of atmospheric oxygen was probably a rapid process that wiped out most species at a stroke. Some species (including the oxygen producers themselves) developed mechanisms to inactivate oxygen; among them we find the use of electrons and protons that react with oxygen to produce water. Interestingly, electrons can be obtained as waste products of the metabolism of organic compounds. The sophistication of sugar metabolism in the so-called Krebs cycle, together with a complex electron-transport system, made it possible to extract the most from the energy of organic compounds.

5-The eukaryotic cell

The complexity of the appearance of life is the only fact comparable to the appearance of the eukaryotic cell. It has been speculated that eukaryotes come from the symbiosis of several bacterial types, a hypothesis that gains strength with genetic analysis. In any case the appearance of cells with defined nuclei and organelles is a great black box. One of the most interesting evolutionary processes that we have left to decipher. The breakthrough of the eukaryotic cell can be described with something as simple as compartmentalization. Everything in its corner. Many cellular chemical reactions require a very specific environment incompatible with other reactions.

6-Cell specialization

The favorite son. A cell divides in two but does not leave the same in each daughter cell: one contains more waste than the other, a different concentration of proteins, or is missing some component. These could have been the antecedents of cell specialization. It currently occurs in bacteria, yeasts and some unicellular algae, which in some cases live in colonies where individuals specialize in certain functions depending on their location within the colony. Specialization means greater efficiency. From there, cells such as neurons or white blood cells would still be a long way off.

7-Sexual reproduction

What would become of us without sex! It has been suggested that sexual reproduction allows rapid adaptation of species by rapidly eliminating pernicious mutations and spreading beneficial ones.  Its appearance could be related to viruses and other parasites or as a collateral result of the strategy of duplicating the genome to reduce the effects of mutations. In any case, living beings with sexual reproduction have diversified and acquired a complexity that no asexual being can overcome.

8-Embryonic development

"Nothing that happens to you in life will mark you as much as gastrulation." The instructions to form a body in a progressive and orderly way meant the jump between a world of jellyfish and worms to the current one. Instructions that are grouped into blocks or genetic packages that allow great adaptability. A step to highlight in embryonic development is gastrulation, which consists of the invagination of a layer of cells from the embryo. Thus, at first glance it does not seem so important, but its appearance meant the specialization in 3D, as it happens in most animals like us compared to the specialization in 2D that occurs in worms.


Stages of human embryonic development

 

9-The nervous system and the brain

Long before the appearance of the nervous system, cells communicated only through contact with their neighbors and the emission of signals such as hormones. In my opinion the leap lies not so much in forming a network that carries signals faster as in the centralization of those signals, which in the long term would mean the appearance of the brain. The study of neural networks has advanced considerably in recent years thanks to studies in several model animals, especially the worm C. elegans, whose complete network of 302 neurons we know.

10-The perception of the individual

Until a few years ago it was believed that only the higher primates had this ability. However, several studies show that other mammals, such as the elephant or the dolphin, and even birds such as the magpie, have it too. It has been speculated that this capacity is the precursor of what we call the self and of rational thought, though the latter would deserve a step of its own.

 

Annex 13. Quantum mechanics in biological processes

The Quantum Mechanics of Photosynthesis:

http://neofronteras.com/?p=3012

Surprising and fascinating quantum-mechanical mechanisms have been discovered at work during part of photosynthesis. It seems that algae invented quantum computing 2 billion years before humans.


If someone tells us that Quantum Mechanics is at work in photosynthesis, we should not be surprised in the least. After all, the photoelectric cell in the elevator or the solar panels on the roof (if there are any) work on the same principles. The explanation of the photoelectric effect is already 105 years old; it was given by Albert Einstein, who received the Nobel Prize in Physics for it. Everyone knows, therefore, that quantum mechanics must play an essential role in photosynthesis, but the details of the process were unknown.

When one studies Quantum Mechanics (QM) for the first time one is a little disappointed, because its introduction is usually phenomenological. One expects to see Schrödinger's cats and instead sees, at most, quanta of energy and the levels of the hydrogen atom or the square well; that is, the furthest one usually gets is the Schrödinger equation.
The most fantastic and surprising material usually comes later, and for it you need good mathematical scaffolding based on Hilbert spaces. Only then are the foundations of QM seen: its postulates, the preparation of states, their superposition, the collapse of the wave function, the EPR paradox and, of course, Schrödinger's cat.
Experiments probing these details of QM are very difficult; normally everything is ruined by the slightest disturbance, so the system under study must sometimes be cooled to temperatures close to absolute zero, at which point all vibration ceases. That is why the famous quantum computer is so hard to build. Keeping a particle in a superposition of a pair of states is quite an achievement. Well, apparently plants have been doing this for billions of years.
A team from the University of Toronto has made a major contribution to the field of Quantum Biology by observing very special quantum states during the photosynthesis of seaweeds. Another Australian team has reached similar results.
According to Greg Scholes, leader of the Canadian project, their experiments show that biological systems have the ability to use QM to optimize essential processes such as photosynthesis.
Photosynthesis uses different biochemical systems. In a first step there are the "antennas", complexes that capture light and channel it to the reaction centers, where further processes finally yield chemical energy usable by the plant. When a photon reaches one of these antennas, it transfers its energy to the electrons there, but this energy can be lost if it is not quickly passed on to another molecule.
In  the algae Chroomonas  CCMP270, for example, these antennas have 8 molecules of pigments woven into a larger protein structure, and each pigment absorbs light from a different frequency range (color) of the electromagnetic spectrum. The path along these molecules is important because the longer the journey the more energy losses can occur. From a classical point of view the energy should travel by a random path through them. Therefore, the researchers expected that the energy of a laser pulse would not be transferred from the antenna to the reaction centers efficiently and some would be lost.
This team of researchers isolated these antennas, or light-harvesting complexes, from two different species of seaweed and studied their operation at room temperature (21 degrees Celsius) using two-dimensional electronic spectroscopy. To do so, they illuminated the complexes with a femtosecond laser, thereby mimicking the natural process of light absorption.
The pulse of this type of laser is so short that the processes that occur after illumination can be monitored more easily without the interference of the beam that illuminated, although those processes are very fast. Among the phenomena that can be observed is the movement of energy by special molecules that are attached to a protein.
By exciting with the laser pulse, the electrons of the pigment molecules jump to an excited state. When they return to their ground states, photons with slightly different wavelengths are emitted and combine to form a certain interference pattern. By studying this pattern, scientists were able to study the state of superposition being created.
The researchers were surprised to observe clearly the long survival (four times longer than expected) of the quantum-mechanical states involved in that energy movement. This time (400 femtoseconds, or 4 × 10⁻¹³ s) is long enough for the absorbed photon energy to try out all possible paths along the antenna (remember Feynman's path integral?), allowing it to travel without loss. For a while the absorbed light energy resides in several places at once; that is, there is a coherent superposition of quantum states. In essence, the antenna performs a quantum computation to determine the best way to transfer the energy. The discovery goes against the accepted idea that quantum coherence can only occur at very low temperatures near absolute zero, because ambient heat destroys it. How this photosynthetic system manages the feat is unknown, but it is speculated that it may be due to the structure of the protein itself.
According to Scholes, this result could mean that quantum-mechanical probability laws prevail over classical laws in complex biological systems, even at normal temperature. Energy can then flow efficiently in a way that is counterintuitive from the classical perspective, simultaneously traversing several alternative pathways through the proteins. In other words, the light-harvesting complexes convert light into a wave that travels from the antenna to the reaction centers without loss of energy.
Scholes wonders whether organisms developed this quantum-mechanical strategy of capturing solar energy as an adaptive advantage. According to him it is as if the algae "knew" Quantum Mechanics 2000 million years before humans. The question that remains to be resolved is obvious: do these kinds of quantum-mechanical phenomena occur in other biological processes?

Paul Davies, director of the Arizona-based BEYOND Center for Fundamental Concepts in Science, believes that nature has had billions of years to evolve ways of exploiting quantum advantages, and that it probably exploits them efficiently when it can. He suspects that the workings of many nanoscale biological structures can only be fully understood with reference to coherence, tunneling, entanglement and other nontrivial quantum processes. The challenge will be to identify such processes in the cell's noisy environment.

 

Annex 14. Quantum tunneling to achieve nuclear fusion in the Sun

Chemistry of the Sun

https://triplenlace.com/2014/01/16/la-quimica-del-sol/

Our Sun lies 8 minutes and 19 light-seconds away. When we watch our star appear on the horizon between indigo mists and soft reds, the light reaching us left the Sun 8 minutes and 19 seconds earlier. It sits no less than 150 million kilometers from Earth. And thank goodness, because it is a powerful reactor.

We know that the Sun's diameter is 109 times that of the Earth, specifically 1,400,000 km; three-quarters of it is hydrogen, one-quarter helium, and less than 2% is oxygen, carbon, nitrogen, silicon, neon, iron and sulfur. The temperature at its surface is about 5,500 degrees Celsius, while at its core it reaches the astronomical figure (never better said) of 15 million degrees. What reaction achieves such exuberant results? Nuclear fusion.

Nuclear fusion in the Sun involves the transformation of two light atoms into a heavier one. Those light atoms are the fuel of the reaction and turn out to be isotopes of hydrogen. Hydrogen is the simplest of the chemical elements: a proton in its nucleus and an electron around it. Its isotopes also occur in nature; from time to time the proton of the hydrogen nucleus is accompanied by neutral particles, neutrons. When one neutron accompanies the proton we have a deuterium atom, ²H or D; when two neutrons join the proton we have another isotope, tritium, ³H.


These two isotopes of hydrogen are the key atoms of the fusion reaction. When a deuterium atom meets a tritium atom and they fuse in a super-collision, they leave behind a new atom containing two protons and two neutrons in its nucleus: a helium atom, He. But if you have kept count, you will have noticed that in this balance of matter one neutron is left over.

²₁H + ³₁H → ⁴₂He + ¹₀n

Indeed, that excess neutron is ejected after the collision at enormous speed, and the mass lost in the reaction is transformed into energy according to Einstein's famous equation:

E = mc²

Where E is energy, m is the mass and c is the speed of light. Each D-T fusion releases about 17.6 MeV; for every mole of reacting pairs, that is roughly 1.7 × 10¹² J!
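The energy balance can be checked from published isotope masses; this sketch recomputes E = mc² for the D-T mass defect (masses in atomic mass units; it comes out near 1.7 TJ per mole of reacting pairs):

```python
# Energy released by D + T -> He-4 + n from the mass defect, E = dm * c^2.
u = 1.66053906660e-27    # kg per unified atomic mass unit
c = 2.99792458e8         # speed of light, m/s
N_A = 6.02214076e23      # Avogadro's number, 1/mol

m_D, m_T = 2.014102, 3.016049    # reactant masses (u)
m_He, m_n = 4.002602, 1.008665   # product masses (u)

dm = (m_D + m_T) - (m_He + m_n)  # mass defect, ~0.0189 u
E_per_reaction = dm * u * c**2   # ~2.8e-12 J, i.e. ~17.6 MeV
E_per_mole = E_per_reaction * N_A
print(E_per_mole)  # ~1.7e12 J per mole of D-T pairs
```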

 

Now, seen this way the reaction does not seem very complicated, and being able to control it would free us from our dependence on fossil fuels such as gasoline or natural gas; that is what the scientists who investigate fusion, including so-called cold fusion, are working on. But why "cold"? Let us return to the Sun. The conditions in which this reaction takes place do not arise easily. First, the Sun is a mass of plasma, a state of matter at very high temperature; the plasma is less dense at the surface and much denser in the core. The high temperatures strip the atoms of their electrons, turning the plasma into a kind of ionized gas.

Therefore, under those conditions we have a ball of nuclei that move and collide with each other; the closer they are to the core of the plasma, the higher the temperature and density they reach. That is, they move more (have more kinetic energy) and are packed closer together at extreme pressures. At the point of highest temperature and density, the nuclei approach speeds close to that of light. However, although all this sounds favourable for a fusion collision, there is also a powerful opposing force: the electrostatic repulsion between the protons, which carry positive charges and repel each other, and which grows ever stronger as the nuclei approach. The question is at what point the kinetic energy and density suffice to overcome that repulsion, and to answer it we must resort to what physics knows as quantum tunneling, or barrier penetration.

 

This quantum-mechanical effect exploits the wave-particle duality of matter at subatomic scales. Classically, a particle confined by a potential barrier higher than its energy can never cross it; yet its associated wave function allows it to pass through the wall as if it were a ghost. The Schrödinger equation predicts the probability that the particle escapes its confinement by "passing" through the wall: the wave function decays smoothly inside the barrier and recovers its oscillating character on the other side. This is feasible for light particles crossing thin barriers, such as hydrogen isotopes overcoming the energy barrier of their mutual repulsion.
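The barrier-penetration probability has a simple semiclassical estimate; the sketch below applies it to an illustrative laboratory-style case (an electron facing a 1 eV, 1 nm rectangular barrier), not to solar nuclei:

```python
import math

# Semiclassical tunnelling probability through a rectangular barrier of
# height V and width L, for a particle of mass m and energy E < V:
#   T ~ exp(-2 * L * sqrt(2 * m * (V - E)) / hbar)
hbar = 1.054571817e-34  # J*s
eV = 1.602176634e-19    # joules per electronvolt

def transmission(m, V, E, L):
    kappa = math.sqrt(2 * m * (V - E)) / hbar  # decay constant inside barrier
    return math.exp(-2 * kappa * L)

# Illustrative case: an electron with negligible kinetic energy
# hitting a 1 eV barrier that is 1 nm thick.
m_e = 9.1093837015e-31  # electron mass, kg
T = transmission(m_e, 1.0 * eV, 0.0, 1e-9)
print(T)  # a few times 1e-5: tiny, but not zero
```

The exponential sensitivity to mass and barrier width is why tunnelling matters for light nuclei at short range and is utterly negligible for macroscopic objects.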

Intense research in the field of cold fusion is aimed at achieving this thermonuclear reaction at room temperature. Fuel in the form of light particles such as hydrogen isotopes is readily available and would become an inexhaustible source of energy. Controlled cold nuclear fusion is undoubtedly one of the greatest energy challenges facing modern science. It is, in fact, the stellar challenge.

 

Annex 15. Is our consciousness quantum?

Confirmation of Quantum Resonance in the Microtubules of the Brain

https://es.resonancescience.org/blog/confirmacion-de-la-resonancia-cuantica-en-los-microtubulos-del-cerebro

 


 

Biomolecules exhibit quantum mechanical behavior.

A research team led by Anirban Bandyopadhyay, a preeminent researcher in quantum biology, has demonstrated the existence of quantum mechanical vibrations at high temperature in the neurons of the brain. The research, carried out at the National Institute for Materials Science in Tsukuba (Japan), found that the high-frequency oscillation of microtubules (measured in this case at one million cycles per second, 1 MHz, an oscillation of the electric dipole moments of free electrons together with conformational change) causes wave interference that can give rise to the characteristic shape of the brain's electrical oscillations correlated with consciousness: a new type of nested 40 Hz / 4 Hz electroencephalographic (EEG) signal (gamma and delta oscillations, respectively), called "beat frequencies".
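The "beat frequency" mentioned above is simple arithmetic: two fast oscillations separated by a small detuning produce a slow envelope at the difference frequency, f_beat = |f1 - f2|. A minimal sketch, with hypothetical megahertz values not taken from the paper:

```python
# Illustrative sketch of a beat frequency: two oscillators whose
# frequencies differ slightly produce a slow envelope at the
# difference frequency. The values below are hypothetical examples.

f1 = 1_000_000.0      # a 1 MHz oscillator
f2 = 1_000_040.0      # a second oscillator detuned by 40 Hz

f_beat = abs(f2 - f1)
print(f"Beat frequency: {f_beat:.0f} Hz")  # 40 Hz, a gamma-band rhythm
```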

Gamma frequencies have been correlated with consciousness, apparently through the action of neural synchronization, and the periodic wave structure of gamma-delta "beat frequencies" is very reminiscent of the alternating interference bands of quanta that occur in double-slit experiments. Thus, it seems that the brain synchronization of consciousness is linked to the underlying quantum mechanical behaviors of microtubules. With these quantum vibrations, microtubules can be entangled in neural networks through interconnecting channels, called gap junctions, that physically bind neurons together. This is the theory of consciousness developed and defended by University of Arizona quantum biologist and chief anesthesiologist Stuart Hameroff and Oxford University professor emeritus of mathematics, physicist Roger Penrose.

The latest findings strongly support their model that quantum mechanics within the brain engenders consciousness, a model that has received passionate criticism from academics since its inception in the 1980s, as is typical of any revolutionary paradigm.

The role of water in the brain

Anirban Bandyopadhyay and his research team have conducted experiments that indicate the central importance of water in information processing operations within the brain and body. In their paper: The Atomic Water Channel Controlling the Remarkable Properties of a Single Brain Microtubule, the research team reported experimentation with highly ordered water within the cylindrical cavity of the microtubule lumen. They found that when water was evacuated from the central chamber, the microtubule ceased to show a strong correlation in the macromolecular set of tubulin subunits.

This suggests that water plays a central role in coordinating the behavior of the microtubule's multiple subunits and that, in effect, it functions as a single molecule, a highly quantum effect. Water, as physicist Nassim Haramein and the RSF research team have suggested, is part of the far-reaching coherence and orchestration of cellular information processes correlated with consciousness [1].

[1]  See the section  "The role of ordered water in the coherence and transmission of information within the biological system" in Unified Spacememory Network; Haramein et al., 2017.

Observations of anesthesia

In addition, research conducted at the University of Pennsylvania, led by Roderick G. Eckenhoff, suggests that anesthetic compounds act in part by disrupting the normal function of microtubules, apparently dispersing the electrical dipoles necessary for consciousness. It was the anesthesiological studies of Stuart Hameroff in the 70s that led him to suggest a role of microtubules in the generation of consciousness, after observing changes in the dynamics of microtubules when exposed to anesthetic compounds. If there is a molecule that stops consciousness, then seeing what specific changes occur in the cellular environment when exposed to that compound will be an important clue to what structures are involved in generating consciousness.

Hameroff's revolutionary idea was to take the theoretical mechanisms of consciousness from the cellular-synaptic level down to the nanometer scale of large biomolecular networks, where quantum mechanical behaviors could occur. In this he followed in the wake of Herbert Fröhlich, who had proposed that long polymeric biomolecules, pumped with metabolic energy, could sustain quantum-coherent waves resulting in non-local entanglement (later called Fröhlich condensates).

A new kind of physics

One of the key features of Hameroff and Penrose's theory is called Orchestrated Objective Reduction (Orch-OR), in which it is theorized that the state vector (the wave function describing a particle) of the free electrons delocalized within the tubulin undergoes an observer-independent reduction (an objective versus subjective collapse of the wave function). As the electron displays more and more nonlocal attributes, which is known as superposition, the geometry of the underlying spacetime bifurcates, and the degree of separation between the "bubbles" of spacetime—measured in Planck lengths—reaches a critical distance, at which point the geometry of spacetime becomes unstable and collapses.

This mechanism is known as the Diósi-Penrose criterion of gravity-induced quantum collapse. Each of these bifurcations and collapses represents an indeterminable quantum computation, and the coordination of a multitude of these events through quantum entanglement (the orchestrated part of OR) allows for massively parallel quantum computations within the brain. As Hameroff and Penrose suggest, this is what produces consciousness. Since the reduction of the state vector is entirely due to this stochastic mechanism, and is therefore indeterminate, it confers on consciousness a characteristic of unpredictability.

The USN and Haramein Escalation Law

Just as the Diósi-Penrose criterion of gravity-induced quantum collapse is mediated by a quantum geometry of underlying spacetime, Haramein et al. describe an underlying spacetime geometry in the paper The Unified Spacememory Network. Unlike the Diósi-Penrose mechanism, the quantum spacetime geometry of the unified spacememory network does not involve superpositions, but strong entanglement through the underlying network of Planck-scale spacetime micro-wormholes. In addition to microtubules, the authors highlight the importance of structures such as atomically ordered water and the membranes of the cellular system.

Microtubules are really remarkable macromolecular structures of the biological system, so it is not surprising that several researchers have become interested in them. In the paper Scale Unification, Haramein and Rauscher, together with biologist Michael Hyson, present their findings on a universal scaling law for organized matter. A number of organized systems of matter obey the Schwarzschild condition of a black hole, and when plotted on a graph of frequency versus radius, a trend line emerges in which structures from cosmological to subatomic size show a definite scale relationship. The surprising thing is that the microtubules sit at the center of the trend line, occupying the position between the ultra-large and the ultra-small, the macrocosm and the microcosm.

"Interestingly, the microtubules of eukaryotic cells, which have a typical length of 2 × 10⁻⁸ cm and an estimated vibrational frequency of 10⁹ to 10¹⁴ Hz, are quite close to the line specified by the scaling law and intermediate between the stellar and atomic scales." - Haramein et al., Scale Unification, 2008

The fractal manifold

According to this finding, microtubules may bear a harmonic relationship to the polarizable structures of the quantum vacuum (which follow a Ф (phi) ratio, a fractal-like scale relationship). John Wheeler first described these fluctuating vacuum structures as Planck-scale mini black holes. Similarly, Haramein shows how the vacuum oscillators can in fact be white hole/black hole systems. Thus, while the Diósi-Penrose criterion uses a bifurcating "bubble" geometry of spacetime, the Haramein solution shows how polarized white hole/black hole spacetime structures can act, their oscillation functioning as a computational element, in analogy with the gravity-induced collapse of the Hameroff-Penrose mechanism.

"The universality of this scaling law suggests an underlying polarizable structured vacuum of mini white holes/black holes." -Ibid

In addition, Haramein describes a fractal manifold structure of spacetime, far from the smooth, flat spacetime architecture envisioned by the Standard Model. This is very pertinent to the nature of consciousness, because fractal systems both produce and underlie chaotic dynamics.

One of the key characteristics of chaotic systems is that they can be extremely sensitive to even small changes, due to the nonlinear interactions that result from feedback operations and the high overall coherence within the system. As such, fractal/chaotic systems have an indeterminate character, as anyone trying to predict the weather knows. So, unlike the objective reduction mechanism proposed by Hameroff and Penrose, the chaotic dynamics of quantum vacuum foam fluctuations could be the source of the apparent unpredictability and free will so characteristic of our consciousness. (Note that in its technical sense, chaos does not mean disorder; quite the opposite: it only involves certain key characteristics, such as a degree of unpredictability.)

Between a rock and a hard place? Find the middle way.

As more and more nonlocal quantum mechanical phenomena are discovered within the biological system, the theory of Hameroff and Penrose (as well as those of other researchers investigating this new frontier of science) is accumulating tangible empirical evidence, so that models of quantum consciousness are moving from being beautiful theoretical constructs to becoming demonstrable facts. What is remarkable about Hameroff's model of consciousness, as well as Haramein's, is that they find the middle ground between two extremes: the spiritual/metaphysical perspective, in which consciousness is primary and cannot be explained scientifically; and the scientific/materialist perspective, in which consciousness is an illusory epiphenomenal state that emerges from the complexity of neurons and plays no role in the dynamics of the Universe at large. Instead, what we call consciousness may not only arise from the dynamics of discrete physical events on the quantum manifold of spacetime, but also play an intrinsic role in the ordering and dynamics of the Universe.

 

Annex 16. String theory and supersymmetry

String theory

http://www.nocierreslosojos.com/teoria-cuerdas/

  • Key Figure: Leonard Susskind (b. 1940)
  • Before:
  • 1914 The idea of a fifth dimension is proposed to explain how gravity works together with electromagnetism.
  • 1926 Swedish physicist Oskar Klein develops ideas about unobservable extra dimensions.
  • 1961 A theory is devised to unify electromagnetism and the weak nuclear force.
  • After:
  • 1975 Abraham Pais and Sam Treiman coin the term "standard model".
  • 1995 Edward Witten, American physicist, develops M-theory, which includes 11 dimensions.
  • 2012 Large Hadron Collider detects Higgs boson.

Leonard Susskind

 

Born in New York (USA) in 1940, Leonard Susskind holds the Felix Bloch Chair of Physics at Stanford University in California. He received his Ph.D. from Cornell University (New York) in 1965 and joined Stanford University in 1979.

In 1969 he published the theory for which he is known: string theory. His mathematical work showed that particle physics could be explained by vibrating strings at the smallest level. In the 1970s he further developed that idea, and in 2003 he coined the term "string theory landscape." This radical notion was intended to highlight the large number of possible universes that would make up an incredible "megaverse" with, perhaps, other universes with the necessary conditions for life.  Susskind is today a highly respected figure in his field.

  • Main works:
  • 2005 The cosmic landscape.
  • 2008 The War of Black Holes.
  • 2013 The theoretical minimum.

Particle physics

Particle physicists use the theory known as the "Standard Model" to explain the universe. Developed in the 1960s and 1970s, the model describes the fundamental particles and forces of nature that make up the universe and hold it together. One problem with the Standard Model is that it does not fit with Einstein's theory of general relativity, which relates gravity (one of the four forces) to the structure of space and time, treating them as a four-dimensional entity ("space-time"). The Standard Model cannot accommodate the curvature of space-time described by general relativity.

Quantum mechanics, on the other hand, explains how particles interact at the smallest levels (at the atomic scale), but does not account for gravity. Attempts to unite the two theories have so far been in vain; for now, the Standard Model can only explain three of the four fundamental forces.

Particles and forces

In particle physics, atoms are made up of a nucleus of protons and neutrons, surrounded by electrons. The electron and the quarks that make up protons and neutrons are among the 12 fermions (matter particles): the elementary or fundamental particles that are the smallest known building blocks of the universe. Fermions are subdivided into quarks and leptons. Alongside the fermions there are the bosons (force-carrying particles) and the four forces of nature: electromagnetism, gravity, the strong force and the weak force. Different bosons are responsible for carrying the different forces between the fermions.

The Standard Model describes what is known as the Higgs field, an energy field believed to permeate the entire universe. The interaction of particles with the Higgs field gives them their mass, and a measurable boson called the Higgs boson is the force carrier of the Higgs field. Now, none of the known bosons is the carrier of the force of gravity; this has led to the postulation of a hypothetical, as yet undetected particle called the graviton.

 

String theory

In 1969, in an attempt to explain the strong nuclear force, which binds protons and neutrons together within the atomic nucleus, the American Leonard Susskind developed the idea of string theory. Japanese-American Yoichiro Nambu and Danish Holger Nielsen conceived the same idea at the same time. According to string theory, particles (the building blocks of the universe) are not like dots, but rather something like tiny, one-dimensional vibrating threads of energy, or strings, that give rise to all forces and matter. When the strings collide, they combine and vibrate together briefly before separating again.

Early models of string theory were problematic. They explained bosons but not fermions, and needed certain hypothetical particles, called tachyons, to travel faster than light. They also needed many more dimensions than the four known ones of space and time.

 

According to string theory, elementary particles (such as electrons and quarks, which make up protons and neutrons) are strings or filaments of energy. Each string vibrates with a different frequency, and those vibrations correspond to the speed, spin, and charge of the particles.

Supersymmetry

To circumvent some of these early problems, the principle of supersymmetry was devised. It proposes that the universe is symmetrical and provides each of the known particles of the Standard Model with an undetected companion, or "superpartner"; so, for example, each fermion is paired with a boson, and vice versa.

 

According to supersymmetry, every boson (force-carrying particle) has as a massive "superpartner" a fermion (matter particle), and every fermion has a boson. Superstring theory describes superpartner particles as strings vibrating in higher octaves. According to some theorists, superpartners may have masses up to a thousand times greater than those of their corresponding particles, but no supersymmetric particle has yet been found.

When the Higgs boson, predicted in 1964 by Britain's Peter Higgs, was detected in 2012 by CERN's Large Hadron Collider, it turned out to be lighter than expected. Particle physicists had believed it would be heavier because of its interactions in the Higgs field with the Standard Model particles to which it gives mass. But it wasn't. The idea of superpartners, particles capable of potentially cancelling some of the effects of the Higgs field and producing a lighter Higgs boson, allowed scientists to address that problem. It also allowed them to see that three of the four forces of nature (electromagnetism, the strong force, and the weak force) may have existed with the same energies at the Big Bang, a crucial step toward unifying those forces into a Grand Unified Theory.

Superstring theory

Together, string theory and supersymmetry gave rise to superstring theory, in which all fermions and bosons and their superpartner particles are the result of vibrating strings of energy. In the 1980s, American John Schwarz and Briton Michael Green developed the idea that elementary particles such as electrons and quarks are the outward manifestations of "strings" vibrating at the scale of quantum gravity.

Just as different vibrations of a violin string produce different notes, properties such as mass are the result of different vibrations of the same type of string. An electron is a segment of string that vibrates in a certain way, while a quark is an identical segment of string that vibrates in a different way. Schwarz and Green observed that string theory predicted a massless particle similar to the hypothetical graviton. The existence of such a particle could explain why gravity is so weak compared to the other three forces, since gravitons would leak into and out of the roughly ten dimensions required by string theory. Thus, at last, there appeared something Einstein had long sought: a theory capable of describing everything in the universe, a "theory of everything".

A unifying theory

Physicists in search of an all-encompassing theory run into problems with black holes, where general relativity meets quantum mechanics in trying to explain what happens when an immense amount of matter is compressed into a very small space. According to general relativity, the core of a black hole (its singularity) is essentially of zero size. According to quantum mechanics, however, that is impossible, because nothing can be infinitely small: by the uncertainty principle conceived by the German Werner Heisenberg in 1927, a particle cannot be pinned down to an infinitely small region, since it can always exist in multiple states. Fundamental quantum phenomena such as superposition and entanglement likewise imply that particles can be in two states at once. Such particles should produce a gravitational field, which would be consistent with general relativity, but this does not seem to be the case according to quantum theory.

If superstring theory could solve some of those problems, it would become the unifying theory physicists are looking for. It would be possible to demonstrate it by colliding particles. Some scientists believe that, at higher energies, gravitons may be seen dissolving in other dimensions, which would be a fundamental test in favor of the theory.

Untangling the idea

 


The walls of the Super-Kamiokande neutrino observatory are covered with photomultipliers to detect the light produced when neutrinos interact with the water in the tank.

Some scientists, such as the American Sheldon Glashow, believe that string theory research is futile because no one will ever be able to prove whether the strings it describes exist. They involve energies so high (beyond the scale called the Planck energy) that it is impossible for us to detect them, and it may remain impossible for the foreseeable future. Our inability to design an experiment that tests string theory has led some scientists, like Glashow, to wonder whether it is actually a scientific theory. Others disagree, pointing out that experiments are under way that try to find some of those effects and provide an answer. The Super-Kamiokande experiment in Japan, for example, could demonstrate aspects of string theory by studying proton decay (the theorized decay of a proton over extremely long timescales), a phenomenon predicted by supersymmetry.

Superstring theory can explain much of the unknown universe – for example, why the Higgs boson is so light and why gravity is so weak – and perhaps it can explain the nature of dark energy and dark matter. Some scientists even believe that string theory could provide information about the fate of the universe, and whether it will continue to expand indefinitely.

 

Annex 17. Primordial black holes, MACHOs and WIMPs

Primordial Black Holes Could Explain Dark Matter, Galaxy Growth and More

https://es.knowablemagazine.org/article/physical-world/2022/agujeros-negros-primordiales

One day just over five years ago, Ely Kovetz was having lunch with his colleagues at Johns Hopkins University in Baltimore, discussing a tantalizing rumor. Like many in the physics community, Kovetz had heard talk of a possible signal from a newly commissioned American physics observatory, designed to pick up perturbations in the fabric of space-time: ripples created, among other things, by black holes colliding with each other. Most intriguingly, the signal appeared to have been created by massive objects, much heavier than expected. That pointed to some surprising possibilities.

"The first thing everyone thought was, 'What? This cannot be. This is impossible,'" recalls Kovetz, a physicist at Israel's Ben-Gurion University and a visiting professor at Johns Hopkins. But then a more exciting suspicion began to arise. Perhaps, they thought, this could be a sign of primordial black holes.

Black holes since the beginning of time! It sounds like the title of a low-budget science fiction movie, but fractions of a second after our universe was born, a swarm of voracious black holes could have formed spontaneously from the fiery energy that permeated the cosmos. Supported by mathematics and theory, but never definitively observed, these primordial black holes are a possibility that has fascinated physicists for nearly half a century, gaining or losing popularity as new observations seemed to support or exclude their existence.

The puzzling 2015 signals from the U.S. Laser Interferometer Gravitational-Wave Observatory (LIGO), and dozens of other detections by the observatory and its European counterpart, Virgo, have fueled renewed interest in the idea, with hundreds of papers published about them in just the past five years.

Primordial black holes, if they existed, would be massive entities that do not emit light, so they would be invisible. Since they would be scattered throughout the universe, they could help make sense of a wide variety of bizarre observations that have so far defied explanation. One of the main reasons researchers are drawn to these strange black holes is that they could solve one of astrophysics' biggest and most enigmatic mysteries: the identity of dark matter.

Even if they can't detect it, physicists know that dark matter exists because its gravitational effects are seen throughout the cosmos. But no one knows what it's made of. Primordial massive black holes could be the long-sought answer. These large, heavy objects could also have served as anchors around which the first galaxies coalesced, another conundrum that has long resisted explanation.

Although skepticism remains, true believers eagerly await new telescope projects and sky surveys that could finally bring these captivating beasts from the sphere of speculation into the realm of reality.

Several galaxies collide with each other in the famous Bullet Cluster, leaving clusters of hot gas (shown in pink) and an even greater amount of dark matter (shown in blue). Some physicists believe that primordial black holes could make up a significant fraction of the dark matter in the universe.

CREDIT: NASA HST/CXC/MAGELLAN

Of MACHOs and WIMPs

Ordinary black holes arise from death. When a large star reaches the end of its life, it explodes in a spectacular supernova. The star's heavy core, which can weigh at least several times the mass of the Sun, collapses into a compact object so dense that not even light can escape its gravitational pull. A black hole is born.
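The "so dense that not even light can escape" condition corresponds to the Schwarzschild radius, r_s = 2GM/c²: the size below which a mass M becomes a black hole. A quick illustrative calculation for a three-solar-mass stellar remnant:

```python
# Sketch: the Schwarzschild radius r_s = 2GM/c², the size to which
# a mass M must be compressed for light to be unable to escape.

G = 6.674e-11        # gravitational constant, m³/(kg·s²)
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / c**2

# A stellar remnant of 3 solar masses
r = schwarzschild_radius(3 * M_sun)
print(f"Schwarzschild radius: {r/1000:.1f} km")  # ~8.9 km
```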

In the seventies, the brilliant physicist Stephen Hawking and his PhD student Bernard Carr proposed another possible creation path for black holes. It was known that, shortly after  the big bang, the universe was filled with a thick soup of radiation and fundamental particles such as quarks and gluons, the building blocks of protons and neutrons. Natural variations in density in the soup would have left some regions with more material and others with less. Hawking and Carr's equations showed that areas with enough radiation and particles packed into them could have collapsed in on themselves and formed black holes with a wide range of possible sizes.

This idea was shelved, but dusted off in the nineties, when the debate about what might constitute dark matter began to heat up. The enigmatic substance has been seen gravitationally tugging at stars and galaxies and spinning them much faster than expected. Observations suggest that this invisible dark matter is so ubiquitous that it outnumbers the matter we can see in the cosmos by more than five to one.

One camp favored the explanation that dark matter was made up of compact objects, including black holes (with large numbers of primordial black holes, dating from the beginning of time, helping to explain the vast amount of dark matter), which were given the acronym MACHOs, for Massive Astrophysical Compact Halo Objects. Rival scientists preferred the candidates known as Weakly Interacting Massive Particles (WIMPs): hitherto undetected subatomic particles that could exert a gravitational pull while remaining invisible.

The gravity of a massive red galaxy magnifies and distorts the light from a distant and ancient galaxy behind it, forming a blue ring-shaped object known as the Cosmic Horseshoe. These fortuitous alignments create a lensing effect that could allow astronomers to detect evidence of primordial black holes drifting in space.

CREDIT: ESA/HUBBLE AND NASA

According to the laws of physics, MACHOs would warp space-time around them, forming lens-like regions that would create observable distortions. When light from distant stars passes through these lenses, ground-based telescopes should see the stars brighten briefly. However, when astronomers looked for those flashes, they found few cases that could be attributed to MACHOs, leading most physicists to focus on the idea that dark matter is made up of WIMPs.

But some researchers never quite gave up hope that black holes had any role in dark matter. Among them is Carr, now at Queen Mary University of London in the United Kingdom, who co-authored a recent paper on primordial black holes in the journal Annual Review of Nuclear and Particle Science. "Primordial black holes are ideal candidates," he says. "We do know that black holes exist. We are not invoking some particle that we currently have no proof of." 

Mysterious noises at night

Over the intervening decades, the search for WIMPs has so far yielded no results, though not for lack of trying. Huge detectors dedicated to discovering its existence have seen nothing. And the powerful Large Hadron Collider particle accelerator near Geneva has found no hint of unexpected new subatomic entities. Consequently, some researchers had already moved away from the idea of WIMPs when the new LIGO signals were detected, sparking rumors and refocusing attention on MACHO black holes.

The signals detected by LIGO in 2015 were confirmed to be the "chirps" of a huge collision between two black holes, each weighing about 30 solar masses. The objects were strangely bulky: so large that, if they had been created by collapsing stars, the progenitor stars would have needed masses up to 100 times that of our Sun. Such beasts should be quite rare in the universe, Kovetz says, so either LIGO got lucky with its first detection and caught a very unusual event, or there are more giant black holes than physicists would expect if collapsing stars were their sole origin. After the discovery was announced the following year, three different teams proposed that these objects had been born not from stars but at the dawn of time, before stars existed.

"When I wrote this article... I was hoping someone would give some reason why it definitely couldn't be true," says Simeon Bird, a cosmologist at the University of California, Riverside, whose paper, co-authored with Kovetz and others, was the first to come to light. Instead, LIGO continued to pick up additional signals from other black holes in this immense mass range, triggering exciting activity among theoretical physicists that has yet to subside.

If primordial black holes exist, some researchers think they could cluster in clusters with a few heavy entities surrounded by many lighter ones, as illustrated here. New telescopes are on the hunt for signals from such putative black hole arrays.

CREDIT: INGRID BOURGAULT / WIKIMEDIA COMMONS

The new signals come at a time when our understanding of the scorching conditions immediately after the big bang, when primordial black holes would have formed, has been vastly improved by new theoretical models. A recent study by Carr and others suggests that, about a millionth of a second after the big bang, the expansion of space-time would have caused a drop in temperature and pressure that could have aligned appropriately to produce relatively small black holes with masses similar to that of the Sun. Soon after, conditions changed to favor the appearance of larger black holes, of about 30 solar masses.
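The scale of these masses can be checked with the standard order-of-magnitude rule for primordial black holes: a black hole formed at cosmic time t can have at most roughly the mass contained within the horizon at that moment, M ≈ c³t/G. A sketch, using the one-microsecond epoch mentioned above:

```python
# Hedged order-of-magnitude sketch of the Hawking–Carr picture: a
# primordial black hole formed at cosmic time t has at most roughly
# the horizon mass, M ≈ c³·t/G.

G = 6.674e-11     # gravitational constant, m³/(kg·s²)
c = 2.998e8       # speed of light, m/s
M_sun = 1.989e30  # solar mass, kg

def horizon_mass(t_seconds):
    return c**3 * t_seconds / G

# About a millionth of a second after the big bang
M = horizon_mass(1e-6)
print(f"Horizon mass at t = 1 µs: ~{M / M_sun:.1f} solar masses")
```

The result, a few tenths of a solar mass, is indeed of the order of the Sun's mass, consistent with the epoch quoted in the study.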

The models also suggest that, throughout cosmic history, these various primordial black holes could have found each other. Attracted by gravity, the black holes could have formed clusters, with multiple smaller objects revolving around a central giant black hole, much like electrons typically orbit around an atomic nucleus.

This could explain why the MACHO hunters of the nineties never saw enough objects to account for dark matter: they only looked for gravitational lensing created by the smallest types of black holes. The lenses of smaller objects would be more compact and, floating around the galaxy, would take less than a year to pass in front of the stars, causing their light to brighten and then dim relatively quickly. If black holes were found in clusters, the much greater gravitational warping of space-time would take longer to pass in front of a distant star — several years or even decades.
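The claim that heavier lenses produce longer events follows from the Einstein radius of a gravitational lens, which grows as the square root of the lens mass; the event duration is roughly that radius divided by the lens's transverse speed. The sketch below uses typical assumed halo distances and velocity, not values from the article:

```python
import math

# Illustrative sketch: microlensing event duration ~ Einstein radius /
# transverse velocity, with R_E proportional to sqrt(lens mass).
# Distances and speed are typical assumed halo values, not data.

G = 6.674e-11         # gravitational constant, m³/(kg·s²)
c = 2.998e8           # speed of light, m/s
M_sun = 1.989e30      # solar mass, kg
kpc = 3.086e19        # kiloparsec in metres
year = 3.156e7        # seconds in a year

def event_duration_years(mass_solar, D_l=4*kpc, D_s=8*kpc, v=2.0e5):
    """Einstein-radius crossing time for a lens at D_l, source at D_s."""
    M = mass_solar * M_sun
    R_E = math.sqrt(4 * G * M / c**2 * D_l * (1 - D_l / D_s))
    return R_E / v / year

print(f"1 M_sun lens    : {event_duration_years(1):.2f} yr")
print(f"1000 M_sun lens : {event_duration_years(1000):.1f} yr")
```

A thousandfold increase in mass lengthens the event by about √1000 ≈ 32 times, turning a weeks-long brightening into one lasting years, which is why surveys tuned to short events would have missed heavy clustered lenses.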

Search for galaxies

About 15 seconds after the big bang, another type of black hole could have emerged. According to current calculations, these black holes would weigh a million times the mass of the Sun, large enough to potentially explain the origin of galaxies.

Telescopes have detected fairly developed galaxies at great distances, meaning they formed fairly early in cosmic history. This is puzzling, since galaxies are huge structures and, at least in computer simulations, take a long time to form from the slow, heavy swirls of gas and dust found throughout the cosmos. But this is the best explanation for their formation that astronomers have found so far.

Primordial black holes may provide an easier route. Since almost all galaxies contain a huge black hole at the center, it seems possible that these gravitational goliaths acted as starting points, helping to pull material toward the first protogalaxies at a fairly early stage in cosmic history. As the universe progressed, these small galaxies would have gravitationally attracted each other, then collided and merged into the much larger galaxies seen today.

Carr and his colleagues have begun to consider the possibility that primordial black holes are much more widespread than suspected. In theory, conditions that occurred shortly after the big bang could have produced even smaller, planet-scale black holes with masses about 10 times that of Earth. In fact, studies have detected tiny gravitational lenses floating around the galaxy, which pass in front of stars and cause their light to flicker rapidly. Most astrophysicists have attributed these lenses to large rogue planets that were ejected from their star systems. But not everyone agrees.

Among them is theoretical physicist Juan García-Bellido of the Autonomous University of Madrid, who argues that these lenses are caused by primordial black holes. García-Bellido, co-author of Carr's recent paper, remains enthusiastic about the idea of primordial black holes.

The new Vera C. Rubin observatory, which is under construction in Chile and will become operational in late 2023, will be used to scan the night sky for evidence of primordial black holes.

CREDIT: RUBIN OBS / NSF / AURA

But others aren't sure black holes are as prevalent as they would need to be to explain dark matter. "I think it's unlikely," says cosmologist Anne Green of the University of Nottingham in the United Kingdom. One of the problems with the theory is that the existence of large numbers of multisolar-mass black holes throughout the cosmos would have all sorts of visible effects that have never been seen. Because these objects consume gas and dust, they should emit large amounts of radio waves and X-rays that could betray their presence, she adds.

As for dark matter, theoretical models of the early universe also require a lot of tweaking so that they yield the right number of black holes that match the amount of dark matter we know exists. "It's pretty hard to make models that produce the right number of black holes," Green says.

Even some of the biggest fans of primordial black holes are no longer so optimistic about the possibility that the types of black holes detected by LIGO could account for all the dark matter in the universe. If many of those black holes were lurking in space, astronomers would have already seen more of their effects, Kovetz says. He still thinks they may contribute something and, in general, that including more primordial black hole sizes beyond what LIGO has detected could add up to enough to explain dark matter. And yet, "personally, I've lost some of my motivation."

The good news is that new instruments could help physicists get to the bottom of the matter very soon. LIGO and Virgo are being upgraded and have now been joined by a Japanese gravitational-wave detector called KAGRA. An Indian instrument will also be launched in the coming years.

Observations from these facilities could finally tip the balance one way or the other. If observatories detect a small black hole of a solar mass or less — something impossible to create from stellar evolution — it would provide exciting and definitive proof of at least one type of primordial black hole, making them a much more attractive explanation for dark matter and galaxy formation.

In addition to looking for very small black holes, scientists could also seal the deal by finding black holes that formed even before stars existed. This may be beyond the capacity of existing observatories, but the European Space Agency is planning to launch in the next decade a new, highly sensitive space probe called the Laser Interferometer Space Antenna (LISA), which could be up to the task.

Garcia-Bellido and others are planning to use another new instrument scheduled to become operational in 2023, Chile's Vera C. Rubin Observatory, to search for stars that brighten and dim over multi-year timescales, which could be proof of the existence of drifting black-hole clusters in the skies. At least a few researchers hope that, within three or four years, they may finally have a real and definitive answer about whether primordial black holes exist.

Until then, scientists will be on the edge of their seats, trying to keep an open mind about dark matter. Perhaps the mysterious substance turns out to be made of many things, including both exotic particles and black holes. "The universe is messy and has a lot of stuff in it," Bird says. "I think the universe likes to make things difficult for physicists."

 

Annex 18. Maldacena Conjecture / AdS/CFT Correspondence: The equivalence between a string theory or supergravity defined on a certain class of anti-de Sitter space and a conformal field theory defined on its boundary, which has one dimension fewer.

AdS/CFT correspondence:

https://www.wikiwand.com/es/Correspondencia_AdS/CFT

In theoretical physics, the AdS/CFT correspondence (anti-de Sitter space/conformal field theory), also called the Maldacena conjecture, Maldacena duality, or gauge/gravity duality, is a conjectured relationship between two types of physical theories. On one side are the anti-de Sitter spaces (AdS) used in quantum gravity theories, formulated in terms of string theory or M-theory. On the other side of the correspondence are conformal field theories (CFTs), which are quantum field theories, including theories similar to Yang-Mills theories that describe elementary particles.

The duality represents a breakthrough in our understanding of string theory and quantum gravity. This is because it provides a nonperturbative formulation of string theory with certain boundary conditions and because it is the most successful realization of the holographic principle, an idea in quantum gravity originally proposed by Gerard 't Hooft and promoted by Leonard Susskind.

In physics, the AdS/CFT correspondence is the equivalence between a string theory or supergravity defined on a certain class of anti-de Sitter space and a conformal field theory defined on its boundary, which has one dimension fewer.

The anti-de Sitter  space (AdS) corresponds to a solution to Einstein's equations with negative cosmological constant, and is a classical theory of gravity; while conformal field theory (CFT) is a quantum theory. This correspondence between a classical theory of gravity and a quantum one may be the path to quantum gravity.

The AdS/CFT correspondence was originally proposed by Argentine physicist Juan Maldacena in late 1997, and some of its technical properties were soon clarified in a paper by Edward Witten and another paper by Gubser, Klebanov and Polyakov. By 2015, Maldacena's paper had over 10,000 citations, making it the most cited paper in the field of particle physics.

Summary of correspondence

 

A tessellation of the hyperbolic plane by triangles and squares.

The geometry of anti-de Sitter space

In the AdS/CFT correspondence, string theory or M-theory on an anti-de Sitter background is considered. This means that the geometry of spacetime is described in terms of a certain vacuum solution of the Einstein equation called anti-de Sitter.

In very elementary terms, anti-de Sitter space is a mathematical model of spacetime in which the notion of distance between points (the metric) is different from the notion of distance in ordinary Euclidean geometry. It is closely related to hyperbolic space, which can be seen as a disk as illustrated on the right. This image shows a tessellation of a disk by triangles and squares. One can define the distance between the points of this disk in such a way that all the triangles and squares are the same size, and the outer circular boundary is infinitely far from any point inside.

Now imagine a stack of hyperbolic disks where each disk represents the state of the universe at any given time. The resulting geometric object is the three-dimensional anti-de Sitter space. It looks like a solid cylinder in which any cross-section is a copy of the hyperbolic disk. Time runs along the vertical direction in this image. The surface of this cylinder plays an important role in the AdS/CFT correspondence. As with the hyperbolic plane, the anti-de Sitter space is  curved in such a way that any point in the interior is actually infinitely far from this boundary surface.



Anti-de Sitter three-dimensional space  is like a stack of hyperbolic disks, each representing the state of the universe at a given time. The resulting spacetime resembles a solid cylinder.

This construction describes a hypothetical universe with only two spatial dimensions and one temporal dimension but can be generalized to any number of dimensions. In fact, hyperbolic space can have more than two dimensions and one can "stack" copies of hyperbolic space to obtain higher-dimensional models of anti-de Sitter space.
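As a mathematical aside (added here for concreteness, not part of the original text), the disk-and-cylinder picture corresponds to standard metrics; L below denotes the AdS curvature radius.

```latex
% Poincare disk metric: the boundary circle x^2 + y^2 = 1 is at infinite distance.
ds^2_{\mathbb{H}^2} = \frac{4\,(dx^2 + dy^2)}{\left(1 - x^2 - y^2\right)^2}, \qquad x^2 + y^2 < 1 .

% Global AdS_3: a "stack" of hyperbolic disks, with time t along the cylinder axis.
ds^2_{\mathrm{AdS}_3} = L^2\left(-\cosh^2\!\rho\,dt^2 + d\rho^2 + \sinh^2\!\rho\,d\varphi^2\right).
```

In the first metric the triangles and squares of the tessellation all have the same size; in the second, constant-t slices are copies of the hyperbolic disk, exactly as described above.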

The idea of AdS/CFT

An important feature of anti-de Sitter space is its boundary (which resembles a cylinder in the case of three-dimensional anti-de Sitter space). One property of this boundary is that, locally around any point, it resembles Minkowski space, the model of spacetime used in nongravitational physics.

One can therefore consider an auxiliary theory in which "spacetime" is given by the boundary of anti-de Sitter space. This observation is the starting point for the AdS/CFT correspondence, which states that the boundary of anti-de Sitter space can be considered as the "spacetime" for a conformal field theory. The claim is that this conformal field theory is equivalent to the gravitational theory in the bulk anti-de Sitter space, in the sense that there is a "dictionary" for translating calculations in one theory into calculations in the other. Each entity in one theory has a counterpart in the other theory. For example, a single particle in the gravitational theory could correspond to some collection of particles in the boundary theory. Furthermore, the predictions of the two theories are quantitatively identical, so that if two particles have a 40 percent chance of colliding in the gravitational theory, then the corresponding collections in the boundary theory would also have a 40 percent chance of colliding.

 

A hologram is a two-dimensional image that stores information about the three dimensions of the object it represents. The two images here are photographs of a single hologram taken from different angles.

Note that the boundary of anti-de Sitter space has fewer dimensions than the anti-de Sitter space itself. For example, in the three-dimensional example illustrated above, the boundary is a two-dimensional surface. The AdS/CFT correspondence is often described as a "holographic duality", because this relationship between the two theories is similar to the relationship between a three-dimensional object and its image as a hologram. Although a hologram is two-dimensional, it encodes information about the three dimensions of the object it represents. In the same way, theories that are related by the AdS/CFT correspondence are conjectured to be exactly equivalent, despite living in different numbers of dimensions. The conformal field theory is like a hologram that captures information about the higher-dimensional quantum gravity theory.

Examples of correspondence

Following Maldacena's insight in 1997, theorists have discovered many different realizations of the AdS/CFT correspondence. These relate various conformal field theories to compactifications of string theory and M-theory in various numbers of dimensions. The theories involved are generally not viable models of the real world, but they have certain characteristics, such as their particle content or high degree of symmetry, that make them useful for solving problems in quantum field theory and quantum gravity.

The most famous example of the AdS/CFT correspondence states that type IIB string theory on the product space AdS₅ × S⁵ is equivalent to N = 4 supersymmetric Yang-Mills theory on the four-dimensional boundary. In this example, the spacetime in which the gravitational theory lives is effectively five-dimensional (AdS₅), and there are five additional compact dimensions (the S⁵ factor). In the real world, spacetime is four-dimensional, at least macroscopically, so this version of the correspondence does not provide a realistic model of gravity. Similarly, the dual theory is not a viable model of any real-world system, as it assumes a great deal of supersymmetry. However, as explained below, this boundary theory shares some features in common with quantum chromodynamics, the fundamental theory of the strong force. It describes gluon-like particles of quantum chromodynamics along with certain fermions. As a result, it has found applications in nuclear physics, particularly in the study of quark-gluon plasma.
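For concreteness, the parameters on the two sides of this best-known example are related by a standard dictionary (added here for reference; the notation is the conventional one, not taken from the text above): g_s is the string coupling, N the number of colors, and R the common radius of AdS₅ and S⁵.

```latex
% Parameter map for the AdS_5 x S^5 / N=4 super-Yang-Mills example:
g_{\mathrm{YM}}^2 = 4\pi g_s, \qquad
\lambda \equiv g_{\mathrm{YM}}^2 N = \frac{R^4}{\alpha'^2} .
% Strongly coupled gauge theory (large lambda) thus maps to weakly curved,
% classical supergravity on an AdS_5 x S^5 of large radius R, which is
% what makes the duality computationally useful.
```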

Another realization of the correspondence states that M-theory on AdS₇ × S⁴ is equivalent to the so-called (2,0) theory in six dimensions. In this example, the spacetime of the gravitational theory is effectively seven-dimensional. The existence of the (2,0) theory appearing on one side of the duality is predicted by the classification of superconformal field theories. It is still poorly understood because it is a quantum mechanical theory without a classical limit. Despite the difficulty inherent in studying this theory, it is considered an interesting object for a variety of reasons, both physical and mathematical.

Another realization of the correspondence states that M-theory on AdS₄ × S⁷ is equivalent to the ABJM superconformal field theory in three dimensions. Here the gravitational theory has four non-compact dimensions, so this version of the correspondence provides a somewhat more realistic description of gravity.

History and development

 

Gerard 't Hooft obtained results related to the AdS/CFT correspondence in the 1970s by studying analogies between string theory and nuclear physics.

String theory and nuclear physics

Main article: Second superstring revolution

The discovery of the AdS/CFT correspondence in late 1997 was the culmination of a long history of efforts to relate string theory to nuclear physics. In fact, string theory was originally developed in the late sixties and early seventies as a theory of hadrons, subatomic particles such as the proton and neutron held together by the strong nuclear force. The idea was that each of these particles could be seen as a different mode of oscillation of a string. In the late sixties, experimentalists had found that hadrons fall into families called Regge trajectories, with energy squared proportional to angular momentum, and theorists showed that this relationship arises naturally from the physics of a rotating relativistic string.
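The Regge-trajectory relation mentioned above can be written down explicitly (a standard formula, added here for reference; the quoted slope value is approximate):

```latex
% Linear Regge trajectory: spin J grows linearly with mass squared.
J = \alpha(0) + \alpha' M^2 ,
% For a rotating relativistic string of tension T (natural units, hbar = c = 1)
% the slope is
\alpha' = \frac{1}{2\pi T}, \qquad \alpha' \approx 0.9\ \mathrm{GeV}^{-2}
\ \text{for light hadrons.}
```

This is the relationship that first suggested hadrons might be vibrating strings.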

On the other hand, attempts to model hadrons as strings faced serious problems. One problem is that string theory includes a massless spin-2 particle, while no such particle appears in hadron physics. Such a particle would mediate a force with the properties of gravity. In 1974, Joel Scherk and John Schwarz suggested that string theory was therefore not a theory of nuclear physics, as many theorists had thought, but a theory of quantum gravity. At the same time, they noticed that hadrons are actually made of quarks, and the string-theory approach was abandoned in favor of quantum chromodynamics.

In quantum chromodynamics, quarks have a kind of charge that comes in three varieties called colors. In a 1974 paper, Gerard 't Hooft studied the relationship between string theory and nuclear physics from another point of view by considering theories similar to quantum chromodynamics, where the number of colors is an arbitrary number N, rather than three. In this article, 't Hooft considered a certain limit in which N tends to infinity and argued that in this limit certain calculations in quantum field theory resemble calculations in string theory.
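't Hooft's limit has a compact standard statement, sketched here for reference (not spelled out in the text above):

```latex
% 't Hooft limit: N -> infinity with the 't Hooft coupling held fixed,
\lambda = g^2 N = \text{const.}
% Feynman diagrams then organize by the genus g of the surface they tile:
\mathcal{A} = \sum_{g=0}^{\infty} N^{2-2g} f_g(\lambda),
% the same topological expansion as string perturbation theory, with 1/N
% playing the role of the string coupling.
```

The genus expansion is precisely why large-N gauge theory "resembles" a string theory: planar diagrams dominate, just as spherical worldsheets dominate at weak string coupling.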

Black holes and holography

 

Stephen Hawking predicted in 1975 that black holes emit Hawking radiation due to quantum effects.

Main articles: Paradox of information loss in black holes, Thorne–Hawking–Preskill bet  and Holographic principle

In 1975, Stephen Hawking published a calculation suggesting that black holes are not completely black: they emit faint radiation due to quantum effects near the event horizon. This work extended earlier results of Jacob Bekenstein, who had suggested that black holes have a well-defined entropy. At first, Hawking's result seemed to contradict one of the main postulates of quantum mechanics, namely the unitarity of time evolution. Intuitively, the unitarity postulate says that quantum mechanical systems do not destroy information as they evolve from one state to another. For this reason, the apparent contradiction came to be known as the black hole information paradox.
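Hawking's and Bekenstein's results can be stated compactly; these standard formulas are added here for reference. Note that the entropy scales with the horizon area A rather than with the enclosed volume, which is what later motivated the holographic principle.

```latex
% Hawking temperature of a Schwarzschild black hole of mass M:
T_{\mathrm{H}} = \frac{\hbar c^3}{8\pi G M k_B} .
% Bekenstein-Hawking entropy, proportional to the horizon area A:
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar} .
```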

 

 

Leonard Susskind made early contributions to the idea of holography in quantum gravity.

Later, in 1993, Gerard 't Hooft wrote a speculative paper on quantum gravity in which he reviewed Hawking's work on the thermodynamics of black holes, concluding that the total number of degrees of freedom in a region of spacetime surrounding a black hole is proportional to the surface area of the horizon. This idea was promoted by Leonard Susskind and is now known as the holographic principle. The holographic principle and its realization in string theory through the AdS/CFT correspondence have helped elucidate the mysteries of black holes suggested by Hawking's work and are believed to provide a resolution of the black hole information paradox. In 2004, Hawking admitted that black holes do not violate quantum mechanics, and suggested a concrete mechanism by which they could preserve information.

Maldacena's work

At the end of 1997, Juan Maldacena published a landmark paper that initiated the study of AdS/CFT. According to Alexander Markovich Polyakov, "[Maldacena's] work opened the floodgates." The conjecture immediately aroused great interest in the string theory community and was taken up in articles by Steven Gubser, Igor Klebanov and Alexander Polyakov, and by Edward Witten. These papers made Maldacena's conjecture more precise and showed that the conformal field theory that appears in the correspondence lives on the boundary of anti-de Sitter space.

 

Juan Maldacena first proposed the AdS/CFT correspondence in late 1997.

A special case of Maldacena's proposal says that N = 4 super-Yang-Mills theory, a gauge theory similar in some respects to quantum chromodynamics, is equivalent to string theory in five-dimensional anti-de Sitter space. This result helped clarify 't Hooft's earlier work on the relationship between string theory and quantum chromodynamics, taking string theory back to its roots as a theory of nuclear physics. Maldacena's results also provided a concrete realization of the holographic principle, with important implications for quantum gravity and black hole physics. By 2015, Maldacena's paper had become the most cited paper in high-energy physics, with more than 10,000 citations. Subsequent articles have provided considerable evidence that the correspondence is correct, although so far it has not been rigorously proved.

AdS/CFT finds applications

Main articles: AdS/QCD and AdS/CMT

In 1999, after taking a job at Columbia University, nuclear physicist Đàm Thanh Sơn paid a visit to Andrei Starinets, a friend from Sơn's student days who happened to be doing a PhD in string theory at New York University. Although the two men had no intention of collaborating, Sơn soon realized that the AdS/CFT calculations Starinets was doing could shed light on some aspects of quark-gluon plasma, an exotic state of matter produced when heavy ions collide at high energies. In collaboration with Starinets and Pavel Kovtun, Sơn was able to use the AdS/CFT correspondence to calculate a key plasma parameter. As Sơn later recalled: "We did the calculation in his head to give us a prediction of the shear viscosity value of a plasma... A friend of mine in nuclear physics joked that ours was the first useful paper to come out of string theory."

Today physicists are still looking for applications of the AdS/CFT correspondence in quantum field theory. In addition to the applications to nuclear physics advocated by Đàm Thanh Sơn and his collaborators, condensed matter physicists such as Subir Sachdev have used string theory methods to understand some aspects of condensed matter physics. A notable result in this direction was the description, through the AdS/CFT correspondence, of the transition from a superfluid to an insulator. Another emerging subject is the fluid/gravity correspondence, which uses the AdS/CFT correspondence to translate problems in fluid dynamics into problems in general relativity.

 

Annex 19. A photon has gone back in time

How in quantum physics they are achieving what until now seemed impossible: reversing time

https://wolksoftcr.com/como-en-fisica-cuantica-estan-logrando-lo-que-hasta-ahora-parecia-imposible-revertir-el-tiempo/

The border between science and science fiction is sometimes almost imperceptible. And we owe it, of course, to our increasingly precise understanding of the world in which we live. That macroscopic world that we can see with our eyes and in which the processes seem to run in a single direction in time: from the present to the future.

We are so intimately accustomed to observing this phenomenon that we find it very difficult to accept the possibility of reversing a process in time: of restoring something to how it was before it underwent a change we would have considered permanent. But it is not impossible. Quantum physics has just shown us that it is feasible both theoretically and practically.

Quantum physics and our intuition are, once again, about to collide.

Our intuition invites us to conclude that the irreversibility of processes is a fundamental law. And the second principle of thermodynamics proves us right. It can be formulated in many different ways, but all of them invite us to conclude that physical phenomena are irreversible.

If we place a container with very hot water on our kitchen countertop and do nothing with it, the water will cool. And if we drop a glass and it shatters on hitting the ground, it will not put itself back together. Heat exchange and entropy are precisely two properties intimately linked to the second principle of thermodynamics.

 

Entropy is usually defined as the magnitude that measures the degree of disorder of a physical system. It is perhaps an oversimplification, but it can help us understand what we are talking about without being forced to resort to complex concepts. In any case, this thermodynamic principle is statistical in nature, and, moreover, classical physics is deterministic.

This means that it is possible to predict the evolution of a physical system over time if we know its initial state and the differential equations that describe its behavior. However, in the domain of quantum physics, in the world of the very small, of particles, the reversibility of physical processes is possible. It has been so from a theoretical point of view for a long time, and now it is also in practice.

Quantum physics allows it: a photon has gone back in time

Physicists have flirted with the possibility of reversing processes in time for many years. In fact, some theorists work with some very peculiar tools that quantum mechanics has placed in their hands: universal rewinding, or reversal, protocols. We do not need to know in detail how these mechanisms work, but it is worth knowing that they serve to reverse the changes that a quantum system has undergone without knowing what its initial state was. And without knowing what those changes consisted of.

Universal reversal protocols serve to reverse the changes that a quantum system has undergone without knowing what its initial state was.
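The underlying reason rewinding is possible at all is that isolated quantum evolution is unitary, and every unitary has an exact inverse. The NumPy toy sketch below is not the Navascués protocol (it assumes the evolution U is known, which the universal protocol precisely does not require); it only illustrates that applying U† after U restores the initial state exactly.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_unitary(n):
    """Haar-like random unitary from the QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the distribution is uniform

psi0 = np.array([1.0, 0.0], dtype=complex)   # a qubit prepared in state |0>
U = random_unitary(2)                        # some evolution (here: known)
psi_evolved = U @ psi0                       # the state after the evolution
psi_rewound = U.conj().T @ psi_evolved       # applying U-dagger undoes it exactly

print(np.allclose(psi_rewound, psi0))        # True: the initial state is recovered
```

The hard part, and the achievement of the experiment described below, is accomplishing this rewinding without ever learning what U or the initial state were.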

It almost looks like magic, but it's not. It's science. The Spanish theoretical physicist Miguel Navascués leads a research team at the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences that specializes in this discipline. Navascués and his collaborators have designed an innovative theoretical reversal protocol that proposes, broadly speaking, a procedure by which a quantum system can be made to recover its initial state without knowing what changes it has undergone.

Putting something like this into practice is not easy, which has meant that experimental physicists working in this area have not been very successful. Fortunately, the picture has changed. The team of experimental physicists at the University of Vienna led by Philip Walther has successfully implemented the universal reversal protocol designed by Miguel Navascués and his team.

At the heart of their experiment is sophisticated optical equipment consisting of several interferometers and fiber-optic links that behave together like a quantum switch. Knowing in detail how this device works is beyond the purpose of this article because, as we can guess, its complexity is extraordinary. Even so, curious readers who are not easily intimidated can consult the article published by Navascués, Walther and their teams in the journal Optica. It is very worthwhile.

The heart of their experiment is sophisticated optical equipment consisting of several interferometers and fiber optic links that behave together like a quantum switch.

A note before moving on: an interferometer is an optical device that uses a light source (usually a laser) to measure very accurately the changes introduced in a physical system. Described in this way it seems very complicated, and yes, it is complicated, but we can resort to an example close in time to illustrate what we are talking about.

The LIGO experiments in the United States and Virgo in Italy used to identify and analyze gravitational waves are interferometers. And, as we have just seen, both incorporate sophisticated optical equipment and a laser that allows them to measure the gravitational perturbations generated by massive objects in the cosmos that are subjected to a certain acceleration. These perturbations propagate across the space-time continuum at the speed of light in the form of waves, and interferometers pick them up.

In some ways the quantum switch that Navascués and Walther's teams have built is similar to LIGO or Virgo, but on an infinitely smaller scale because its purpose is to identify and measure the changes introduced in a quantum system. What they have achieved is astonishing: they have successfully reversed the evolution in time of a photon without previously knowing either its initial state or what changes it had undergone. In practice it is the same as traveling back in time.

 

This scheme describes the ingenious optical equipment designed by researchers from the University of Vienna and the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences.

It seems reasonable to think that achieving this with a single particle, a photon, is not very interesting, but nothing could be further from the truth. The result obtained by these researchers, which has already been peer-reviewed, is extraordinary because it opens wide the doors that will probably allow us to understand much better the rules that underlie the world in which we live. The rules, in short, of quantum mechanics.

What allows this experiment to stand out from previous ones that also sought to demonstrate the possibility of reversing the state of a quantum system is that the universal reversal protocol of Navascués and Walther has managed to do so without having any prior information about the state of the quantum system. We can see it as if they had managed to perfectly recompose a porcelain vase without knowing the number of fragments they had initially, their shape, much less that they belonged to a vase and were porcelain.

In the conclusions of their article these researchers insist on something very important: the results they have obtained are not valid only for quantum systems of a photonic nature, i.e. those that work with light; they also hold for other quantum systems. For this reason, the applications of this technology could be very numerous, especially in the field of quantum computing.

Universal reversal protocols can theoretically be used to solve one of the biggest challenges currently posed by quantum computers: error correction. In fact, this is probably the highest wall that  quantum computing researchers will have to break down to get quantum computers to be able to solve the kinds of complex problems at which they are theoretically far superior to classical supercomputers.

 

Annex 20. Lambda-CDM (Cold Dark Matter) Cosmological Model

Lambda-CDM model:

https://es.wikipedia.org/wiki/Modelo_Lambda-CDM

In cosmology, the Lambda-CDM (Lambda-Cold Dark Matter) model represents the Big Bang concordance model that explains cosmic observations of the microwave background radiation, the large-scale structure of the universe, and observations of supernovae, shedding light on the acceleration of the expansion of the Universe. It is the simplest known model that agrees with all observations.

  • Λ (lambda) indicates the cosmological constant, part of a dark energy term that accounts for the current accelerated expansion of the Universe, which began about 6 billion years ago. The cosmological constant is described in terms of ΩΛ, its fraction of the energy density of a flat universe. At present, ΩΛ ≃ 0.70, implying that it is equivalent to 70% of the energy density of the present universe.
  • Cold dark matter is the model of dark matter in which the speed of its particles is much lower than the speed of light, hence the adjective "cold". Cold dark matter is non-baryonic: unlike normal baryonic matter, it does not interact except through gravity. This component constitutes 26% of the energy density of the current universe. The remaining 4% is ordinary matter and energy (baryonic matter), which makes up the atoms and photons that are the building blocks of planets, stars and gas clouds in the universe.
  • The model assumes a nearly scale-invariant spectrum of primordial perturbations and a universe without spatial curvature. It also assumes that the universe has no observable topology, so that it is much larger than the observable particle horizon. These are predictions of cosmic inflation.
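The 70%/26%/4% split and the "about 6 billion years" figure fit together through the Friedmann equation for a flat ΛCDM universe. The sketch below is a rough consistency check with illustrative parameter values (H₀ = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7, radiation neglected), not a fit to data: the expansion starts accelerating at the redshift where the deceleration parameter changes sign, and the lookback time to that redshift comes out near 6 billion years.

```python
import math

H0_KM_S_MPC = 70.0           # assumed Hubble constant
OMEGA_M, OMEGA_L = 0.3, 0.7  # flat universe: Omega_m + Omega_Lambda = 1
MPC_M = 3.086e22             # megaparsec in metres
GYR_S = 3.156e16             # gigayear in seconds

H0 = H0_KM_S_MPC * 1e3 / MPC_M  # Hubble constant in 1/s

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat Lambda-CDM."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

# Acceleration begins where q(z) = Omega_m (1+z)^3 / 2 - Omega_L = 0:
z_acc = (2 * OMEGA_L / OMEGA_M) ** (1 / 3) - 1

def lookback_gyr(z, steps=10_000):
    """Lookback time (1/H0) * integral_0^z dz' / ((1+z') E(z')), trapezoid rule."""
    h = z / steps
    f = lambda x: 1.0 / ((1 + x) * E(x))
    s = 0.5 * (f(0.0) + f(z)) + sum(f(i * h) for i in range(1, steps))
    return s * h / H0 / GYR_S

print(f"acceleration began at z ~ {z_acc:.2f}")
print(f"lookback time ~ {lookback_gyr(z_acc):.1f} Gyr")  # close to 6 Gyr
```

With these assumed values the transition lands at z ≈ 0.67, roughly 6 billion years ago, consistent with the bullet above.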

The model assumes that General Relativity is the correct theory of gravity on cosmological scales. It is often referred to as the standard model of Big Bang cosmology, because it is the simplest model that provides a reasonably good explanation of the following properties of the cosmos:

  • The existence and structure of the cosmic microwave background
  • The large-scale structure of the galaxy distribution
  • The abundances of hydrogen (including deuterium), helium and lithium
  • The accelerated expansion of the universe observed in distant galaxies and supernovae

The ΛCDM model has been successfully simulated on supercomputers: starting from the composition of the Universe (atoms of hydrogen, helium, lithium, etc., photons, neutrinos, ...) 11.5 million years after the Big Bang, the simulation forms stars, galaxies and structures of clusters and superclusters of galaxies very similar to the real objects we observe in the sky. The ΛCDM model can be extended by adding cosmological inflation, quintessence and other elements that are current areas of study and research in Cosmology.

 

External links:

https://naukas.com/2018/06/13/ultimo-articulo-hawking-la-naukas-iii-propuesta-ausencia-frontera/

http://hyperphysics.phy-astr.gsu.edu/hbasees/quantum/barr.html

http://hyperphysics.phy-astr.gsu.edu/hbasees/Particles/expar.html

https://www.konradlorenz.edu.co/blog/que-son-los-agujeros-de-gusano/

https://es.wikipedia.org/wiki/Universos_paralelos

http://www.nocierreslosojos.com/teoria-cuerdas/

https://www.epe.es/es/tendencias-21/20220907/universo-tendria-companero-antimateria-lado-75131498

https://www.abc.es/ciencia/abci-bang-pudo-fabricar-futuros-diferentes-202103070858_noticia.html

https://estudiarfisica.com/2015/06/26/el-principio-holografico-el-mas-bello-avance-hacia-la-gravedad-cuantica/

https://gefesrsef.wordpress.com/2016/12/11/las-propiedades-emergentes-y-su-papel-en-la-superconductividad/

https://es.wikipedia.org/wiki/Efecto_Meissner

https://www.bbc.com/mundo/noticias-64065872

https://ecoosfera.com/sci-innovacion/energia-oscura-fuente-agujeros-negros/?utm_content=cmp-true

https://tunelcuantico.home.blog/2019/02/16/el-efecto-tunel-a-detalle/

https://es.wikipedia.org/wiki/Dualidad_onda_corp%C3%BAsculo

http://hyperphysics.phy-astr.gsu.edu/hbasees/debrog.html

https://es.wikipedia.org/wiki/Relaci%C3%B3n_de_indeterminaci%C3%B3n_de_Heisenberg

https://www.gsjournal.net/Science-Journals/Research%20Papers-Mechanics%20/%20Electrodynamics/Download/748

https://culturacientifica.com/2023/04/04/integral-de-caminos/

https://es.resonancescience.org/blog/la-catastrofe-del-vacio-2

https://es.wikipedia.org/wiki/Part%C3%ADcula_virtual

https://www.curiosamente.com/videos/que-es-la-gravedad-cuantica-de-bucles

http://www.javierdelucas.es/vaciomedir.htm

https://significado.com/ecuacion-de-dirac/

https://es.wikipedia.org/wiki/M%C3%A9trica_de_Alcubierre

https://tallcute.wordpress.com/2010/07/05/los-10-saltos-evolutivos-mas-complejos/

http://neofronteras.com/?p=3012

https://triplenlace.com/2014/01/16/la-quimica-del-sol/

https://es.resonancescience.org/blog/confirmacion-de-la-resonancia-cuantica-en-los-microtubulos-del-cerebro

https://es.knowablemagazine.org/article/physical-world/2022/agujeros-negros-primordiales

https://www.wikiwand.com/es/Correspondencia_AdS/CFT

https://wolksoftcr.com/como-en-fisica-cuantica-estan-logrando-lo-que-hasta-ahora-parecia-imposible-revertir-el-tiempo/

https://es.wikipedia.org/wiki/Modelo_Lambda-CDM