The fact that gravity sucks has been known for quite some time. Once scientists discovered that light was also affected by gravity, it was only a matter of time before someone calculated how much mass was required to stop light from escaping. Suddenly, the black hole was no longer associated with Calcutta but with a new astronomical entity. Since then, black holes have been studied extensively, with scientists thinking about topics such as information, entropy (a measure of the number of ways in which radiation and matter can be arranged), and the properties of severely bent spacetime.

Now researchers have begun considering whether there is a lower limit on the mass that can form a black hole. The idea is that it is not just the total mass that decides whether a clump of matter can become a black hole, but also the density of that mass. However, matter and radiation have entropy and temperature as well, which limit the density to which a given mass can be compressed.
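The role of density can be illustrated with the classic Schwarzschild radius: a mass M becomes a black hole once it is squeezed inside a radius of 2GM/c², so the mean density required actually falls as 1/M². The sketch below uses only standard textbook constants and is independent of the paper under discussion:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Radius below which a given mass becomes a black hole."""
    return 2 * G * mass_kg / c**2

def critical_density(mass_kg):
    """Mean density needed to fit the mass inside its Schwarzschild radius."""
    r = schwarzschild_radius(mass_kg)
    return mass_kg / ((4 / 3) * math.pi * r**3)

M_sun = 1.989e30  # kg
for n_suns in (1, 1e6, 1e9):
    m = n_suns * M_sun
    print(f"{n_suns:>9g} solar masses: r_s = {schwarzschild_radius(m):.3e} m, "
          f"mean density = {critical_density(m):.3e} kg/m^3")
```

Because the critical density scales as 1/M², a billion-solar-mass black hole forms at densities below that of water, while a stellar-mass one requires matter denser than an atomic nucleus; it is at the small-mass end that the entropy and temperature limits discussed here start to bite.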

To study the problem, the researchers consider a volume of space that has only a very small amount of self-gravity (that is, it is slowly collapsing in on itself). Under such conditions the collapse is very smooth (technically called adiabatic) and, importantly, the gravitational field has very little entropy, so its contribution can be ignored. They examined the entropy and temperature of the matter and radiation within the volume as it collapsed. What they show is that the mass required to make a black hole increases as the entropy increases, and the density required increases as the temperature increases.

Even without attempting to include quantum gravity, the Planck mass naturally arises. For a study of the minimum mass of a black hole this is unsurprising; however, the result also implies that the second law of thermodynamics will be modified by quantum gravity. While this may not surprise anyone working in the field, it certainly surprised me.
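For reference, the Planck mass is built purely from fundamental constants, m_P = √(ħc/G). A quick check of the numbers, again using standard constants rather than anything from the paper:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

# Planck mass: the mass scale where quantum effects and gravity both matter
planck_mass = math.sqrt(hbar * c / G)
print(f"Planck mass ≈ {planck_mass:.3e} kg")
```

That works out to roughly 2.2 × 10⁻⁸ kg, which is tiny for a black hole but enormous for a quantum particle, and is exactly the regime where thermodynamics and quantum gravity are expected to collide.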