There is a proof of the singular nature of black holes, but it is being ignored. The proof was mentioned in passing in an old paper by Michael Rowan-Robinson, posted on arXiv sometime after 1998. I ran across it by accident and do not remember the title or year of publication, so I e-mailed him to ask whether he could recall the paper in which he made the comment.

In that comment, he said it has been suggested that black holes, precisely and only because they are relativistic singularities, must possess a hyperbolic gravitational field. A singularity, as a point mass of infinite density, must have a gravitational field that also tends to infinity as one approaches the center. The 2D profile of such a field can therefore be represented by a hyperbola. Normally, an object in spacetime presents an overall parabolic field profile according to Newton's Law of Gravity, falling off as 1/r^{2}. A hyperbolic gravitational field, by contrast, falls off much more slowly, as 1/r. To keep the dimensions consistent, I write this as 1/r*, where r* = s·r and s = Num(r_{1}) is the dimensioned numerical value of the unit vector r_{1} associated with r. The hyperbolic force law is then F = GmM/r* along r_{1}.
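As a minimal numerical sketch of the difference between the two laws: for a circular orbit, setting the centripetal force mv²/r equal to each force law gives v = √(GM/r) under the inverse-square law but a constant v = √(GM/s) under the hypothesized 1/r law. The mass value and s = 1 here are my own illustrative assumptions (not from the source), so only the shape of each curve matters, not the magnitudes:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 8.0e36      # kg; illustrative central mass (assumption, not from the source)
s = 1.0         # dimensioned numerical value of the unit vector (assumption: s = 1)

def v_newton(r):
    # Circular-orbit speed under the inverse-square law:
    # m v^2 / r = G m M / r^2  =>  v = sqrt(G M / r)
    return math.sqrt(G * M / r)

def v_hyper(r):
    # Circular-orbit speed under the hypothesized 1/r law:
    # m v^2 / r = G m M / (s r)  =>  v = sqrt(G M / s), independent of r
    return math.sqrt(G * M / s)

for r in (1e18, 1e19, 1e20):  # galactic-scale radii in metres
    print(f"r = {r:.0e} m: Newtonian v = {v_newton(r):.3e}, hyperbolic v = {v_hyper(r):.3e}")
```

The Newtonian curve declines by a factor of √10 per decade in r, while the 1/r law yields a flat rotation curve at all radii.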

In 1983 Mordehai Milgrom announced that he had discovered a new twist in Newton's Law of Gravity. He studied a statistically significant number of spiral and other galaxies whose rotational velocity distributions had been measured from the redshifts of their component stars. When he plotted stellar velocity against distance from the center, velocity did not fall toward zero at large r as Newton's Law predicts. Newton's Law was wrong!

Of course it was. Milgrom's galaxies had supermassive black holes embedded within them. The central black hole, together with the associated matter in the disk, induces a hyperbolic gravitational field in spacetime even very far from the center, that is, as r tends to infinity. A hyperbolic field tends to zero only very slowly at large r compared to a parabolic field. In fact, as r tends to infinity, there is a near-constant difference between a parabolic Newtonian field and a hyperbolic field generated by the same mass. This near-constant difference accounts for the very small residual centripetal acceleration constant that Milgrom used to summarize his findings mathematically as an addition to Newton's Law. Hence he called his model "modified Newtonian dynamics," or MOND.
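Milgrom's residual acceleration constant is a₀ ≈ 1.2 × 10⁻¹⁰ m/s². A quick way to see where his effect sets in is to compute the radius at which the Newtonian acceleration GM/r² falls below a₀, i.e. r = √(GM/a₀). The galaxy mass below is my own illustrative assumption, not a figure from the source:

```python
import math

G = 6.674e-11       # m^3 kg^-1 s^-2
a0 = 1.2e-10        # m/s^2, Milgrom's acceleration constant
M_SUN = 1.989e30    # kg
KPC = 3.086e19      # metres per kiloparsec

def transition_radius(M):
    # Radius where the Newtonian acceleration G M / r^2 equals a0:
    # G M / r^2 = a0  =>  r = sqrt(G M / a0)
    return math.sqrt(G * M / a0)

M_galaxy = 1e11 * M_SUN          # illustrative spiral-galaxy mass (assumption)
r_m = transition_radius(M_galaxy)
print(f"transition radius ~ {r_m / KPC:.1f} kpc")
```

For a mass of order 10¹¹ solar masses this comes out at roughly 10 kpc, i.e. within the visible disk, which is consistent with the anomaly appearing in the outer rotation curves Milgrom studied.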

He did not appreciate the implications of the supermassive relativistic black holes in the nuclei of his galaxies; in 1983, most scientists hardly knew of them. So Dark Matter was proposed instead to account for the MOND effect. But Dark Matter is unnecessary: no enormous clouds of hypothetical "weakly interacting massive particles," or WIMPs, are needed to account for the effect. Neither, however, is a fundamental modification of Newton's Law of Gravity. This has huge implications for the so-called Lambda/Cold Dark Matter model of the universe, which is based on the Friedmann equations and the FLRW metric (the so-called "Standard Model" of cosmology).

Science is missing an opportunity here. The existence of the MOND effect proves the nature of supermassive black holes as true singularities. One can mathematically prove that relativistic singularities must exist by means of Schwarzschild's treatment of general relativity. But here is observational (experimental) proof that is as rock-hard and undeniable as such proof ever gets. It is more important to find further ways to validate an all-encompassing theory like general relativity than to validate one particular favored model of the universe by inventing Dark Matter (and Dark Energy) to patch its gaping holes. This is the true meaning of the MOND effect. See http://lonetree-pictures.net for more.