How a black hole got pictured for the first time


It’s massive, spanning roughly twice our solar system, yet it appears so tiny from Earth that picturing it required never-before-seen levels of artificial intelligence and eight entire observatories working in concert.

It’s a black hole in the centre of a nearby galaxy.

It sits at the heart of the galaxy Messier 87, M87 for short, about 53 million light years from Earth. At such a great distance it occupies only a tiny patch of our sky, which adds to the difficulty of picturing it… that, and the fact that it is a black hole, and those tend to suck in light rather than reflect or emit it. Luckily, if you catch one at the right angle, you can still glimpse the glowing ring of hot plasma swirling around it. That plasma ring is the doughnut we have all seen by now.
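To get a feel for just how tiny a patch of sky that is, here is a rough back-of-the-envelope sketch in Python. The ring diameter, distance, and observing wavelength below are approximate assumptions for illustration, not official figures:

```python
# Rough numbers, all assumptions for illustration:
DISTANCE_LY = 53e6        # distance to M87 in light years
RING_DIAMETER_LY = 0.012  # approx. diameter of the glowing plasma ring, in light years

# Small-angle formula: apparent size (radians) ~= physical size / distance
theta_rad = RING_DIAMETER_LY / DISTANCE_LY
theta_uas = theta_rad * 206265 * 1e6   # radians -> microarcseconds
print(f"Apparent size of the ring: ~{theta_uas:.0f} microarcseconds")

# For comparison, the sharpest detail a telescope can resolve is roughly
# wavelength / aperture. Using a ~1.3 mm observing wavelength and an
# "aperture" the size of the Earth (~12,700 km between the farthest dishes):
resolution_rad = 1.3e-3 / 1.27e7
resolution_uas = resolution_rad * 206265 * 1e6
print(f"Resolution of an Earth-sized telescope: ~{resolution_uas:.0f} microarcseconds")
```

A few tens of microarcseconds either way: even a telescope effectively the size of the planet only barely resolves the ring.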

But because the area it occupies in the sky is so tiny, even with the resolution offered by eight telescopes spread across the globe, all pointed in the same direction over a 24-hour period, you will not get a very clear picture. In fact, you will not get much information at all, relatively speaking. If you imagine the whole collection of gathered information as a single picture that takes up your entire screen, the data relating to M87 would take up less than a single pixel. So how do you get a picture out of that?
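Part of the problem is how such an array actually measures the sky: each pair of telescopes samples one spatial-frequency component (roughly, one Fourier coefficient) of the image, so with only a handful of sites the measurements are extremely sparse. Here is a purely illustrative toy in numpy, not the real pipeline, showing what happens when you naively rebuild an image from about 1% of its Fourier samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sky": a fuzzy ring on a 64x64 grid, standing in for the plasma doughnut
n = 64
y, x = np.mgrid[:n, :n]
r = np.hypot(x - n / 2, y - n / 2)
sky = np.exp(-((r - 10) ** 2) / 8.0)

# An interferometer samples the sky's Fourier transform ("visibilities")
# at scattered points; here we keep only ~1% of them at random.
vis = np.fft.fft2(sky)
mask = rng.random((n, n)) < 0.01
sparse_vis = np.where(mask, vis, 0)

# The naive inverse transform of the sparse data is a blurry, ringing mess
# (radio astronomers call this the "dirty image").
dirty = np.real(np.fft.ifft2(sparse_vis))

print(f"Fourier samples kept: {mask.sum()} of {n * n}")
print(f"Correlation with the true sky: "
      f"{np.corrcoef(sky.ravel(), dirty.ravel())[0, 1]:.2f}")
```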

With great difficulty and a brand-new imaging algorithm developed by two brilliant computer scientists who specialize in computational imaging and handling big data: Katie Bouman and Andrew Chael, each working on different parts of the code. Andrew focused on the overarching software framework, while Katie focused on the imaging algorithm itself. Together with dozens of collaborators over several years, they pieced together an artificial intelligence that could sort the black hole signal out of all the background data, whittling petabytes of raw recordings down to what actually mattered.
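To give a flavour of the kind of reconstruction involved, here is a bare-bones sketch that continues the toy setup above: it fits an image to the sparse Fourier measurements by gradient descent, with a crude smoothness prior and a non-negativity constraint. This is only a stand-in for the general idea of regularized imaging; the actual CHIRP and eht-imaging pipelines are far more sophisticated:

```python
import numpy as np

def reconstruct(sparse_vis, mask, n_iter=500, step=0.5, smooth=0.05):
    """Recover an image from sparsely sampled Fourier data by minimizing
    (data mismatch) + smooth * (roughness), with non-negative pixels."""
    img = np.zeros(sparse_vis.shape, dtype=float)
    for _ in range(n_iter):
        # Data term: difference between the image's Fourier transform and
        # the measured visibilities, counted only where we have measurements.
        resid = (np.fft.fft2(img) - sparse_vis) * mask
        grad_data = np.real(np.fft.ifft2(resid))
        # Prior term: penalize each pixel's deviation from its neighbours.
        neighbours = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                      + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4
        grad_prior = img - neighbours
        img -= step * (grad_data + smooth * grad_prior)
        img = np.clip(img, 0, None)   # the sky can't have negative brightness
    return img

# Usage with the toy data from the previous sketch:
# recon = reconstruct(sparse_vis, mask)
```

The prior is doing what the missing measurements cannot: of all the images consistent with the sparse data, it steers the fit toward a smooth, non-negative one.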

The data believed to be relevant was then sent to four different teams, each working independently of the others to create an image from it. If all of them came to a similar result, they would know the algorithm worked. And they did. The image you see is not one team’s output, but the one all four teams converged on.
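As a rough illustration of how you might quantify that agreement (the collaboration used more rigorous comparisons than this), a normalized cross-correlation between two reconstructions gives a single similarity score; team_a_img and team_b_img below are hypothetical placeholders:

```python
import numpy as np

def image_agreement(img_a, img_b):
    """Normalized cross-correlation: ~1.0 means the two images share the
    same structure, ~0 means they are unrelated."""
    a = (img_a - img_a.mean()) / img_a.std()
    b = (img_b - img_b.mean()) / img_b.std()
    return float(np.mean(a * b))

# Hypothetical usage: compare two teams' independent reconstructions.
# score = image_agreement(team_a_img, team_b_img)
```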

The landmark implication of this is not just that we can take pictures of black holes in faraway galaxies, but that we can now process images in a completely different way. The techniques are already being applied to things like MRI scans, to help detect cancers and other illnesses, and to underwater imaging, to map our ocean floors more accurately. And of course, it will be an invaluable tool for 3D scanning, where background noise has always been the number one culprit behind bad scans.

But yeah, we can also see black holes in faraway galaxies, which is super cool and something I never expected to see in my lifetime.
