A quick Google image search for “black holes” will have anyone oohing and aahing at the spectacular illustrations that come up. Though these images are just pixels on a screen, they give us a glimpse of the scale of these cosmic giants.
Sadly, these pictures are just that—nothing more than illustrations. They represent only what researchers have assumed black holes would look like, because actually capturing an image of one on a camera is quite a daunting challenge.
Lead researcher and MIT grad student Katie Bouman explained that black holes are extremely compact and very far away. Taking a picture of the black hole at the center of the Milky Way galaxy, she added, is “equivalent to taking an image of a grapefruit on the Moon, but with a radio telescope.”
Doing that directly would entail a telescope with a diameter of at least 10,000 km; for comparison, the Earth’s diameter is scarcely 13,000 km.
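That figure follows from the standard diffraction limit, which ties a telescope's resolving power to its size and observing wavelength. Here is a back-of-envelope check; the grapefruit size and the millimeter observing wavelength are assumed values for illustration, not numbers from the article:

```python
# Back-of-envelope check using the Rayleigh diffraction limit:
#   theta ~ 1.22 * wavelength / aperture_diameter
# The grapefruit size (12 cm) and wavelength (1.3 mm, typical for
# millimeter-wave radio astronomy) are assumptions, not article figures.

GRAPEFRUIT_M = 0.12      # grapefruit diameter, meters (assumed)
MOON_DIST_M = 3.844e8    # mean Earth-Moon distance, meters
WAVELENGTH_M = 1.3e-3    # observing wavelength, meters (assumed)

# Angular size of the target, in radians
theta = GRAPEFRUIT_M / MOON_DIST_M

# Aperture needed to resolve that angle at this wavelength
diameter_m = 1.22 * WAVELENGTH_M / theta

print(f"target angular size: {theta:.2e} rad")
print(f"required aperture:   {diameter_m / 1000:.0f} km")
```

With these assumed inputs the answer comes out at roughly 5,000 km, the same order of magnitude as the 10,000 km figure above; the exact number shifts with the wavelength and target size one plugs in.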
Now that doesn’t sound very practical, does it?
But Bouman and her team from MIT and Harvard came up with a solution that uses an algorithm to stitch together data acquired by radio telescopes positioned all over the Earth to create an image of a black hole—a project called the Event Horizon Telescope.
As black holes do not emit visible light (at least not directly), the team relied on radio signals to get an idea of how a black hole looks. “Just like how radio frequencies will go through walls, they pierce through galactic dust. We would never be able to see into the center of our galaxy in visible wavelengths because there’s too much stuff in between,” says Bouman.
Larry Hardesty of MIT explains that radio telescopes also require large antenna dishes. To get around this, Bouman and her team decided to turn the Earth itself into one giant dish by enlisting as many radio telescopes as possible, a technique called interferometry that combines data from observatories around the world. So far, six observatories have signed on to the Event Horizon Telescope project.
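The key idea behind interferometry is that each pair of telescopes measures one spatial-frequency component (a “visibility”) of the sky image, so a handful of observatories samples the Fourier plane only sparsely. The following toy numpy sketch (my own illustration, not the EHT pipeline) shows what inverting such sparse data looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sky": a bright disk on a 32x32 grid standing in for the source.
n = 32
yy, xx = np.mgrid[:n, :n]
sky = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < 6 ** 2).astype(float)

# Each telescope pair samples one point in the Fourier (u,v) plane.
# Keep only a sparse random subset of frequencies to mimic a small
# number of baselines (the DC term is kept so total flux survives).
mask = rng.random((n, n)) < 0.1
mask[0, 0] = True
visibilities = np.fft.fft2(sky) * mask

# Inverting the sparsely sampled data yields the artifact-ridden
# "dirty image" that a reconstruction algorithm must then clean up.
dirty = np.real(np.fft.ifft2(visibilities))

print(f"Fourier-plane coverage: {mask.mean():.0%}")
```

Even with only ~10% of the Fourier plane covered, the dirty image still correlates with the true sky, which is what makes reconstruction possible at all.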
The plan is to have all these telescopes zero in on the black hole at the center of the Milky Way—a 4-million-solar-mass behemoth called Sagittarius A* (pronounced “A-star”). Data collected by the telescopes will be processed using the Continuous High-resolution Image Reconstruction using Patch priors algorithm, or CHIRP. CHIRP can also intelligently “fill in” the parts of the image that the telescopes did not capture.
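That “filling in” works by combining the measured data with prior knowledge of what plausible images look like. As a much simpler stand-in for CHIRP's learned patch priors, the sketch below reconstructs a 1D signal from a few low-frequency Fourier measurements using a plain smoothness prior; the setup and numbers are illustrative assumptions, not CHIRP itself:

```python
import numpy as np

# When only some Fourier components of a signal are measured, a prior
# (here simple smoothness, NOT CHIRP's patch priors) fills in the rest.

n = 32
t = np.arange(n)
truth = np.exp(-0.5 * ((t - 16) / 4.0) ** 2)   # smooth toy "source"

F = np.fft.fft(np.eye(n))          # DFT matrix
measured = np.arange(8)            # only the 8 lowest frequencies observed
y = F[measured] @ truth

# Solve  min ||F_m x - y||^2 + lam * ||D x||^2  with D = first differences,
# stacked as one real-valued least-squares problem.
D = np.diff(np.eye(n), axis=0)
lam = 0.1
A = np.vstack([F[measured].real, F[measured].imag, np.sqrt(lam) * D])
b = np.concatenate([y.real, y.imag, np.zeros(n - 1)])
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

rel_err = np.linalg.norm(x_hat - truth) / np.linalg.norm(truth)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The smoothness penalty pushes the unmeasured frequencies toward values consistent with a smooth image, recovering the signal well despite three quarters of its Fourier components never being observed. CHIRP replaces this hand-picked smoothness term with priors learned from patches of natural and astronomical images.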
The MIT and Harvard teams plan to present their project on June 27 during the Conference on Computer Vision and Pattern Recognition in Las Vegas.
If the approach holds up, direct images of black holes may be just a snapshot away!