Big Universe, Big Data

Ross Andersen has a fascinating interview with JWST scientist Alberto Conti about the orders-of-magnitude increase in the amount of astronomical data being gathered these days:

There are two issues driving the current data challenges facing astronomy. First, we are in a vastly different data regime in astronomy than we were even ten or fifteen years ago. Over the past 25 to 30 years, we have been able to build telescopes that are 30 times larger than what we used to be able to build, and at the same time our detectors are 3,000 times more powerful in terms of pixels. The explosion in sensitivity you see in these detectors is a product of Moore’s Law—they can collect up to a hundred times more data than was possible even just a few years ago. This exponential increase means that the collective data of astronomy doubles every year or so, and that can be very tough to capture and analyze.
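To get a feel for what "doubles every year or so" means, here's a quick back-of-the-envelope sketch (the starting archive size of 1 TB is a made-up number purely for illustration):

```python
# Back-of-the-envelope: if the collective data of astronomy doubles
# every year, how quickly does it outgrow its starting point?
start_tb = 1.0  # hypothetical starting archive size, in terabytes

for years in (1, 5, 10, 25):
    size_tb = start_tb * 2 ** years
    print(f"after {years:2d} years: {size_tb:,.0f} TB ({2 ** years:,}x the original)")
```

Ten doublings is already a factor of about a thousand, and twenty-five (the span Conti mentions for telescope building) is over 33 million, which is why yearly doubling is so hard to keep up with.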

How Big Data Is Changing Astronomy (Again) [theatlantic]
Related: posts on the Palomar Observatory Sky Survey, an early decades-long attempt to photograph the universe.