Question about “magnitude”

There is something that has always bugged me about the evolution of the measurement of the brightness of a star, also known as its apparent magnitude. I understand, in principle, the notion of both apparent and absolute magnitude. What troubles me is the evolution of the idea. As I understand it, Hipparchus was the first to attempt to catalog stars by their relative brightness. He looked for the twenty brightest stars and called them “first magnitude”; then he took another grouping of dimmer stars, which became “second magnitude”, and so on until the dimmest stars, just barely visible, were cataloged as “sixth magnitude”.

In the mid-1800s, Norman Robert Pogson made this quantifiable by proposing that a difference of five magnitudes correspond to a brightness ratio of exactly 100 — roughly the ratio he found between the average first magnitude star and the average sixth magnitude star. This means the ratio for 1 magnitude of brightness is 2.512 (the fifth root of 100), so a magnitude 1 star is 2.512 × 2.512 ≈ 6.3 times brighter than a magnitude 3 star.
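To make the arithmetic concrete, here is a small sketch of Pogson’s relation (the function and variable names are my own, not standard): five magnitudes equal a flux ratio of 100, so one magnitude equals 100^(1/5) ≈ 2.512, and a magnitude difference converts to a brightness ratio and back via a base-10 logarithm.

```python
import math

# Pogson's ratio: 5 magnitudes correspond to a flux ratio of 100,
# so 1 magnitude corresponds to 100 ** (1/5) ≈ 2.512.
POGSON_RATIO = 100 ** (1 / 5)

def flux_ratio(delta_mag):
    """Brightness (flux) ratio corresponding to a magnitude difference."""
    return POGSON_RATIO ** delta_mag

def magnitude_difference(ratio):
    """Magnitude difference corresponding to a brightness ratio: dm = 2.5 * log10(ratio)."""
    return 2.5 * math.log10(ratio)

print(round(flux_ratio(2), 2))           # mag 1 vs mag 3 star → 6.31
print(round(magnitude_difference(100)))  # 100x brighter → 5 magnitudes
```

Note that the scale runs backwards: larger magnitudes mean dimmer stars, which is why a *positive* magnitude difference corresponds to the brighter star having the *smaller* number.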

So my question: how does one measure the brightness of a star in order to put it into a given magnitude? When I look at the sky, sometimes the difference in brightness is obvious, but other times it isn’t. I can see doing it the way Hipparchus did, by grouping, but in the mid-1850s, how did Pogson do it? What was the unit of brightness (lumens?)? How is brightness measured today? In other words, is there a range of brightness that qualifies as first magnitude?