Here is a panoramic photo by 6k, of a striking local (to him) scene. Panoramic presumably means that he photoed a big spread of photos and then some cunning computer programme stitched them together into what you now see:
Like 6k says, wow. That’s my 1000 thingies across version, but the original is massively bigger. From it I picked out these very distant mountains, and even they had to be shrunk to fit here properly:
Thereby making the already horizontalised even more horizontalised. And in this case I’m horizontalising with an actual horizon.
I assume that these very distant mountains are blue for the same sort of reason that the sky is blue, which is that between us and them there is lots of air for blue light to wander into the picture, because blue light does that more than other sorts of light do. It must also help that the foreground of the picture is full of yellow and orange, like one of those photos of an artificially lit indoor scene at night, which turns the grey outdoors that you see through the window into bright blue, which it really isn't when you look at it yourself.
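If you like your explanations in code rather than words, here is a toy sketch of that idea, nothing more. It is not physics done properly; the sky colour, the "falloff" distance and the rock colour are all invented, and the only point is that the further away the surface is, the more its colour gets dragged towards the blue of the air in between:

```python
import math

def hazed(colour, distance_km, sky=(170, 200, 255), falloff_km=40.0):
    """Blend a surface colour towards a pale sky blue as distance grows.

    All the numbers are made up for illustration; the shape of the idea
    (more distance, more blue) is the only thing being demonstrated.
    """
    t = 1.0 - math.exp(-distance_km / falloff_km)  # 0 when near, creeps towards 1 when far
    return tuple(round(c * (1 - t) + s * t) for c, s in zip(colour, sky))

rock = (120, 100, 80)  # an invented brownish mountain colour
for d in (1, 10, 50, 100):
    print(f"{d:>3} km away:", hazed(rock, d))
```

Run it and the brown drifts steadily towards blue as the kilometres pile up, which is roughly what the camera is reporting about those mountains.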
I sense also that this illusion is relevant. It shows how our eyes adjust, in a way we just can't stop ourselves doing, when scanning the same thing in a setting that changes. Cameras don't do that. It takes software to do that.
So, we don’t see those mountains as blue when we home in on them, but when a camera doesn’t home in on them either, but is instead being very hi-res, and we merely crop the distant mountains out of the big picture, they’re blue.
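For anyone who wants to do the same cropping-and-shrinking trick to a big panorama of their own, it is only a few lines with Python's Pillow library. A sketch only: the file name and the pixel coordinates below are entirely made up, not anything to do with 6k's actual photo:

```python
from PIL import Image

# Placeholder file name and box; substitute your own panorama and coordinates.
pano = Image.open("panorama.jpg")                   # the full-size stitched photo
left, top, right, bottom = 4200, 900, 6200, 1300    # invented pixel box around the far mountains
mountains = pano.crop((left, top, right, bottom))   # pull out just the distant strip
mountains.thumbnail((1000, 1000))                   # shrink it to about a thousand thingies across
mountains.save("distant_mountains.jpg")
```

6k emailed me about the blueness, thus: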
Hi Brian
I was also wondering why those mountains were blue. I’m guessing that it’s more to do with my lens and sensor being unable to deal with the massive amount of light (clear skies, reflective top surface of clouds). But I don’t know.
For the record, the larger peaks of your horizontalised mountains are (from right to left) Devil’s Peak, Table Mountain and the Constantiaberg, one of which you’ll almost certainly have heard of, but you’re looking at the side of it here, rather than its iconic face. 🙂