Similarly, the Veo example of the northern lights is a really interesting one. That's not what the northern lights look like to the naked eye - they're actually pretty grey. The really bright greens and even the reds only come out when you take a photo of them with a camera. Of course the model couldn't know that because, well, it only gets trained on photos. Gets really existential - simulacra energy - maybe another good AI Turing test, for now.
Human eyes are basically black and white in low light since rod cells can't detect color. But when the northern lights are bright enough you can definitely see the colors.
The fact that some things are too dark to be seen by humans but can be captured accurately with cameras doesn't mean that the camera, or the AI, is "making things up" or whatever.
Finally, nobody wants to see a video or a photo of a dark, gray, and barely visible aurora.
Living in northern Sweden I see the northern lights multiple times a year. I have never seen them pale or otherwise not colorful. Greens and reds, always. That is to my naked eye. Photographs do look more saturated, but the difference isn't as large as this comment thread makes it out to be.
Even in Upper Michigan near Lake Superior we sometimes had stunning, colorful Northern Lights. Sometimes it seemed like they were flying overhead, within your grasp.
I'm in Australia where the southern lights are known to be not as intense as northern lights. That's where my remark comes from. Those who have never seen the aurora with their own eyes may like to see an accurate photo. A rare find among the collective celebration of saturation.
Exactly. I went through major self-gaslighting trying to see the aurora. I just wasn't sure whether I was actually seeing it, because it always looked so different from the photos. It is absolutely maddening trying to find a realistic photo of what it looks like to the naked eye, so that you can know whether what you are seeing is actually the aurora and not just clouds.
Priming the opsins in your retina is a continuous process, and primed opsins are depleted rapidly by light. Fully adapting your eye to darkness takes a great deal of darkness and a great deal of time - on the order of an hour should set you up.
Most human beings in arctic regions live in places and engage in lifestyles where it's impossible to even come close to attaining the full light sensitivity of the human retina in perfect darkness. The sky never gets dark enough in a city or even a small town to get the full experience, and if you saw your smart watch five minutes ago you still haven't fully recovered your night vision. Even a sliver of moon makes remote dark-sky sites dramatically brighter.
Everybody is going to have different degrees of the experience because they'll have eyes with different degrees of dark adaptation. And their brains are going to shift around the ~10^3x dynamic range of the eye up or down the light intensity scale by a factor ~10^6, without making it obvious to them.
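To make that concrete, here's a toy sketch of the idea (the numbers are the rough order-of-magnitude figures from above, and `visible` is a hypothetical helper, not a real photometric model):

```python
import math

# Rough figures from the comment above: the eye resolves roughly
# 10^3:1 of contrast at any one adaptation state, and adaptation
# slides that window across roughly 10^6x of absolute light levels.
INSTANT_RANGE_DECADES = 3   # ~10^3 simultaneous contrast
ADAPT_SHIFT_DECADES = 6     # ~10^6 adaptation shift

def visible(luminance, adaptation_level):
    """Is a luminance (illustrative units) inside the eye's current
    ~3-decade window, centered on its adaptation level?"""
    half_window = 10 ** (INSTANT_RANGE_DECADES / 2)
    lo = adaptation_level / half_window
    hi = adaptation_level * half_window
    return lo <= luminance <= hi

# A faint aurora (0.01 units, illustrative) falls below the window of
# a streetlight-adapted eye but inside a fully dark-adapted one:
assert not visible(0.01, adaptation_level=10.0)   # city-adapted
assert visible(0.01, adaptation_level=0.03)       # dark-adapted
```

Same sky, wildly different percepts, and the brain hides the window-sliding from you.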
There's a middle ground here. I saw the northern lights with my own eyes just days ago and it was mostly grey. I saw some color. But when I took a photo with a phone camera, the color absolutely popped. So it may be that you've seen more color than any photo, but the average viewer in Seattle this past weekend saw grey-er with their eyes and huge color in their phone photos.
(Edit: it was still super-cool even if grey-ish, and there were absolutely beautiful colors in there if you could find your way out of the direct city lights)
The hubris of suggesting that your single experience of vaguely seeing the northern lights one time in Seattle has now led to a deep understanding of their true "color" and that the other person (perhaps all other people?) must be fooling themselves is... part of what makes HN so delightful to read.
I've also seen the northern lights with my own eyes, way up in the arctic circle in Sweden. Their color changes along with activity. Grey looking sometimes? Sure. But also colors so vivid that it feels like they envelop your body.
> The hubris of suggesting that your single experience of vaguely seeing the northern lights one time in Seattle has now led to a deep understanding of their true "color" and that the other person (perhaps all other people?) must be fooling themselves is... part of what makes HN so delightful to read.
The person they were responding to was saying that the people reporting grays were wrong, and that they had seen it and it was colorful. If anything, you should be accusing that person of hubris, not GP. All GP's point was is that it can differ in different situations. They used the example of Seattle to show that the person they were responding to is not correct that it is never gray and dull.
The human retina effectively combines a color sensor with a monochrome sensor. The monochrome channel is more light-sensitive. When the lights are dim, we'll dilate our pupils, but there's only so much we can do to increase exposure. So in dim light we see mostly in grayscale, even if that light is strongly colored in spectral terms.
Phone cameras have a Bayer filter which means they only have RGB color-sensing. The Bayer filter cuts out some incoming light and dims the received image, compared with what a monochrome camera would see. But that's how you get color photos.
To compensate for a lack of light, the phone boosts the gain and exposure time until it gets enough signal to make an image. When it eventually does get an image, it's getting a color image. This comes at the cost of some noise and motion-blur, but it's that or no image at all.
If phone cameras had a mix of RGB and monochrome sensors like the human eye does, low-light aurora photos might end up closer to matching our own perception.
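Here's a toy model of the gain-vs-exposure tradeoff described above (the photon counts and `snr` function are illustrative assumptions, not a real sensor model):

```python
import math

def snr(photon_rate, exposure_s, gain, read_noise=2.0):
    """Toy signal-to-noise ratio for one pixel.

    photon_rate: photons/second reaching the pixel (scene brightness)
    exposure_s:  shutter time; more photons, and shot noise only grows
                 as sqrt(signal), so longer exposure improves SNR
    gain:        analog/digital gain; scales signal AND noise equally,
                 so it brightens the image without improving SNR
    """
    signal_e = photon_rate * exposure_s        # collected photoelectrons
    shot_noise = math.sqrt(signal_e)           # Poisson shot noise
    total_noise = math.sqrt(shot_noise**2 + read_noise**2)
    return (gain * signal_e) / (gain * total_noise)

dim_aurora = 50.0  # photons/second, illustrative

snr_snap = snr(dim_aurora, exposure_s=0.03, gain=8.0)  # quick handheld shot
snr_night = snr(dim_aurora, exposure_s=1.0, gain=1.0)  # night-mode exposure
```

With these toy numbers, the night-mode exposure gets roughly ten times the SNR of the gained-up snapshot, which is why night modes lean on long exposure (and stacking) rather than gain alone - and why the resulting color can look so much stronger than what your rods-dominated eye reported.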
I can see what you mean, and the video is somewhat not what it would be like in real life. I have lived in northern Norway most of my life, and watched auroras a lot. They certainly look green and pink most of the time. Fainter, they would perhaps look gray, I guess? Red, when viewed from a more southern viewpoint.
I work at Andøya Space, where perhaps most of the space research on aurora has been done by sending scientific rockets into space for the last 60 years.
That's not true. They look grey when they aren't bright enough, but they can look green or red to the naked eye if they are bright. I have seen it myself, and yes, I was disappointed to see only grey ones last week.
> [Aurora] only appear to us in shades of gray because the light is too faint to be sensed by our color-detecting cone cells.
> Thus, the human eye primarily views the Northern Lights in faint colors and shades of gray and white. DSLR camera sensors don't have that limitation. Couple that fact with the long exposure times and high ISO settings of modern cameras and it becomes clear that the camera sensor has a much higher dynamic range of vision in the dark than people do.
The brightest ones I saw in Northern Canada I even saw hints of reds - but no real greens - until I looked at it through my phone, and it looked just like the simulated video.
If I looked up and saw them the way they appear in the simulation, in real life, I'd run for a pair of leaded undies.
That is totally incorrect, as anyone who has seen real northern lights can attest. I'm sorry that you haven't gotten the chance to experience it and now think all northern lights are that lackluster.
Green is the most common color; reds and blues occur in higher-energy solar storms.
And yes, they can be as green to the naked eye as in that AI video. I've seen aurora shows that fill the entire night sky from horizon to horizon, way more impressive than that AI video, with my own eyes.
This is such an arrogant pile of bullshit. I’ve seen very obvious colors on many different occasions in the northern part of the lower 48, up in southern Canada, and in Alaska.
To be fair, the prompt isn't asking for a realistic interpretation; it's asking for a timelapse. What it generated is absolutely what most timelapses look like.
> Prompt: Timelapse of the northern lights dancing across the Arctic sky, stars twinkling, snow-covered landscape
That doesn't seem in any way useful, though... To use a very blunt analogy, are color blind people intelligent/sentient/whatever? Obviously, yes: differences in perceptual apparatus aren't useful indicators of intelligence.
To add a bit of color (ha) I was with my color-sighted spouse at a spot well known for panoramic views. 50ish people there. Many conversations happening around me.
“I can’t see anything”
“Maybe that’s something over there?”
“What’s everyone looking at?”
Someone shows their phone.
“Ooh!” “How do you turn on night mode?” “Wow it’s so much clearer on the phone!”
So while I can't know what their eyes saw or what they really thought, I could hear what came out of their mouths.
I don’t think this is an instance that warrants deep philosophical skepticism about the nature of truth or the impossibility of knowledge.
For decades, game engines have been working on realistic rendering, bumping quality here and there.
The gold standard for rendering has always been cameras - it's always photo-realistic rendering. Maybe this won't be true for VR, but so far most effort goes into being as good as video, not as good as the human eye.
Any sort of video generation AI is likely to have the same goal. Be as good as top notch cameras, not as eyes.
What struck me about the northern lights video was that it showed the Milky Way crossing the sky behind the northern lights. That bright part of the Milky Way is visible in the southern sky, but the aurora hugging the horizon like that indicates the viewer is looking north. (Swap directions for the southern hemisphere and the aurora australis.)
that's a bad example, since nearly all existing images of the aurora borealis are brightly colored. What I expect of an image generator is to output what is expected of it.