Good point! Here's why they don't look white in the Sandbox:
The brightness of an object in an image is determined by the amount of light it emits (or reflects) as well as by the camera settings (aperture, exposure time, sensitivity of the medium, etc.). When you take a picture, you set everything up to make the result fall into a useful range. In the image you posted, the goal was to pick out the details of the ISS and the glory of the sun's light at the same time.
Notice the absence of other stars in the background of the image? They should be visible; there's no atmosphere to hide them. But at the exposure necessary to see the ISS they are invisible, because their comparatively low light emission doesn't get enough time to make it into the image. If you were to expose the image long enough to capture the background stars, the ISS and our sun would look like a featureless white blob of sheer inferno.
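To make that trade-off concrete, here's a toy camera model (the luminance and exposure numbers are made-up illustrative values, not real photometry): recorded light grows with exposure time until the sensor clips at pure white.

```python
def exposed_value(luminance, exposure_time):
    """Toy linear camera model: the recorded pixel value is
    luminance * exposure_time, clipped to the sensor's maximum (1.0)."""
    return min(luminance * exposure_time, 1.0)

# Made-up relative luminances: the sunlit ISS vs. a faint background star.
iss, star = 50.0, 0.001

short, long_exp = 0.015, 500.0  # exposure tuned for the ISS vs. for the star

print(exposed_value(iss, short))      # ISS nicely in range (0.75)
print(exposed_value(star, short))     # star far too dim to register (1.5e-05)
print(exposed_value(iss, long_exp))   # ISS clipped to a white blob (1.0)
print(exposed_value(star, long_exp))  # star now visible (0.5)
```

No single exposure puts both objects into a useful range: whatever you choose, either the star vanishes or the ISS saturates.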
To be able to pick out the surface detail of the sun (and other bright objects) in Universe Sandbox², we deliberately set our "virtual exposure" quite low. In many cases it's more important to us to show the nature of things in a detailed fashion than to depict them as you would see them with the naked eye. If we aimed for naked-eye realism, we would:
- need a monitor / display device that can be at least as bright as the sun
- be unable to show anything else on screen as the sun would just white-out everything
- have to start writing the manual in Braille (you'd be blind pretty soon).
There are ways to handle exposure dynamically based on image content (HDR rendering with dynamic exposure). Those algorithms mimic the adaptation process of the human eye: they put all objects in the scene into a useful range by adjusting exposure like a photographer would. But even those wouldn't help us much: the sun would still be featureless and white, and everything else in the scene would have to be rendered a lot darker.
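For the curious, here's a minimal sketch of that idea in the style of Reinhard tone mapping. This illustrates the general technique only, not the Sandbox's actual rendering code, and the scene luminances are made up:

```python
import math

def auto_expose(luminances, key=0.18):
    """Pick an exposure like a photographer would: scale the scene so its
    log-average luminance lands on 'middle gray' (the key), then compress
    highlights with a simple Reinhard-style curve, x / (1 + x), so bright
    values approach but never reach 1.0."""
    log_avg = math.exp(sum(math.log(1e-9 + lum) for lum in luminances) / len(luminances))
    scale = key / log_avg
    return [(scale * lum) / (1.0 + scale * lum) for lum in luminances]

# Made-up scene: the sun, a faint background star, and a mid-brightness planet.
sun, star, planet = auto_expose([50.0, 0.001, 0.5])
```

Even with auto-exposure, the sun's value hugs 1.0 (featureless white) while the star stays near black, which is exactly the problem described above.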
Seems that our eyes just aren't the best tools for space exploration.
Luckily, we can compensate with filters, like the ones used for watching a solar eclipse, which cut the incoming light before it saturates the cones in our retinas (and ultimately burns them).
Which is what we're doing with stars. To show you the shifting patterns of the granules on their surface. To let you see flares, impacts, differences between star classes.
If you'd like an even more in-depth explanation, visit
http://www.vendian.org/mncharity/dir3/starcolor/
And stay tuned for more star display modes to come; ultimately we want to show what stars look like in different spectra, and add some more eye candy too.
- Georg
P.S.: It's impossible to meaningfully compare the colors of digital images with someone else as long as your monitor and theirs aren't calibrated to the same specs! The sun in the image posted above may look yellowish to one user and bluish to another, and both may have different impressions of contrast and brightness... so expect, and accept, disagreement about colors.