24 vs 32bit color depth

jamesvalue

New Member
#1
Hi!


I think I understand the difference in bit depth between an RGB image (24-bit color depth, about 16.7M colors) and an RGBA image (24-bit color depth plus an 8-bit alpha channel).

What I'd like to know is whether 24-bit and 32-bit work the same way in the graphics card/PC monitor chain.

Practically, my question is: on a PC monitor, does 24-bit color depth produce the same number of colors as 32-bit color depth or not?

And if so, what is the difference?

Bye, Luca
 
MuFu

Moderator
#2
Welcome to HWC. :)

The accepted terminology seems to be "colour mode". For example, you have a "32-bit colour mode" or "32-bit display mode"... this can consist of 10 bits per colour component (10R + 10G + 10B) and a 2-bit alpha channel. In fact, this is what Matrox do with the Parhelia. The same 32-bit mode could also consist of 8 bits per component and an 8-bit alpha channel. Alpha channel information is never sent to a monitor - it is always combined with the colour components.
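To make that concrete, here's a rough C sketch of how one 32-bit word can be split either way. The field order is purely an assumption for illustration - real framebuffer formats (BGRA and so on) may arrange the bits differently.

/* Rough sketch: two possible layouts of a 32-bit pixel.
 * Field order is an assumption for illustration only;
 * real framebuffer formats (e.g. BGRA) may differ. */
#include <stdint.h>
#include <stdio.h>

/* 8:8:8:8 - 8 bits per colour component plus an 8-bit alpha channel. */
static uint32_t pack_8888(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

/* 10:10:10:2 - 10 bits per colour component plus a 2-bit alpha channel. */
static uint32_t pack_1010102(uint16_t r, uint16_t g, uint16_t b, uint8_t a)
{
    return ((uint32_t)(a & 0x3)   << 30) |
           ((uint32_t)(r & 0x3FF) << 20) |
           ((uint32_t)(g & 0x3FF) << 10) |
            (uint32_t)(b & 0x3FF);
}

int main(void)
{
    /* Both layouts occupy 32 bits, but the precision differs:
     *   8:8:8:8    -> 2^24 ~= 16.7 million colours, 256 alpha levels
     *   10:10:10:2 -> 2^30 ~= 1.07 billion colours,   4 alpha levels
     * Either way, only the colour components ever reach the monitor. */
    printf("8:8:8:8    mid-grey: 0x%08X\n", (unsigned)pack_8888(128, 128, 128, 255));
    printf("10:10:10:2 mid-grey: 0x%08X\n", (unsigned)pack_1010102(512, 512, 512, 3));
    return 0;
}

The point is just that "32-bit" describes the total per-pixel budget; how it is divided between colour and alpha is up to the mode.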

There's a pretty good write-up on it here if you're interested.

MuFu.
 
jamesvalue

New Member
#4
Originally posted by MuFu

The accepted terminology seems to be "colour mode". For example, you have a "32-bit colour mode" or "32-bit display mode"... this can consist of 10 bits per colour component (10R + 10G + 10B) and a 2-bit alpha channel. In fact, this is what Matrox do with the Parhelia. The same 32-bit mode could also consist of 8 bits per component and an 8-bit alpha channel. Alpha channel information is never sent to a monitor - it is always combined with the colour components.

There's a pretty good write up on it here if you're interested.

MuFu.
Thanks again!

The URL you posted was very helpful!

Bye, Luca (I'm learning a lot of new stuff about video cards!)


:)
 
