I'm trying to understand the difference in bit depth
between an RGB image (24-bit color depth, about 16.7M
colors) and an RGBA image (24-bit color depth plus an
8-bit alpha channel).

I'd like to know whether 24-bit and 32-bit modes work
the same way in the graphics card / PC monitor pipeline.

Practically, my question is: on a PC monitor, does 24-bit
color depth produce the same number of colors as 32-bit
color depth or not?

And if so, what is the difference?

Bye Luca