mil0sh Posted September 14, 2008

I have noticed a huge difference between my wallpaper on Windows and on OS X. I am using Diabolik's NVkush on 10.5.4, but I noticed it before on 10.5 - 10.5.2 with "NVinject go" installed. OS X is reporting 32-bit, but I can only select 24-bit color depth. Any ideas?
MacUser2525 Posted September 14, 2008

> I have noticed a huge difference between my wallpaper on Windows and on OS X. I am using Diabolik's NVkush on 10.5.4, but I noticed it before on 10.5 - 10.5.2 with "NVinject go" installed. OS X is reporting 32-bit, but I can only select 24-bit color depth. Any ideas?

Go through the calibration steps available under the Color option in Displays preferences. When you get to where you can select the gamma, choose the 2.2 Television option; this will make the display behave as if it were on a PC. All Macs use a different gamma setting than a PC, which can be hard to get used to, especially if dual-booting.
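The gamma difference described above can be sketched numerically: classic Mac OS calibrated displays to gamma 1.8, while PCs (and the 2.2 Television option) target gamma 2.2, so the same mid-gray pixel value renders noticeably brighter under Mac gamma. A minimal Python sketch; the `luminance` helper is illustrative, not any Apple API:

```python
# Sketch: how the same 8-bit pixel value maps to a different displayed
# luminance under classic Mac gamma (1.8) versus PC/TV gamma (2.2).
def luminance(value, gamma):
    """Normalized (0..1) luminance of an 8-bit channel value at a given display gamma."""
    return (value / 255.0) ** gamma

v = 128  # mid-gray
mac = luminance(v, 1.8)  # ~0.29 -> midtones look brighter
pc = luminance(v, 2.2)   # ~0.22 -> midtones look darker
print(f"mid-gray at gamma 1.8: {mac:.3f}, at gamma 2.2: {pc:.3f}")
```

The ~30% luminance gap at mid-gray is why the same wallpaper can look washed out (or too dark) when dual-booting between differently calibrated systems.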
mil0sh (author) Posted September 14, 2008

Gamma is OK, but as you can see, I can choose only 24-bit color ("Millions" in preferences). That is my problem. I tried to boot with mach_kernel -v "Graphics Mode"="1280x800x32@60", but with no success.
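As an aside, the same "Graphics Mode" flag can also be set persistently in the Darwin bootloader's plist instead of being typed at the boot prompt each time. A sketch, assuming the standard Darwin bootloader path and key names (worth verifying against your own installation):

```xml
<!-- /Library/Preferences/SystemConfiguration/com.apple.Boot.plist -->
<key>Graphics Mode</key>
<string>1280x800x32@60</string>
```

The value format is width x height x depth @ refresh; note this only requests a mode from the bootloader and cannot fix a depth limit imposed by the graphics driver itself.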
MacUser2525 Posted September 14, 2008

> Gamma is OK, but as you can see, I can choose only 24-bit color ("Millions" in preferences). That is my problem. I tried to boot with mach_kernel -v "Graphics Mode"="1280x800x32@60", but with no success.

Ah, I see you're hung up on the color myth. I think this quote from the second link sums it up best:

> Johno, Eldmannen is correct. I've been doing computer graphics for over 30 years, and I spent 5 years porting the X server to various machines. There are very few true 24-bit-pixel machines left in the world. Nowadays, 24-bit color means 24 bits of color, which is 8 bits each of red, green, and blue, packed on 32-bit boundaries with the 4th byte used for the alpha channel. The fact is that with modern video cards you can't really know anything about how the pixels are packed in words; all you can know is that there are 24 bits per pixel allocated to color. This idea reflects the difference between telling the truth in a specification and lying in an advertisement. Anyone who makes the mistake of believing anything they read in an ad pretty much gets what they deserve.

In other words, unlike Windows, a Mac (or, for that matter, Linux) does not lie to you for marketing purposes; it just uses the alpha channel, so you are already getting what you are supposed to.

http://en.wikipedia.org/wiki/Color_depth#32-bit_color
http://brainstorm.ubuntu.com/idea/4516/
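The packing described in that quote can be shown in a few lines: a "32-bit" pixel is just 24 bits of RGB plus an 8-bit alpha/padding byte, and the alpha byte contributes no extra colors. A minimal Python sketch (the helper names are illustrative):

```python
# Sketch: "32-bit color" = 24 bits of RGB packed on a 32-bit boundary,
# with the 4th byte used for alpha. The color information is identical
# to 24-bit; only the alpha/padding byte differs.
def pack_argb(a, r, g, b):
    """Pack four 8-bit channels into one 32-bit ARGB pixel."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb(pixel):
    """Return (alpha, red, green, blue) from a 32-bit ARGB pixel."""
    return ((pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF,
            (pixel >> 8) & 0xFF, pixel & 0xFF)

opaque = pack_argb(0xFF, 0x12, 0x34, 0x56)
transparent = pack_argb(0x00, 0x12, 0x34, 0x56)

# Masking off the alpha byte leaves the same 24 bits of color either way:
assert (opaque & 0xFFFFFF) == (transparent & 0xFFFFFF) == 0x123456
print(hex(opaque))  # 0xff123456
```

So "Millions" (24-bit) in the Displays preferences and "32-bit" in Windows describe the same 2^24 colors; the extra byte is alpha or padding, not additional color resolution.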
Headrush69 Posted September 14, 2008

MacUser2525 is right. On Windows, only 24 bits are used for the actual colour as well. The huge difference you describe in the picture sounds typical of different gamma values.
mil0sh (author) Posted September 14, 2008

Sorry, but that isn't it. The difference is in the gradient from white to black. It's not as drastic as here.
Slither2008 Posted September 15, 2008

> Sorry, but that isn't it. The difference is in the gradient from white to black. It's not as drastic as here.

Hi mil0sh,

I had the same issue, which I fixed by using NVidiaEFI.kext, which is modified to rectify that problem. From what I've read, basically a laptop's display is detected as a DVI monitor, and that is where hackintosh 10.5.x has trouble setting the bit depth. It was discovered that when an analog VGA monitor was connected, both the internal laptop display and the VGA analog display would register their proper 32-bit colour depth. One of the things NVidiaEFI.kext does is emulate an attached analog device, so your laptop display reports 32 bits regardless of whether a physical analog VGA monitor is attached.

The downside is that on boot it registers 2 monitors if I use Exposé, i.e. 8 instead of 4 screens. A quick fix is to just detect the displays again in the Displays preferences.

I have noticed that if you have both NVinject.kext v0.2.1 and NVidiaEFI.kext, dual-monitor support only works when the device is plugged in during boot. If you try to detect displays, it'll most likely {censored} itself with a blue screen, like it does on mine. Try using NVidiaEFI.kext on its own, and run a GeekBench benchmark or similar to see which driver gives the better graphical performance.

Here's the version I'm using: http://forum.insanelymac.com/index.php?sho...630&st=131#

Good luck!

Regards,
Slither2008.