
Unsolved


1 Rookie

 • 

2 Posts


May 11th, 2025 10:26

U3223QE, is the panel native 10-bit?

Does this panel utilize FRC?

ChatGPT:

🎨 Dell UltraSharp U3223QE
Native Color Bit Depth: 10-bit (8-bit + FRC)
FRC (Frame Rate Control): Yes
This information is corroborated by multiple sources, including the official Dell specifications and third-party specifications websites.


Out of ChatGPT's sources, the only one that actually states this is:
source: displayspecifications.com/en/model/7e822a35
and: forums.tomshardware.com/threads/color-depth-and-dell-u3219q-vs-benq-pd3220u.3608820/?utm_source=chatgpt.com  (but this doesn't even reference the models requested)

Thanks to anyone who can help figure this out.
It seems Dell hasn't stated anywhere whether FRC is used in this panel, which I'd rather know.

Community Manager

 • 

56.8K Posts

May 13th, 2025 15:57

Dell has never listed 10-bit or 8-bit + FRC (Frame Rate Control) in sales specifications or the monitor User's Guide.

Dell has only shown Color Depth as:


6 bit = 262 thousand colors
8 bit = 16.7 million colors
10 bit = 1.07 billion colors

The U3223QE User's Guide only states Color Depth = 1.07 billion colors
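For reference, those figures follow directly from the per-channel bit depth: an N-bit panel addresses 2^N levels on each of the three subpixels, so the total palette is (2^N)^3. That is also why a "1.07 billion colors" spec by itself does not reveal whether the panel is native 10-bit or 8-bit + FRC. A quick illustrative sketch of the arithmetic (not Dell documentation):

```python
# Colors-per-panel arithmetic: an N-bit-per-channel panel addresses
# 2**N levels on each of the three subpixels, so the total palette
# is (2**N) ** 3. This reproduces the figures quoted above.
for bits in (6, 8, 10):
    levels = 2 ** bits       # shades per channel (R, G, or B)
    colors = levels ** 3     # total displayable colors
    print(f"{bits}-bit: {levels} levels/channel -> {colors:,} colors")

# Output:
# 6-bit: 64 levels/channel -> 262,144 colors
# 8-bit: 256 levels/channel -> 16,777,216 colors
# 10-bit: 1,024 levels/channel -> 1,073,741,824 colors
```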

(edited)

1 Rookie

 • 

2 Posts

May 14th, 2025 00:08

@DELL-Chris M​ Not really helpful. BenQ told me their PD3205U is in fact 10-bit "utilizing 8-bit + FRC"; I'd like to know whether the U3223QE utilizes FRC or not. I know the 'advertising' doesn't specify, but in reality I'd like to know what is being bought.

Thanks

3 Apprentice

 • 

732 Posts

May 16th, 2025 20:10

Since it's an SDR monitor (maybe 2000:1 contrast at best), it does not matter.

People have a lot of misconceptions about what 10-bit is about.
If an app lacks dithering, like Photoshop (PS), and you let it truncate (no dither) to 8-bit output in the monitor colorspace from the image colorspace, you may see "color-managed induced banding". If you let PS output a 10-bit image to the driver on an OpenGL surface, it lets the GPU driver or the monitor itself do the truncation to whatever the actual bit depth is, using dithering, hence no banding.
On SDR, 10-bit output (FROM app TO driver) is just a way to avoid dirty / simple truncations and delegate the actual truncation (if needed) to a properly working dither unit, be it in the GPU driver (like in some old MacBooks) or in the monitor.
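To make the banding point concrete, here is a minimal, purely illustrative Python sketch (not any vendor's actual pipeline) comparing plain truncation of 10-bit codes down to 8 bits against dithered quantization; the dithered result preserves, on average, the in-between shades that truncation collapses into visible bands:

```python
import random

random.seed(0)

def truncate(v10):
    """Plain truncation: drop the two low bits. Four adjacent 10-bit
    codes collapse into one 8-bit level -> banding on shallow gradients."""
    return v10 >> 2

def dither(v10):
    """Add sub-LSB random noise before rounding, so the fractional part
    survives as the proportion of samples at the higher 8-bit level
    (FRC does the same thing temporally on an 8-bit panel)."""
    return min(255, int(v10 / 4 + random.random()))

for v10 in (512, 513, 514, 515):          # four adjacent 10-bit codes
    samples = [dither(v10) for _ in range(10_000)]
    print(f"10-bit {v10}: truncated -> {truncate(v10)}, "
          f"dithered mean -> {sum(samples) / len(samples):.2f}")

# Truncation maps all four codes to 128 (one flat band); the dithered
# mean tracks v10 / 4 (about 128.00, 128.25, 128.50, 128.75), so the
# gradient is preserved when averaged over time or area.
```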

On the other side, Lightroom (develop module), Adobe ACR, and Capture One have dithered outputs, so they do not need this fancy stuff of drawing (publishing to) 10-bit OpenGL surfaces on the GPU at all.

On ACTUAL HDR displays, like OLEDs, those VA-panel TVs, or double-layer grading displays, full 10-bit HDR output is needed because the content needs it.
On fake HDR displays, like those 1000:1 to 2000:1 IPS monitors, it's best not to use HDR at all, or to let an app handle it.

(edited)
