"All generalisations are false,
including this one." - Mark Twain
FOREWORD
If you are an expert in color management, if you are a seasoned professional working in the prepress field, if you are a color guru of sorts, this article is not for you: you won’t find any of the commonly debated topics here, nor dissertations about deltas, 3D LUTs, fancy acronyms or complex maths. This is a “survival guide” for photographers: not only are you supposed to already know what I’m about to type, but more likely than not you’ll disagree with some (most?) of my statements and conclusions.
Colorimeters are an ever more common presence amongst photographers, including the so-called “amateurs”, thanks also to the availability of relatively cheap units. Alas, such instruments often produce unexpected results, days spent staring at a display and, lastly, frustration. This article is meant to shed some light (huh!) on a rather complex matter, from an end-user perspective (as in “one who doesn’t know and is not interested in knowing the whole mumbo-jumbo, yet wants to achieve consistent and effective results”).
And, sorry, it had to be long.
HOW OR WHY?
Let’s start right away with the bad news, with regard to one of the most common beliefs relating to calibration: will a properly calibrated display get you “better looking” pictures on-screen?
No, I’m afraid, it won’t. On the contrary, it will probably produce “worse” pictures, where the extent of the word “worse” mostly depends on your expectations.
Now, in order to understand why you should bother to even get started with calibration, I’d better provide some definitions first.
A colorimeter is a device meant to measure your display and match it to a number of given parameters YOU have to set, whose values depend on the intended purpose of the calibration itself.
Even starting from this (admittedly simplistic) definition, you can already understand where the problem lies: indeed, the problem is you.
You are in fact supposed to know not only HOW to calibrate your display, but first and foremost WHY.
Now answer this question: how often do you print and/or how relevant is printing consistency in your typical workflow?
a) Often - Very relevant.
b) Sometimes - Somewhat relevant.
c) Never - Not relevant.
I haven’t got reliable statistical data here, but I bet, in this day and age, most of the (honest) replies would go to c). Anyway, whatever you replied, the colorimeter will serve some useful purpose in your workflow, but in rather different fashions.
We’ll get there soon, but first let’s have a look at those aforesaid parameters you have to set, which are essentially three. If you are interested in understanding the meaning and the relevance of each parameter, you’ll find plenty of articles online covering them pretty well (be ready for some fairly complex reading, though). As I said, here I’ll stay on a more basic level.
White point determines the target color temperature to which the... wait for it... “white” on your screen will be set after calibration, but it will indeed affect its whole tonality. Depending on your starting point, it could make your display look warmer or cooler. Illuminant D65 is the accepted “standard” value for most purposes, corresponding roughly to the color temperature of a sunny day at noon. More than probably, it’s also your display’s “native” white point. Unless you have to achieve a carefully weighed, purpose-made calibration, you won’t need to consider this value any further: it’s D65.
Gamma correction is a more complex matter: it deals with the non-linear way the human eye “interprets” different levels of light and color, and with how that translates into the digital realm. You don’t want to dig any deeper into the topic, do you? If you do, then you already know what to do. For the purpose of this article, you just need to know that the most widespread target value for most of the displays out there is 2.2.
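If you want to see what that 2.2 figure actually does, here’s a minimal sketch (plain Python, purely illustrative and not tied to any calibration software) of how a simple power-law gamma encodes and decodes a value; real displays and the sRGB standard only approximate this curve.

```
# Minimal illustration of a pure 2.2 power-law gamma (the sRGB standard
# uses a slightly different piecewise curve that approximates it overall).

GAMMA = 2.2

def encode(linear: float) -> float:
    """Linear light (0..1) -> display signal (0..1)."""
    return linear ** (1.0 / GAMMA)

def decode(signal: float) -> float:
    """Display signal (0..1) -> linear light (0..1)."""
    return signal ** GAMMA

# An 18% "middle grey" in linear light lands at roughly 46% signal,
# which is why this encoding tracks perception better than a straight line.
print(round(encode(0.18), 3))           # ~0.459
print(round(decode(encode(0.18)), 3))   # back to ~0.18
```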
Luminance is apparently the simplest of the three, since it determines the “brightness” (in quotes, as I’m consciously using the wrong term) of your display. Guess what? That’s where most of your frustration comes from. In fact, I won’t give any “standard” value here, at the moment.
Now we can consider the answer you gave a few lines above.
If you replied a) the colorimeter is crucial, except… Except you are using the wrong tool.
If you replied b) the colorimeter will help you a lot, assuming you know what you are doing and why.
If you replied c) the colorimeter still is a useful tool, but probably in a different way than you might imagine right now.
PICK YOUR POISON
a) You mostly print and need to stay consistent
So you are either a professional or someone very serious about printing.
This first case in fact depicts the typical professional purpose of display calibration AND profiling (yes, those are different terms with different meanings).
And that’s where the parameter I left unspoken above starts to play a significant role.
When it comes to luminance, the value you’ll be nudged towards is 120cd/m2 (that’s probably the default value set by your very software for “Photo” purposes), but you’ll often be advised to input even lower values (particularly in dimly lit working environments), which will dramatically decrease the brightness of your display and produce pictures that are not at all pleasing to look at.
So, I’ve purchased an expensive 500-nit big-name display just to use it as a cheap thingummy?
And where the hell does this value come from?
Well, I believe you’ve already discovered that printed materials do not “emit” their own light: they reflect the available light, which leads to the subtractive color synthesis used in prints.
Experts have estimated that setting a value of 120cd/m2 (or lower) will make your display look like a print (close enough, at least), and that’s where this value comes from. This way you can anticipate how dark (or otherwise) the picture you are editing will look once printed, though a certain amount of trial-and-error learning will be needed. Yet, don’t even think this will produce a reliable “soft-proof” of the final print, because you are still looking at an RGB (additive) picture while your final print will be in a subtractive color space; you’re still very far from the final printed result; light years away, actually.
Yet - point taken - this parameter is not meant to make your pictures look "good", but to make your pictures look "printed" before you actually print them.
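To make the additive/subtractive gap a bit more concrete, here’s a deliberately naive sketch (my own illustration, not part of any real workflow): the textbook relation between RGB and CMY is a simple inversion, but a real conversion to print goes through ICC profiles, black generation, ink limits and gamut mapping, which is exactly why a screen preview is such a rough predictor of a print.

```
# Deliberately naive, idealised RGB -> CMY inversion (textbook relation only).
# Real print conversions go through ICC profiles, black generation, ink limits
# and gamut mapping; none of that is modelled here.

def rgb_to_naive_cmy(r: float, g: float, b: float) -> tuple:
    """Additive RGB in 0..1 -> idealised subtractive CMY in 0..1."""
    return (1.0 - r, 1.0 - g, 1.0 - b)

# A fully saturated screen red "translates" to 0% cyan, 100% magenta,
# 100% yellow, a color a typical CMYK press can't actually reproduce.
print(rgb_to_naive_cmy(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
```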
Ideally you should set your display to a luminance level no brighter than your working environment (as a professional, you should work under controlled light conditions or use a viewing booth), and colorimeters can indeed measure the latter as well. But calibrating the display this way is not even half the job you have to cope with: for you, setting up a wholly profiled “color link” is mandatory, so you also have to consider the camera, the video card, the scanner (if any) and the printer, the latter ranging from desktop printers to large-format plotters to CMYK industrial offset presses; all of the above possibly managed by a proper Raster Image Processor (you’ll in fact probably need one) and repeated for each piece of equipment (camera, display, video card, printer) involved in your workflow.
Discouraged? You’ll be.
Your colorimeter, in fact, won’t go any further than matching your display to the aforesaid temperature, gamma and luminance values, which is exactly what it’s made for, not taking into account any other element involved in your chain.
That’s why you’d be better served by a spectrophotometer: such an instrument will not only produce a more precise reading of your display (since it doesn’t make use of coloured filters to measure the emitted light, performing instead a proper spectral analysis), but it will also perform such analysis on any other source and/or output involved in a specific process, this way “profiling” your whole processing/printing chain. That’s the difference between “calibrating” and “profiling”, where the latter saves the “description” (or, better yet, “characterisation”) of a given piece of equipment into an ICC file, promptly available when you need it for a given purpose.
This way, when you print, the known (profiled) color space of your source is matched (as much as physically possible) to the known color space of your printer by means of an "intermediate" color space called Profile Connection Space.
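For the curious, here’s roughly what that source-to-printer matching looks like in code. This is only a sketch using Pillow’s ImageCms module (a wrapper around LittleCMS); “photo.jpg” and “printer.icc” are placeholder names, a real prepress workflow would do this inside a RIP rather than in a script, and on recent Pillow versions the intent constant may be spelled ImageCms.Intent.RELATIVE_COLORIMETRIC.

```
# Sketch of a profile-to-profile conversion through the Profile Connection
# Space, using Pillow's ImageCms (LittleCMS) bindings. "photo.jpg" and
# "printer.icc" are placeholders: substitute your image and the ICC profile
# of your actual output device.
from PIL import Image, ImageCms

src_img = Image.open("photo.jpg").convert("RGB")

src_profile = ImageCms.createProfile("sRGB")          # assumed source space
dst_profile = ImageCms.getOpenProfile("printer.icc")  # output device profile

# The transform maps source colors into the output space via the PCS,
# using a rendering intent to decide what to do with out-of-gamut colors.
transform = ImageCms.buildTransform(
    src_profile, dst_profile,
    inMode="RGB", outMode="CMYK",
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
)
converted = ImageCms.applyTransform(src_img, transform)
converted.save("photo_printer_space.tif")
```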
Spectrophotometers are typically more expensive than colorimeters, ranging from about 600 Euros to the tens of thousands of Euros needed for an in-line device linked to the control console of an offset press, the likes of Heidelberg, KBA, Komori and so forth.
b) You occasionally print, but also publish on the Internet
It's a twofold situation, hence if you are trying to accomplish good results by means of a single calibration, you are doing it wrong.
If you are printing for yourself or within a small commercial business (where customers are possibly less picky than those needing a hi-end printing service) then a colorimeter might be enough, although a low-end spectrophotometer would not be overkill, given the rather low starting price, nowadays.
Either way, now you know why the D65/2.2/120 triad will serve you well when printing (with some fine tunings, if need be).
But what about that “other” purpose?
Well, read on, since the c) group is for you, too.
Right now, keep in mind that you need to set and save two different “profiles”: one for printing, one for showing your pictures on a display, switching between them accordingly when editing your pictures (which you must indeed edit in two different “variants”).
c) You don’t print but rather clog the Internet with terabytes of wonderful shots
So, you have duly set white point to D65, gamma to 2.2 and luminance to 120 since everybody out there told you to do so, and now your pictures look crap.
Why, you might ask?
Because they are.
First of all ask yourself: what’s the point of setting your luminance level to 120cd/m2 or below, if you are not printing at all?
In fact there’s no point.
Truth be told, you wouldn’t even need to calibrate your display to any given “standard”, because you don’t have a clue about the technology (or lack thereof) used by the final viewer.
Imagine this scenario: you are a professional running a printing service that has to print a high-quality photo book. You are very serious about it, so you have carefully profiled each and every single element in your chain. Only, you won’t actually print the whole run; instead, you’ll wait for each final customer and print a single copy of the book on whatever paper the customer brings: matte-coated, glossy-coated, uncoated, toilet paper… Whatever.
Crazy, right?
Well, that’s exactly what you are doing whenever you upload a picture to the Internet: since you don’t know what “paper” (the display) the final customer will actually bring, what your pictures will look like is anyone’s guess.
What you really need, instead, is to factor out any idiosyncrasy of your particular display (technology, age and settings are all conditioning factors), not in order to have a “profiled” monitor, but rather a “reference” one.
Here is what you have to do:
set the brightness of your display to whatever level you feel comfortable with, one that doesn’t put much strain on your eyes. This is indeed a very personal matter: I, for one, although my working environment is rather dim, seldom set this value below 250cd/m2 when I’m calibrating for the Internet; that’s roughly where an average display sits in normal viewing conditions (whatever that means).
Your super-expert cousin will tell you this is plain wrong: ignore him.
Now that you have set your favourite luminance level (which you should possibly leave untouched from now on), start the calibration, setting in your software the already discussed parameters for white point and gamma, while setting the luminance level to “Original”, or “As measured”, or whatever other wording your software offers. This way, your colorimeter will perform the calibration of the display without changing its luminance level. It will still correct any measured deviation from the given standards, in some cases to a pretty remarkable extent.
Done.
Now have a look at your best picture: it still looks crap, right?
You clearly remember it looked better prior to calibration.
It did.
When you edited that (and any other) picture, you set the parameters of your developing/editing software to make it look the way you wanted it to, but your display was not calibrated yet. Chances are your picture will now show a pinkish/yellowish/reddish color cast and some contrast is gone.
The solution is easy, though it’s not one you’ll be pleased to read: you have to edit your pictures all over again in order to make them look “right” again.
So you have calibrated your display just to make your pictures look the same as they looked earlier?
Exactly: only, prior to calibrating, your pictures looked “right” on your display and on your display only; the calibration you’ve just run has set your display to common parameters, which doesn’t necessarily mean “good” parameters, just common. This way, your pictures will look right enough on most displays and perfect on none, but that’s a give-and-take situation and life is made of compromises.
SOME FINAL REMARKS
Let's do that again
If printing is your main goal, recalibrating your display is not an option: it’s mandatory. Screens change, light conditions change, and you have to stay consistent. So, repeating the calibration process every week or two is expected (even more often, depending on how demanding your tasks are). If you are calibrating for the Internet (or whatever similar purpose), recalibrating every once in a while is a good practice nonetheless, if anything, just to make sure nothing horrible has occurred to your hardware; running a calibration once per month is advisable.
Something about the hardware
I won’t endorse any particular brand here, just be aware not all colorimeters are made equal. Generally speaking, you should prefer a measurement unit using dichroic filters rather than organic ones: the latter will fade soon (how soon depends on how often you calibrate and how well you keep your equipment, but time is a factor in itself), leading to unreliable results. Again, search online and you’ll easily find out.
Something about the software
Your colorimeter comes with software capable of performing the appropriate tasks; you’ll also find that different editions of a given device are mostly differentiated not by the hardware (the measuring unit is the same, or has slight differences in the firmware) but by the complexity of the software, where the simplest edition will let you set the aforesaid three parameters only, or little more. If you have searched the Internet, you have probably already run into a program named DisplayCAL (based on the Argyll CMS), which is regarded, and rightfully so, as the best one available at the moment, with the added bonus of being free.
But the sheer amount of options available there will force a rather steep learning curve on you and will often lead to user errors, producing unpredictable results.
With regard to this software, I can only offer my personal experience (YMMV): after having correctly set all the relevant parameters (and then some), it produced a result completely indistinguishable from the one produced by the manufacturer’s software (as expected, I’d say, otherwise one of the two would have to be completely askew). Numerically, they show some (faint) differences, but from a perceptual standpoint, nobody would be able to tell which is which.
My hint: unless you have to cope with ultra high-end or very specific needs, stay with the manufacturer’s software: it’s made by people in the know and, just in case some issue occurs, real and paid-for technicians are there to help you fix any problem.
Yes, with DisplayCAL you can set hundreds or thousands of different calibration targets (test charts) in order to produce a finer calibration (at the expense of hours of measurements), but how relevant are they? Unless you are an extragalactic expert using some sci-fi multi-display system (and you aren’t, if you are reading this article), they are not that relevant. Feel free to disagree.
Some words for my fellow Mac users (and not only) about DisplayCAL
Using a Mac has always been sort of a blessing and a punishment at the same time (this from someone who has been using Macs and Macs only since 1988).
Things have changed a lot recently (I still remember when Macs used 1.8 as their standard gamma, which led to endless debates), yet you have to consider some peculiarities that are not trivial to manage properly. And, if you are not using a Mac, read on nonetheless, since you are probably in our very same shoes, one way or another.
Now… I know you.
You couldn’t help using DisplayCAL despite being advised otherwise.
And (I told you, I know you) you didn’t read its rather large and well written documentation.
Strange as it may seem, Mac users NEVER read a user manual; it must be because of some sort of anthropological issue, but Mac users are convinced that the day they start reading a manual (even a washing machine’s), a nefarious calamity will hit them and their families.
Hence, here are some heads-ups for you.
On the starting screen of this program (the “Display & instrument” tab), whether you have enabled the Advanced options or not, you’ll find a drop-down menu named “Correction”: you’d better set this parameter right, otherwise you’ll find yourself facing some seriously askew results. I’m not playing Captain Obvious here, this parameter is REALLY important, and although the documentation the program offers is pretty detailed and clear in this regard, you’ll find plenty of information on the Internet which is plain wrong.
For those of you on an iMac or iMac Pro from 2017 onwards, the right correction to apply is RG-LED (until something new is offered by the mothership with regard to the screen technology); for other displays you must carefully read the technical documentation coming with your machine and make your choice accordingly. The program itself offers the chance to download, in-app, more corrections contributed by end users: you can only take them on faith, since no one can tell how exact they are. Personally, I’ve found that some of them set the program up incoherently (e.g. at the time of writing, the one precisely identifying the “2017 27” iMac” will automatically set the “Mode” parameter to “Refresh” while it has to be set to “LCD”, offering no chance to revert it back, with consequences on the accuracy of the measurement I couldn’t tell: maybe negligible, maybe not; honestly, I didn’t bother to check). I would not trust anything not directly provided with the program itself.
Now go to the “Profiling” tab. If you have enabled the Advanced options (and you have), the first parameter you have to set is “Profile type”.
Everybody in the know will tell you to choose some LUT-based profiling method (which will change the LUT loaded into your video card in order to make it “send” calibrated color information to your display, rather than changing your screen settings), in order to achieve a more precise calibration: they are right, with some caveats, though.
The “Look Up Tables” are indeed the best way to do that if you are printing, but your OS does not support them well (at all, actually, at the moment), so, although third-party color-managed programs (e.g. Adobe PS and LR, Capture One, Affinity…) will apply them properly, some applications based on the OS’ own color management will not (e.g. Preview). That’s a small thing, you’d say: it is, if you are creating a “personal”, wholly managed printing profile, but it’s huge if you are working for the Internet, because some well-established browsers out there won’t support them either. Chances are, if you are working for the Internet, that your pictures will look worse than ever, for someone.
But there’s more: if you are using a Mac and you have chosen such a profiling method, the program itself will prompt you to switch to “Single curve + Matrix” and to check the “Black point compensation” box. You might feel disappointed by this, since it is the simplest profiling method. Don’t: this method will “force” the R, G and B channels to stay coherent with a single shared curve, rather than one curve each. As a result, your grey-scale “axis” will stay grey from black to white, without hard-to-manage color casts. Believe me, I’d choose this option hands down anyway; if you are into CMYK industrial printing, that’s exactly the difference between using a UCR black generation curve instead of a GCR one: the latter will offer a (theoretically) broader color range, but could also produce color casts, depending on how capable and well equipped the printer is.
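If you’re wondering what “Single curve + Matrix” actually describes, here’s a minimal, hand-rolled sketch of that structure (my own illustration, not DisplayCAL’s internal code): one shared tone curve for all three channels, followed by a single 3x3 matrix into CIE XYZ. The matrix shown is the standard sRGB-to-XYZ matrix for a D65 white point, used purely as an example; a real profile stores a curve and a matrix measured from your own display.

```
# Minimal sketch of a "single curve + matrix" display model: one shared
# tone curve for R, G and B, then a 3x3 matrix to CIE XYZ. The matrix is
# the standard sRGB-to-XYZ (D65) matrix, used only as an example; a real
# ICC profile stores values measured on YOUR display.

GAMMA = 2.2  # the single shared curve (a real profile stores a measured curve)

RGB_TO_XYZ = (
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
)

def device_rgb_to_xyz(r, g, b):
    # 1) The same curve for all three channels: equal R = G = B inputs stay
    #    equal, so the grey axis can't pick up a cast from mismatched curves.
    lin = [c ** GAMMA for c in (r, g, b)]
    # 2) One 3x3 matrix maps the linearised RGB values to XYZ.
    return tuple(sum(row[i] * lin[i] for i in range(3)) for row in RGB_TO_XYZ)

# A mid grey stays neutral by construction (same chromaticity as the white point):
print(device_rgb_to_xyz(0.5, 0.5, 0.5))
```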
Finally, although more accurate, LUTs might produce a “harsher” rather than “smoother” rendering of gradients.
Yet again, pick your poison.
Why I wrote this article
Because that’s exactly what I would have hoped to read when I started my own calibrating adventure, and I’m not referring to my main professional field (where I learned my lesson the hard way, many years ago) but to what concerns group c), which is by far the most relevant part for many, today. That would have saved me some more time for shooting, which is always a good thing.
Comments
This site is not open to comments. Yet, given the fact this is a rather “resonant” topic, if you wish to contribute something or have any question, feel free to use the contact form. Should something interesting be posted, it’ll be added at the footer of this page (time permitting). Please note: if you are an expert in color management, if you are a seasoned professional working in the… Well, you read that already, didn’t you?
Toti is a professional who has been managing a pre/post-press company since 1996; besides the industrial production of books, his company also offers graphic design, finalization, optimization and photography services for the printing and publishing industry.