Circle of Confusion
Now we're gonna talk about the circle of confusion, how it affects something called "bokeh," how this works, and why you're gonna want an expensive lens after this. So, the circle of confusion controls bokeh and it controls depth of field, and I'm gonna be describing this to you in two-dimensional terms. I drew this little fishy-looking thing, but when you look at it, I want you to think of it in three dimensions. So this is a cone, a physical cone of light as it comes in. There are several things going on here, and what this represents, again, I just drew this really fast, is a lens in a camera. This is the lens in your camera. This first thing here is a point source of light. A point source of light is like this, right here, this little light right here, or a light bulb. It's something...
Flame! Yes, a match flame. I can't hear anything; I'm like, match frame? Match flame? Match flame, yes. Eh, getting...
old. Yeah, it's anything that we can see, like these lamps up here, these little beads on this chandelier here. These are point sources, these things right here. It's something that we can see as a point, and when we focus our eyes on something like that, we resolve those points into points, and that's what we call in focus. And how that works in a camera is that a point source of light, like those little things, actually expands, and when the lens captures that, the lens is gonna focus that point source of light over here, and where that cone of light converges, that is called a focal point. Okay? So where this expands, it comes back. This is called the focal point. And then it just keeps going. Alright, that light just keeps going. So, what we want is for this focal point to hit what is called our plane of focus, our focal plane. The focal plane is the sensor in our camera, or the film, if we had film. So what we wanna do is have this focused on that. That's how we achieve focus. That's how focus works. And what happens is, point sources of light at different distances will have different focal points. So that's why, when you're racking your focus, you're moving your lens back and forth, and if you move it back, this isn't focused anymore, because we've moved our lens to focus back here, or if we move it forward, it's up here. So that's how we choose what's in focus. How does that apply to depth of field and bokeh? Well, this little cone right here, if it's resolved exactly on the sensor, we get something that's clear and in focus. But sometimes it doesn't have to be exactly right. If the focal plane is just off a little bit, we can still identify that thing as a point. Even though it's not as crisp as it could be, it's within the circle of confusion. In other words, the blur is inside of this thing that we accept as in focus.
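The geometry being described here, a point source focusing at a distance-dependent focal point, then blurring into a disk when the focal plane misses, can be sketched with the thin-lens equation. Everything in this sketch (the 50 mm focal length, the subject distances, the f-numbers, and the roughly 0.03 mm "acceptable" circle of confusion) is an illustrative assumption, not something from the talk:

```python
# Thin-lens sketch of focus and blur (all distances in millimetres).
# 1/f = 1/subject + 1/image, so each subject distance has its own image plane.

def image_distance(f, subject):
    """Where behind the lens a point at `subject` mm comes into focus."""
    return 1.0 / (1.0 / f - 1.0 / subject)

def blur_mm(f, n, focus_at, subject):
    """Blur-disk diameter on the sensor when a lens at f-number `n` is
    focused at `focus_at` but the point source sits at `subject`."""
    return f**2 * abs(focus_at - subject) / (n * subject * (focus_at - f))

# Racking focus: nearer subjects focus farther behind a 50 mm lens.
print(round(image_distance(50, 2000), 2))  # subject at 2 m -> 51.28 mm
print(round(image_distance(50, 5000), 2))  # subject at 5 m -> 50.51 mm

# Depth of field: same misfocused point (focused at 2 m, subject at 2.5 m),
# wide open versus stopped down.
print(round(blur_mm(50, 1.8, 2000, 2500), 3))  # f/1.8 -> 0.142 mm, clearly soft
print(round(blur_mm(50, 8, 2000, 2500), 3))    # f/8   -> 0.032 mm, close to a
                                               # ~0.03 mm acceptable circle
```

Stopping down from f/1.8 to f/8 shrinks the blur disk by exactly the ratio of the f-numbers, which is the numerical version of the cone converging more gradually and more of the scene staying inside the circle of confusion.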
So usually, that is just a little bit before or a little bit behind this point. And that's how we get depth of field. So we have things that are at different distances. They're not all gonna resolve in the same place, they're gonna resolve at different places, but if they're inside this little area where our brains can say, "You know what? That's not exactly a dot, but it's pretty close," then we consider that in focus. It's pretty crazy. And if we do something like this, where we close the aperture down, what happens is the cone converges in a much more narrow, gradual way, which means more of the scene stays inside the acceptable circle of confusion and our depth of field grows. So that's why, when you're shooting at something like a 2.8, or a really wide aperture like a 1.2, the depth of field is really shallow: that convergence is steep, it's just coming in, wham, and the zone that stays inside the circle of confusion is really, really narrow. And the opposite happens when we take this aperture and close it down: more of these little dots, even though they're not exactly in focus, will appear to our eyes as being in focus, and that's really cool. Make sense so far? Awesome! Okay, now, why does this matter? It matters because the light is bending through the lens, and that is a very important thing because, remember, light travels in red and green and blue, right? Well, guess what? Those are not the same wavelengths. Think of them as radio stations. Radio signals are waves, right? So if you're tuned to 107.5 FM, you don't hear 109-whatever, you only hear that one, because the wavelengths are different. The same thing is true of light. And what happens is... by the way, if you really wanna get crazy, look this up on Wikipedia. And this is creative license, so I got permission. Chromatic aberration works like this. Red, green, and blue are coming into the lens at the same place, but as soon as they start bending...
Look at this, they show up at different points, and that's a problem, because what that means is that when you're bending light through the lens, you might have red and green in focus, but the blue is not. So you'll have a color that's out of focus. And you can see this if you have a lens that isn't so special: when you have something in focus, you'll see a purple fringe around things. And the reason that's happening is because two of these colors, usually red and green, are in focus, but the blue is off. It's not falling on the focal plane correctly, and if that color's blur falls outside the circle of confusion, you're gonna see this thing called chromatic aberration. This happens with cheap lenses all the time. So, lenses that are a little bit more expensive don't put up with this. They have things called lens elements, and the lens elements are like this: you have this element that's called the crown and this one that's called the flint. What the flint does is correct for the colors coming through at different angles, so everything lines up pretty close to on-target. And that's why a good zoom lens is usually really heavy: not only are you focusing, and that's moving, but you're zooming everything in and out, so the science of getting all of this light to fall in the same place is very, very complicated. And if you have a really nice lens, you can't just use normal glass, because normal glass refracts light in weird ways and you get really massive chromatic aberration. So you have to correct all of that, and you end up with more and more lens elements. That's why lenses are so heavy and why they're so expensive: it's not just glass, it's like crystal, it's really expensive stuff. It's really cool. Okay, we have four minutes for questions. I can't believe we made it through all of that so far. Yes, what questions do we have?
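The red/green/blue focus split described above can be put in numbers with the lensmaker's relation: focal length scales as 1/(n − 1), and the refractive index n of glass is higher for blue light than for red (dispersion). The indices below are approximate values for BK7 crown glass, and the geometry constant K is a hypothetical value chosen just to put green near 50 mm:

```python
# Dispersion sketch: why blue focuses short of red in a simple, uncorrected lens.
# Lensmaker's equation with the surface curvatures folded into one constant:
#   f = K / (n - 1)

K = 25.84  # hypothetical lens-shape constant, picked so green lands near 50 mm

def focal_length(n):
    return K / (n - 1.0)

f_red = focal_length(1.5143)    # BK7 index near 656 nm (red)
f_green = focal_length(1.5168)  # near 588 nm (green)
f_blue = focal_length(1.5224)   # near 486 nm (blue)

# Blue lands almost a millimetre in front of red; that spread is the
# longitudinal chromatic aberration the crown + flint doublet cancels.
print(round(f_red, 2), round(f_green, 2), round(f_blue, 2))
```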
So, can you talk a little bit about the relationship in distance of light from the subject versus power or number of lights and how that relates to stops?
Yes, we're gonna do that, I believe tomorrow. There's a principle called the inverse-square law and we're gonna dive deep into that, but we have, I think, an hour on that, so I'll answer that tomorrow, or maybe this afternoon, I can't remember. It's coming. Yeah, the inverse-square law stuff, we're gonna take a break from all the science-y stuff and do some more shooting for a little while and then we're gonna come back and do distance from subject and how the light falls off and why and how you can change that from point source to... yes, it's all coming. It's coming!
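As a tiny preview of that inverse-square discussion (the distances here are just illustrative): illuminance from a point source falls off as 1/d², so doubling the distance to the light costs exactly two stops of exposure.

```python
import math

# Inverse-square law: light from a point source spreads over an area that grows
# with distance squared, so the exposure change in stops is log2((d1/d2)**2).

def stops_change(d1, d2):
    """Exposure change, in stops, when moving a point light from d1 to d2."""
    return math.log2((d1 / d2) ** 2)

print(stops_change(1.0, 2.0))  # doubling distance   -> -2.0 stops
print(stops_change(1.0, 4.0))  # quadrupling distance -> -4.0 stops
```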
And Mark, this one is from earlier from "Milehighguy." "Question, the smaller the sensor, does that mean the worse the dynamic range?" So then, little point-and-shoots should have a terrible dynamic range. Is that correct?
That is correct. You can try it out yourself: get your camera phone, take a picture, then take the same picture with your DSLR, and just look. And the thing that's crazy is... I've got a bunch of friends that work at Intel and those places, and I used to work there. The technology for working at the nanometer scale, the way, way, way down small level where chips need to be built, has advanced significantly in the last five years or so. In fact, they're having problems building chips now because they're working at such a small level that the light coming through the devices is bending and they can't really see what's down there, so you need, like, an atomic-scale microscope to do the stuff they wanna do now. But what's happening is, those sensors have improved significantly in the last, probably, five years, so if you compare an original iPhone to a newer one and see what the image quality looks like, it's dramatically different. So, in general, yeah, a smaller sensor is gonna have worse dynamic range than a larger sensor, but the small sensors have improved significantly. And there's the difference between CCD sensors and CMOS sensors, but that's maybe a color theory topic. Anyway, yes, what's the next question?
Comment from Marty who said, "That was awesome. Great tutorial on chromatic aberration." Very cool, and I think we have a question in the audience.
On the chromatic stuff here, is there an optimal setting for each lens to get less of that?
There is actually, and this is really... you're getting bonus points here. (laughing) Okay, we didn't get to talk about this with the ColorChecker Passport, but what the ColorChecker Passport does is build a color profile, and Adobe has done a phenomenal job of calibrating lenses: they've built all these profiles out to know exactly what the chromatic aberration is for almost every lens made. So you wouldn't really set it in the camera. But in Lightroom... let me see if I can go down here and find this. There is a camera calibration setting, and you can change the profile, so these are color profiles. You can do that, but then there's also a lens correction setting, and the lens correction is up here. If I say, "Profile, enable profile correction," notice that Lightroom knows what model of lens I've used. So what that does is it not only helps with chromatic aberration, it also helps with lens distortion. That's one of the big benefits of Lightroom: Adobe constantly does all these calibrations to fix color and distortion. So if I undo these, see the edges there, how that's changing?
So why wouldn't you just have that checked all the time?
Some people, like myself, buy lenses specifically for the deformities they create. It's sort of like shooting with a certain type of film because it's really highly saturated or has a nice film grain, so it's like a signature look. If you're shooting with a Holga or a Lensbaby, or those kinds of things, you're actually messing up perfect optics and throwing off color to achieve an artistic look. So a lot of people will uncheck that because they don't want correct images, they want artistically correct images. That's why. You're Mr. Adobe, is that right?
It's sort of like what we talked about with color temperature.
You can have correct color or you can have pleasing color.
So, even with the lenses, if you want...
There's correct, scientifically correct, and then there's artistically correct. These over here, we can have them all look white, which is scientifically correct, and we don't want that. We want them to look orange and warm. The same thing is true of chromatic aberration and those types of things. You'll see if you're shooting and you have really nasty chromatic aberration, there's, like, this blue thing around everything and then you get sad and buy a new lens, so that's what happens there. Alright, more questions.
Yeah here's one from "Dland." "How does chromatic aberration apply to prime lenses? Do cheaper prime lenses have issues with chromatic aberration like a zoom would?"
Prime lenses traditionally do a much better job of handling color because there aren't all the moving parts. I don't know enough to say, "Yeah, a cheap prime is gonna be equivalent to an expensive zoom." It really depends on the prime and the zoom. But, generally speaking, a prime lens is gonna do better at focusing color than a zoom lens. In fact, years ago, most photographers would only work with primes because of that, and most cinematographers only worked with primes because of that; I didn't know of any cinematographers who were working with zoom lenses. The newer zooms, from maybe the last 10 years, are leaps and bounds above what used to be possible. It's mainly because of manufacturing processes and the ability to do really fine control in technology. It's all that science-y stuff. Alright, what's next?
I just wanna say the comments coming in about you, Mark, and the fact that people are understanding things for the first time that they've been trying to learn for years. A lot of those. Alright, well, can we go back to the question about the ColorChecker that we weren't able to ask before? Do you mind?
Okay. "Geckoboy," who is from Phoenix...
Ha, yeah, I know him.
Sounds like you do. ...Said, "Is there a way to tell that you have a good photo of the ColorChecker on location? I recently used mine and when I got home, the ColorChecker looked good, but Lightroom wouldn't make a profile. Another image of it that seemed to be less ideal worked." So any tips on that?
Yeah, Geckoboy and I used to play video games every day at lunch for years. Yeah, the answer is... Can I see the ColorChecker? Yes. There are two things with this that are gonna get you. It needs to be as close to 90 degrees to the camera as possible, and level. So if it's off and skewed a little bit, even though you have it in frame, when the software goes to create the profile, which we weren't able to demo yet, it can't see the target accurately because it's not square. By the way, Geckoboy works at Intel. It has to be square, where you can see all four corners. So maybe the one that you thought was good was just skewed a little bit. The other thing that will happen in bright sunlight is it will have a glare. If there's a glare on it, you're done; that won't work as well. So it needs to be flat to the camera as much as possible, 90 degrees, and it needs to be square, not skewed at all. So my guess is, either it was skewed or it had a glare. The other thing is, it needs to fill up a good chunk, three quarters or so, of the frame. The one thing that you can do, though, is there's a ColorChecker utility to create profiles, and you can manually go in and tell the utility where the corners are: we're gonna say it's here, here, here, and here. This one was iffy on the positioning 'cause it was just a little bit off. Also, it needs to be right-side up. That's the other thing: the colors on the bottom, and the people and the calibration on top. That's how it should be. So that's my guess.