
Creative iPhoneography

Lesson 4 of 24

New iPhone 5S

 


Lesson Info

New iPhone 5S

This is from Steven S., who asked, "Can you ask Jack about the just-announced Apple iPhone 5S with the upgraded camera? Do you consider the upgraded camera worth upgrading to the newer phone, and are the iOS 7 camera improvements worth anything?" So yes, whoever that person who asked that question is: yes, yes, let's do that. Actually, one of the things I'd like to do is take this little component of the class, separate it out, and blast it on YouTube all around the planet, because it's a huge question right now, especially since the cameras' resolution is the same. So right now is the introduction of Jack Davis's iPhone 5S and iOS 7 camera improvements overview, for everybody out there in TV land. Okay. Two days ago, Apple announced the new iPhone 5S, and as you saw, there were people already waiting in line to buy it last week. So those people really need ...

help. But these are very, very great questions, so let's just jump right into it. First off, the new iPhone 5S is using an A7 processor with 64-bit architecture, and that's amazing. It's the first mobile device using a 64-bit bus structure. That gives it a huge bump in processing power, and Apple is using that in the camera and the processing tremendously. So even if they did nothing other than update the chip, that's going to affect the photography. And it's huge: the processing power of a 64-bit architecture is awesome, and depending on what you're doing, it's something like a 50 to 200 percent increase in speed. Apple is going to do all sorts of things with that processor. The lens on it is a five-element f/2.2 lens. So by opening up the f-stop, having a faster piece of glass, as well as continuing to refine the elements within it, that's awesome. As we all know, the expense of a lens is how fast it can be: how big an aperture can you have and still maintain focus? That's why a 1.8 lens costs ten times as much as, say, a 3.6 lens. So it's huge for them to do that. And I'll start off with this: yes, it's eight megapixels, just like the old one was eight megapixels. But for the people who say it's the same camera because they're both eight megapixels, nothing could be further from the truth. It is a completely different camera, with completely different software running that camera. And that actually jumps back to that processor for me. I don't know about you, but my camera is filled up with photographs constantly. I've got the 64-gig version, and it's still constantly filled up because I shoot so bloody much. If Apple went from 8 to 12 megapixels, a 50 percent increase, you could obviously hold fewer photographs. This is an issue with Android: you've got a 20-megapixel or a 40-megapixel camera.
That's really cool, and with those 40-megapixel cameras you can zoom in by cropping. But I know people who have those cameras, and they go, "Yeah, there's a little bit of an issue: I can take five pictures and then I'm done." Now, I'm being a little facetious, but I think what Apple has done by sticking with the same resolution is right. Most of us don't have a problem with eight megapixels. When you look at the images, they're gorgeous; they do everything we need. We can still do even large-format prints from them. What we want is the cleanest image out of those eight megapixels. I want to be able to shoot my head off and not hesitate because my camera roll is going to get filled up. So I don't mind eight megapixels, and I actually see why Apple stayed with eight: I think it's a much more practical size for mobile photography. Twenty- and forty-megapixel files, for me, are not a practical file size for mobile photography. Not if I'm shooting my bloody head off. I want to shoot, you know, shots of this little snail; I want to experiment with ten shots and then replicate that; I want to put that into a piece of software. Once you go into an app with a 40-megapixel original file and try to throw it into Snapseed or something like that, it won't work; everything comes to a screaming crawl. So as cool as it is to pump the numbers up, I think where they're going (and you may have seen this in the Apple keynote) is: what can we do to extend the quality of the images rather than the quantity of pixels? That's exactly what they went for, and for me that is perfect. That's exactly what I would want. Could you make the camera freakingly more awesome? I don't need more pixels for my mobile photography. And if I do need more pixels, guess what: I can actually shoot 28 megapixels with the panorama feature built into it.
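To make the storage trade-off concrete, here is a quick back-of-envelope calculation. The bytes-per-pixel figure is an assumed average for camera JPEG compression, not an Apple specification, so treat the absolute counts as rough; the ratios between resolutions are the point.

```python
# Rough sketch: how many JPEGs fit in a given amount of storage at
# different resolutions. BYTES_PER_PIXEL is an assumed average JPEG
# compression figure, purely for illustration.

BYTES_PER_PIXEL = 0.35  # assumed average; real files vary with content

def photos_that_fit(storage_gb, megapixels, bytes_per_pixel=BYTES_PER_PIXEL):
    """Approximate number of JPEGs of the given resolution that fit."""
    file_bytes = megapixels * 1_000_000 * bytes_per_pixel
    return int(storage_gb * 1_000_000_000 / file_bytes)

for mp in (8, 12, 20, 40):
    print(f"{mp:2d} MP: ~{photos_that_fit(64, mp):,} photos on 64 GB")
```

Whatever compression ratio you assume, a 40-megapixel file holds five times fewer shots than an 8-megapixel one in the same space, which is the practicality argument Jack is making.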
So if I need more, I can go up and use apps. We've been talking about ClearCam: ClearCam will shoot up to an 18-megapixel shot using its enhance mode, which is awesome. So if you've got the money shot on an iPhone, you're not limited to eight megapixels. You can shoot panos, and those will shoot over 20 megapixels. You can use things like ClearCam, which actually shoots six shots and then interpolates them up. You also have other upscaling apps that bump up resolution if you need a high-res image, and of course you have Photoshop; Photoshop CC has improved its res-ing-up feature. So anyway, I think the eight megapixels is absolutely fine. But let's continue. The sensor: the sensor is 15 percent larger than it was in the past, and the sensor sites are now 1.5 microns. They call it "bigger pixels." Pixels are actually what make up the image after it's been captured, so these are really sensor sites; I guess the terminology has changed, but those aren't technically pixels on the sensor. The sensor grabs light, and those readings are eventually turned into pixels. Anyway, the larger sensor means the sensor sites are actually 15 percent larger than they were before. Compare that with most smartphones, which can go down to typically one micron. That's dramatic: 1.5 versus 1 in terms of the size of those sites means a much greater amount of light coming in. And that bigger sensor site, "bigger pixels" as they call it, means more light, greater dynamic range, and less noise. If there are three things I want: I want more light, I want greater dynamic range, I want less noise. That's exactly what I want. Bigger sensor, bigger f-stop, is exactly what they should do, and that's what they did.
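The "1.5 versus 1 micron" point is worth quantifying: the light gathered per sensor site scales with its area, so a linear size difference compounds. Using the figures quoted in the talk (not independently measured):

```python
# Relative light per sensor site, assuming square sites whose
# light-gathering ability scales with area. The 1.5-micron and
# 1.0-micron pitches are the figures quoted in the talk.

def relative_light(pitch_a_um, pitch_b_um):
    """Area ratio of two square sensor sites = relative light per site."""
    return (pitch_a_um / pitch_b_um) ** 2

print(relative_light(1.5, 1.0))  # 2.25, i.e. 2.25x the light per site
```

So a 50 percent larger pixel pitch is not a 50 percent light advantage but a 2.25x one, which is why pitch matters more than the headline megapixel number for low light.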
You take the faster lens plus the larger sensor, and now you have 33 percent more usable light. What more could a photographer want? That's it. That's what I want. Could you give it to me? Oh, you gave it to me. Great. So to the person who says the eight megapixels is the same as the old one: no, it's totally different, and it's exactly what I think they should have done. Yes, a question? "I've got the iPhone 5, and I don't know if all the iPhone 5s have this, but there's a coating on the lens, and when I shoot in color towards the light, I'll often get this purplish hue on the edges of the image. I'm curious if that might have been changed." Um, I'm not sure; that's a very good question. Most people would say that's being brought about not by a "coating" but by the sapphire cover that's on top of the lens; it actually has that super-hard piece of glass on top of it. I've not seen in the description of the lens whether that element is still there. The sapphire is over the new thumbprint sensor, but I haven't seen whether it's still over the lens. Because that was an issue with the 5 (shooting directly into the light you can sometimes get that; not a big issue, but an issue), I would think they would have done some tweaking to it. I don't know what they could have done, since they redesigned the five elements that make up the lens, and light bouncing around within those elements is where you get these potential refraction issues. So it's a very good question; I don't know on that one. Good question. Now, the iPhone 5S-specific photo software that's now possible because of the A7 chip (so again, going back to that processing power, which is huge) does a few things, and I took some grabs from the keynote. One thing it's doing is white balance from the start: it doesn't wait for you to tap for focus.
It's automatically doing white balance, and as you'll see in a second, they've done a lot with white balance, especially with the flash. They're doing a dynamic local tone map for every single image. They've got this huge amount of horsepower, basically a desktop processor in a cell phone, and that allows them to create what's known as a local tone map of the image and map out highlights and shadows, maintaining shadow integrity and not blowing out highlights. The fact that they're doing that on the fly: awesome, fantastic, great. Fantastic autofocus: 15-zone matrix metering for autofocus. Awesome. It now has built in facial recognition, smile recognition, and sharpness recognition. That's this next one right here: on every single shot (and I take it you'll be able to turn this off, but I haven't worked with the software), every single picture will do what ClearCam does right now, which is shoot multiple shots, analyze them at a pixel level, and only save the sharpest one. And the cool thing about that, which is even better than any DSLR, is it compensates for both camera shake and subject shake, right? There's no DSLR on the planet that compensates for subject shake; it's all about the gyroscope in the lens or in the body of the camera. For it to analyze after you've already shot and only save what's sharp: that's awesome. That's fantastic. That's the little daisy you're shooting in the wind: you take the picture and it only saves the sharpest one. And since it can shoot 10 frames a second, it's doing this in a fraction of a second, all behind the scenes, and you don't even know it. So the fact that it's doing that helps in low light, and it helps with overall sharpness.
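The "keep only the sharpest frame" idea can be sketched with a standard focus metric, variance of the Laplacian: sharp edges produce large second-derivative responses, blur flattens them. Apple has not published what the 5S actually computes, so this is an illustrative stand-in, not their algorithm; frames here are tiny grayscale images as 2-D lists.

```python
# Illustrative "save the sharpest of a burst" using variance of a
# 4-neighbour Laplacian as the sharpness score. This is a common focus
# metric, NOT Apple's published method.

def laplacian_variance(img):
    """Sharpness score: variance of the Laplacian over interior pixels."""
    h, w = len(img), len(img[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                   - 4 * img[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def sharpest(frames):
    """Return the frame with the highest sharpness score."""
    return max(frames, key=laplacian_variance)

# A crisp edge scores higher than a smooth gradient of the same scene:
sharp_frame  = [[0, 0, 255, 255]] * 4   # hard edge
blurry_frame = [[0, 85, 170, 255]] * 4  # smooth ramp
assert sharpest([blurry_frame, sharp_frame]) is sharp_frame
```

Run over a 10 fps burst, a scorer like this lets the camera discard motion-blurred frames regardless of whether the camera or the subject moved, which is the point Jack makes about it beating lens-based stabilization.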
I think it's an amazing thing to put that in as the default setting. The new True Tone flash: this, again, is the world's first variable-color flash. Two flashes, a cool one and a warm one, and the ability to balance them based on that dynamic white balance on start. So you have over 1,000 different white-balance color variations within the flash. There is no other flash with a variable color temperature. You can put gels on top of your flashes and do all sorts of things, but to have 1,000 different color options instantaneously, that's pretty darn cool. I think that's a really, really cool advancement. Hopefully all of these things are going to be part of the software developer kit, so freaky developers like you can go, "Well, I know what to do with that. I'm going to take this 10-frames-a-second thing, and that motion-blur thing, and the low-light auto image stabilization." On the image stabilization: it's not optical, it's multiple shots. It looks to me like it's four, and it's combining them. It realizes it's low light, and it combines those shots, keeping the sharpest parts of the image. And again, this is really weird science: not only am I going to take multiple images, not only am I going to pick the sharpest one, but I notice that this part is sharper here and that part is sharper there, and I'm going to take those different pieces, combine them, and create a sharp image where there wasn't one before. Again, that's taking advantage of the fact that there's this incredibly powerful computer at their disposal to do this Frankensteinian creation in the camera. Pretty darn cool. Incredibly cool. So the samples on that are going to be natural light versus using the flash, and their low-light shooting with the flash is going to be significantly improved because of this technology.
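The "over 1,000 variations" claim falls out of simple mixing: vary the relative intensity of the two LEDs and you sweep through a continuum of colors between them. The RGB values below for the amber and cool-white LEDs are invented for illustration; Apple has not published the actual calibration.

```python
# Sketch of mixing two LEDs (one amber, one cool white) to produce many
# intermediate flash colors. Both LED colors are ASSUMED values, chosen
# only to illustrate the linear mixing idea.

WARM_LED = (255, 180, 107)   # hypothetical amber LED color
COOL_LED = (255, 244, 255)   # hypothetical cool-white LED color

def flash_color(warm_ratio):
    """Mix the two LEDs; warm_ratio in [0, 1] is the amber LED's share."""
    return tuple(
        round(warm_ratio * w + (1 - warm_ratio) * c)
        for w, c in zip(WARM_LED, COOL_LED)
    )

# 1000 evenly spaced mixes, echoing the "over 1000 variations" claim:
palette = [flash_color(i / 999) for i in range(1000)]
```

The interesting part is the feedback loop: the camera's auto white balance picks the ratio before the shot fires, so the flash color matches the ambient light instead of fighting it.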
Not only are you going to be able to shoot at a much lower ISO, because you've got bigger pixels and a faster lens, so you'll be able to shoot in lower light, but when you can't shoot because the light is so low, it has ways to compensate. The burst mode: 10 frames per second, unlimited it seems, and at full res. That was one of my questions: are they dropping this down to, say, three megapixels to get this, which is what a lot of apps do? The samples they had online, which I'll show you here, are full res, exactly the same res, at 10 frames a second, and they're saying unlimited. In terms of that buffer, I would think it's got to fill up at some point, but maybe not. No matter what, the sample they used is 20 frames in two seconds, and it's gorgeous. They're beautiful; I'll show you one of the samples in a second. Also, if you're using it to capture the dog in motion or the kid playing, it has this auto-suggest, where it's looking at sharpness, whether there's a face, whether there's a smile, and it will suggest what it thinks is the best frame of the sequence. When you press and hold the shutter and it starts shooting, it puts it into what's known as burst mode, and it treats those images in a different way. They're still in your camera roll, and you have access to all of them. Or you can just say, "I shot a sequence because I wanted one shot, the mid-air one; I don't need everything else filling up my camera roll." Great, because another big problem is: how do I go through ten billion images? So auto-suggest is there. Also, with a sequence, if it sees motion going on, it suggests multiple frames that look to be the peaks of motion within that sequence. Again: awesome, cool, groovy, awesome.
If those are available to the software developers, what are they going to do with that ability to shoot unlimited frames at full res? What does that mean in terms of imitating motion blur and things like that, a slow shutter speed? I think people are going to go crazy with that. Slow-motion video: 120 frames per second at 720p. Awesome. Great. Especially since they built in software that lets you isolate the parts you want in slow motion and the parts you want in regular motion. Again, that's the practical application. Before, it would have been: great, you shot slow motion, and now you have to bring it into another application to do anything else with it. So for Apple to give you not only this really cool slow-motion thing but the really practical element of it: the person swings the bat, they do the regular swing, it makes contact and the footage drops to 120, and then the follow-through comes back at 30 frames a second. People like you are going to take that up to 1,000 frames a second, because with frame interpolation you take that true 120 frames (it's true, not fake; it really is 120, not interpolated from 30, which is what's being done in software right now) and interpolate it up to 1,000 frames per second, and get those ridiculous water drops moving on a splash, like you've probably seen in current video software. Fantastic. And the pano they've done is now 28 megapixels maximum, which means the whole architecture of the iPhone OS is going to allow for 28 megapixels. We're going to be shooting those 28 megapixels all the time.
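The frame-interpolation arithmetic works like this: starting from true 120 fps capture, software synthesizes in-between frames to reach a higher effective rate. Real tools use motion estimation; the naive linear blend below is only to make the frame-count math concrete (each frame here is a flat list of pixel values).

```python
# Naive linear frame interpolation: insert (factor - 1) blended frames
# between each captured pair. Real interpolators use motion estimation;
# this sketch just demonstrates the frame-count arithmetic.

def interpolate(frames, factor):
    """Return frames with (factor - 1) linear blends between each pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

captured = [[0.0], [120.0], [240.0]]  # three frames of true capture
synth = interpolate(captured, 8)      # 8x the frame density
```

At 8x, true 120 fps becomes an effective 960 fps, which is roughly the "1,000 frames a second" figure mentioned above, and interpolating from real 120 fps motion samples is far cleaner than faking it from 30.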
So that means the software, the architecture of the camera roll, the apps, and the whole software developer kit are going to be able to treat 28 megapixels as much more of a default, which means bringing things in from our SLRs and our Micro Four Thirds cameras is going to be much easier. With the processing power of the A7 chip, we'll be able to work with 28 megapixels and not have to worry about it. It's amazing. So again, the 5C, the cute plastic one, is the same eight megapixels. First off, don't whine; it's not unattractive. And two, the 5S is a completely different animal, because the 5C doesn't have the A7 chip. Just because they're both eight megapixels doesn't mean they're the same. Okay, but in general, iOS 7 runs on the 5C (the colorful one, or the cheap one, whatever you want to call it) and on all the cameras going back, I think, to the iPhone 4 and 4S; all of those will be able to update to iOS 7. It has the new organizing within the camera roll: collections by both location and year, and you can zoom in on them. So the whole way of working with the camera is going to be updated for all iPhone users. You have the live effects, the little Instagram-style live effects, for shooting or in post-processing. You have a new square format as a shooting option. So cute; again, Instagram is taking over the world, and I love Instagram. And AirDrop for device-to-device sharing: that's going to be cool. You can share with anybody at any time over a shared network, and that will also help with shooting with your iPhone and sending to your iPad. So AirDrop will be another option; there are tons of options for sharing, and we'll be covering that in class. So those are some of the main things related to the new iOS 7.
Here are some samples, some images from the keynote, and this is their description of what it's doing on the fly, instantaneously, behind the scenes: auto white balance, auto exposure, dynamic local tone mapping, matrix metering for focal points, and picking the sharpest of multiple shots. The fact that it does all that every single time you press the shutter, up to 10 frames a second, is freaky cool. Is that what I wanted? Yes. What do I want? That's what I want. You gave it to me; I'm totally happy. You know, the first thing when they said eight megapixels rather than twelve, it was like, "Huh? I want twelve." But when I thought about it: no, I really don't want twelve. I already don't have enough space on my camera. It's fine. Anyway. This is their shake reduction for low light. In this case the demo shows four shots, four different quadrants, and if you look at her thumb, there's a slight movement in the thumb. So it's obviously taking multiple shots: the expression on her face is from one shot, the thumb is from a different one. From there they combine multiple shots to give a sharper image in low light. Very cool. We'll see how it actually works; I think it's a great idea. If you had unlimited horsepower and asked, "How are we going to use it?", that's a great answer. The new flash: 1,000 variations between the warm LED, what they call amber, and the white one. Awesome, great, fantastic. The result is that you can have a warm light for artificial-light white-balance conditions. Great potential. And with two LEDs at your disposal, I think you're going to have more control over the amount of light as well. They obviously have complete control over the intensity of those two LEDs, so having that much control available at the software-developer level for programming: awesome.
We're going to see all sorts of things done with that. And the 28-megapixel pano: the one thing I absolutely love about this, because I love panoramas, is that it's doing 30 exposure calculations per second. If you've analyzed how the iPhone's native app shoots panos, it does not shoot shot, shot, shot, shot. It literally is shooting 30 frames a second, continuous photographs, each one a different photograph. Which means, if you have a moving wave: traditionally, every other pano app in the world shoots shot, shot, shot, shot, so you can get a wave here, a wave here, a wave here. None of them can touch this app, one of the only ones that works this way, because it's actually capturing little teeny-tiny irregular slices as you move it around, so that wave is one continuous movement. It's a perfect rendering of a moving subject. Very cool. And the fact that it can look directly away from the light, or handle significantly different light scenarios, is fantastic. I think that's going to be a huge, huge boon: to be able to look at the sunset and pan away and get the shadow detail away from the light, as well as maintain the integrity of a sunrise or sunset. That's awesome. Right now the only one that does that is Photosynth from Microsoft, and it shoots separate images and then uses really cool software to blend them. But it's a compromise, because it's saying, "Okay, dude, you're really getting greedy here." It does a great job, but it's far from perfect. I think this is going to be infinitely better, even than Photosynth. So that is that. And in terms of the samples they showed, let's just go through a couple right here; you all saw these in the keynote. Here is the dynamic range shot: they're shooting almost directly into the sun. You have white, you have dark, you have red, a very isolated red in the image.
And when we zoom in to the image, the maintaining of highlight detail and shadow detail is fantastic, and the noise structure is pleasing. I think it's great. I'll open this up in ACR so you can see the clipping, but again, it's a great shot. Normally with this sort of thing you'd get a harsh break between black hair and highlights, and possibly banding in things that are close to pure red, green, and blue, or cyan; this is one of the things you saw yesterday. Let's zoom up; here's 100 percent. Pretty sharp, and note the red: any time you're dealing with true red, green, or blue in an image, where one channel has to carry most of the weight of representing the image, you have challenges. You all know that reds can blow out just because of that. This is great; that is really good. And then here is the full res. This is the actual full-res file from one of those burst sequences they showed in the keynote. That's full res, same as every other shot. So that is, well, it's what I want from a mobile device. Is it my DSLR? No, but cut it some slack: it's a freaking cell phone. The fact that it's doing 10 frames a second at full res, I'm elated at that; my SLR can't do that. And then this last one here, which I think is also pretty good: you're getting just to the point of banding and out-of-gamut colors, but you're looking right into a sunset, and an SLR is going to have challenges with that too. It is backlit, and you are maintaining shadow detail, and you're getting a beautiful, consistent gradation. So the tone mapping, looking right into the light, is actually quite good.
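The per-channel clipping idea discussed here is mechanical enough to sketch: a pixel only reads pure white when all three channels are at the maximum, so clipping in just one or two channels still leaves recoverable detail. This is the logic behind the highlight-warning overlays in raw converters, written out for a handful of sample pixels (the sample values are made up for illustration).

```python
# Per-channel highlight clipping check, in the spirit of the
# Option/Alt clipping overlay in Adobe Camera Raw. Sample pixel values
# below are invented for illustration.

def clipping_report(pixels, limit=255):
    """Count pixels clipped per channel, and pixels clipped in all three."""
    per_channel = [0, 0, 0]
    all_three = 0
    for r, g, b in pixels:
        clipped = [r >= limit, g >= limit, b >= limit]
        for i, c in enumerate(clipped):
            per_channel[i] += c
        all_three += all(clipped)
    return per_channel, all_three

sunset = [(255, 200, 120), (255, 255, 130), (255, 255, 255), (90, 60, 40)]
print(clipping_report(sunset))  # ([3, 2, 1], 1)
```

Only the third pixel is clipped in all three channels and would print pure white; the others still hold tonal information in at least one channel, which is the headroom Jack points to in the histogram next.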
And if we look at something like the histogram, going into Camera Raw and looking at our histogram here, we can see it's being clipped, but only in particular channels. If you're used to this little feature in ACR that shows you clipping when you hold down your Option or Alt key: even though it's being clipped in the red and blue channels, it's not going white. If it goes pure white, it's been clipped in all three channels. So there actually is still a huge amount of dynamic range before there's complete clipping in those highlights. And you can see we can even take it down if we don't want any clipping at all. That's huge: by doing a little simple adjustment here we're able to maintain highlight detail. So I think they're doing a great job. This dynamic tone mapping, taking advantage of the chip, the bigger sensor sites, and the greater f-stop, is going to give us great, great options. Okay, that's a quick overview of the iPhone 5S and what I think is cool and groovy about it. Are there any questions from the studio audience, or any questions for me along those lines? If the question is, "Is it worth it? That's going to cost me 250 bucks in cancellation fees, because I just bought the old one": get over it. It's 250 bucks. I did that for the last phone. Is it worth paying that penalty? How much joy do you get out of your mobile phone? For me, it's the cheapest thing you could ever do. It's like asking, "Should I upgrade Photoshop? A couple hundred bucks?" Duh. How much pleasure, how much work do you get out of it? You spend that on lots of other things; we all know how much you spend on the addictions you have. Just go ahead and do it. Even if you have to deal with the contract issue: both Sprint and AT&T have this new way of working where you basically pay as you go and can upgrade your phone every six months.
Great. Get over it. It should be one of the greatest joys in your life; after this class, it will be. So upgrade to it. Yes, get it. Give your old phone to your child; that's why you had children, right? That's why you have a spouse. Give them the old phone, turn it in, do whatever you want, but get it. It's cool. It's groovy. We'll do another class related to the Android phones, which are really cool; I was looking at the 40-megapixel captures from the competitors out there, the Samsung Galaxy and all sorts of things. But I like what Apple did. The philosophy of keeping the same pixel count, I think, is awesome: improving everything around it, and then letting you shoot a bunch of different ways with panos, high speed, and low light.

Class Description

Get ready to take jaw-dropping photos and videos using only your iPhone! Join award-winning photographer Jack Davis for a two-day introduction to the power and sophistication of the iPhone camera. Jack will teach you everything you need to know about using your iPhone to capture dynamic images, panoramas, and high-definition video. You’ll explore the benefits and storytelling opportunities that come from always having a high-powered camera at your fingertips. Jack will guide you through the use and selection of apps for enhancing and embellishing photos. You’ll also learn about sharing and archiving your photos using videos, slideshows, collages, social networks, web galleries, and more. Whether you’re a novice who uses an iPhone as your primary camera or a professional photographer who’s ready to leave your bulky camera at home sometimes, this course will give you the tools you need to take amazing photos on-the-go.

Reviews

Phillip Ziegler
 

Jack is terrific and there's a lot I learned watching the videos. Of course this is a fast-changing field, so some things are dated (some of the apps no longer exist), but I highly recommend this course to anyone wanting a wide and pretty in-depth orientation to the world of iPhone photo apps.