Jeff Carlson Archives | Popular Photography
https://www.popphoto.com/authors/jeff-carlson/
Founded in 1937, Popular Photography is a magazine dedicated to all things photographic.

How to unlock your smartphone camera’s best hidden features
https://www.popphoto.com/how-to/unlock-smartphone-camera-app-features/
Tue, 20 Sep 2022 04:34:10 +0000

Whether you're shooting Android or iPhone, here's how to get the most out of your device's built-in camera app.

Puget Sound grain terminal.
Jeff Carlson

What could be more fundamental to photography today than our smartphone cameras? They’re ever-present, ready in moments, and the technology behind them makes it easy to capture great photos in most situations. And yet, I regularly encounter people who are unaware of many of the core functions of the built-in camera app.

Smartphone camera fundamentals extend beyond just “push the big button.” Some tools help you set up the shot, and some give you more control over the exposure. A few are just plain convenient or cool. However, these features aren’t always easy to find. That’s where we come in.

iOS 16 vs. Android 13

First, a note about these examples: I’m using the two phones I have at hand, an iPhone 13 Pro running iOS 16 and a Google Pixel 6 Pro running Android 13. I’m also focusing only on the built-in camera apps; for even more manual control, you can find third-party apps in the app stores. Many camera features overlap between iOS and Android, but some may not be available on older models, or may be accessed in a different way. If something here doesn’t match what you see, break out the manual—I mean, search Google—and check whether it’s available on your device.

How to quick-launch the camera

Most people perform the usual dance of unlocking the phone, finding the camera app, and tapping to launch it. By that time, the moment you were trying to capture might be gone. There are faster ways.

Related: Composition in the age of AI – Who’s really framing the shot?

On the iPhone’s lock screen, swipe right-to-left to jump straight to the camera app without unlocking the phone at all. You can also press the camera icon on the lock screen. On the Pixel, double-press the power button from any screen.

When the phone is unlocked, a few more options are available. On both phones, press and hold the camera app icon to bring up a menu of shooting modes, such as opening the app with the front-facing selfie camera active.

Screenshots of Apple and Google camera apps with shortcuts shown.
Press and hold the Camera app icon to display some photo mode shortcuts (iPhone 13 Pro at left, Pixel 6 Pro at right). Jeff Carlson

I also like the ability to double-tap the back of the phone to launch the camera. On the iPhone, go to Settings > Accessibility > Touch > Back Tap and choose Camera for the Double Tap (or Triple Tap) option. In Android, go to Settings > System > Gestures > Quick Tap > Open app and choose Camera.

Related: Outsmart your iPhone camera’s overzealous AI

How to use the volume buttons to trigger the shutter

If you miss the tactile feedback of pressing a physical shutter button, or if hitting the software button introduces too much shake, press a volume button instead.

On both phones, pressing either volume button triggers the shutter. Holding a button starts recording video, just as if you held your finger on the virtual shutter button.

Hand holding an iPhone and pressing the volume button to take a photo.
Press a volume button to trigger the shot for that tactile-camera experience. Jeff Carlson

On the iPhone, you can also set the volume up button to fire off multiple shots in burst mode: go to Settings > Camera > Use Volume Up for Burst.

How to adjust the exposure & focus quickly

The camera apps do a good job of determining the proper exposure for any given scene—if you forget that “proper” is a loaded term. You do have more control, though, even if the interfaces don’t make it obvious.

On the iPhone

A water scene with focus held in the distance.
Press and hold to lock exposure and focus on the iPhone. Jeff Carlson

On the iPhone, tap anywhere in the preview to set the focus and meter the exposure level based on that point. Even better (and this is a feature I find that many people don’t know about), touch and hold a spot to lock the focus and exposure (an “AE/AF LOCK” badge appears). You can then move the phone to adjust the composition and not risk the app automatically resetting them.

A water scene with the exposure decreased.
Drag the sun icon to adjust the exposure without changing the focus lock on the iPhone. Jeff Carlson

Once the focus and exposure are set or locked, lift your finger from the screen and then drag the sun icon that appears to the right of the target box to manually increase or decrease the exposure. A single tap anywhere else resets the focus and exposure back to automatic.
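That sun icon is, in effect, an exposure compensation control measured in stops (EV). As a rough sketch of the math behind it (the function name here is my own, for illustration, not anything from iOS or Android), each stop of compensation doubles or halves the light recorded:

```python
def apply_ev_compensation(luminance, ev):
    """Scale a linear luminance value by 2**ev.

    Each +1 EV ("stop") doubles the recorded light; each -1 EV halves it.
    Dragging the exposure slider up or down works on this principle.
    """
    return luminance * (2.0 ** ev)

print(apply_ev_compensation(100.0, -1.0))  # one stop darker: 50.0
print(apply_ev_compensation(100.0, 2.0))   # two stops brighter: 400.0
```

The exponential scale is why a small drag of the slider has a noticeable effect: two stops is four times the light, not twice.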

On the Pixel

On the Pixel, tap a point to set the focus and exposure. That spot becomes a target, which stays locked even as you move the phone to recompose the scene. Tapping also displays sliders you can use to adjust white balance, exposure, and contrast. Tap the point again to remove the lock, or tap elsewhere to focus on another area.

A water scene with Google's exposure slider shown.
The Pixel 6 Pro displays sliders for exposure, white balance, and contrast control when you tap to meter and focus on an area. Jeff Carlson

How to zoom with confidence

We think of “the camera” on our phones, but really, on most modern phones, there are multiple cameras, each with its own image sensor behind the array of lenses. So when you’re tapping the “1x” or “3x” button to zoom in or out, you’re switching between cameras.

Whenever possible, stick to those preset zoom levels. The 1x level uses the main camera (what Apple calls the “wide” camera), the 3x level uses the telephoto camera, and so on. Those are optical values, which means you’ll get a cleaner image as the sensor records the light directly.

The same water scene, zoomed in using pinch-to-zoom.
When you drag the camera selection buttons, this zoom dial appears, offering up to a 15x telephoto increase. But if you’re not on the 0.5x, 1x, or 3x levels, you’re sacrificing image quality for digital zoom. Jeff Carlson

But wait, what about the two-finger pinch gesture for zooming in and out? You can also drag left or right on the zoom selection buttons to reveal a circular dial (iPhone) or slider (Android), which lets you compose your scene without needing to move, or even zoom way in to 15x or 20x.

It’s so convenient, but try to avoid it if possible. All those in-between values are calculated digitally: the software is interpolating what the scene would look like at that zoom level by artificially enlarging pixels. Digital zoom technology has improved dramatically over the years, but optical zoom is still the best option.
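If you’re curious what “artificially enlarging pixels” looks like in practice, here’s a toy Python sketch of the crudest form of digital zoom, nearest-neighbor interpolation. Real camera software uses far more sophisticated resampling, so treat this only as an illustration of the principle:

```python
def digital_zoom(pixels, zoom):
    """Naive digital zoom: crop the center 1/zoom of the frame, then
    enlarge it back to full size with nearest-neighbor interpolation.

    `pixels` is a square 2D list of brightness values; `zoom` is an
    integer factor. No new detail is created -- each source pixel is
    simply repeated, which is why digital zoom looks soft or blocky.
    """
    n = len(pixels)
    crop = n // zoom
    start = (n - crop) // 2
    out = []
    for y in range(n):
        row = []
        for x in range(n):
            # Map each output pixel back to its nearest cropped source pixel.
            row.append(pixels[start + y * crop // n][start + x * crop // n])
        out.append(row)
    return out

image = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
# 2x digital zoom: the center 2x2 block is blown up to fill the frame.
print(digital_zoom(image, 2))  # [[5, 5, 6, 6], [5, 5, 6, 6], [9, 9, 10, 10], [9, 9, 10, 10]]
```

Notice that the output contains only four distinct values where the original had sixteen: the enlargement invents nothing, it just stretches what’s there.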

How to switch camera modes quickly

Speaking of switching, the camera apps feature many different shooting modes, such as Photo, Video, and Portrait. Instead of tapping or trying to drag the row of mode names, on both iOS and Android, simply swipe left or right in the middle of the screen to switch modes.

Two flowers at different views.
Drag anywhere in the middle of the preview to switch between shooting modes. Jeff Carlson

How to use the grid & level for stronger compositions

Whether you subscribe to the “rule of thirds” or just want some help keeping your horizons level, the built-in grid features are handy.

In iOS, go to Settings > Camera > Grid and turn the option on. In Android, you can choose from three types of grids by going to the settings in the camera app, tapping More Settings, and choosing a Grid Type (such as 3 x 3).

The grid on the iPhone, and a related setting called Framing Hints on the Pixel, also enable a horizontal level. When you’re holding the phone parallel to the ground or a table, a + icon appears in the middle of the screen on both models. As you move, the phone’s accelerometer indicates when you’re not evenly horizontal by displaying a second + icon. Maneuver the phone so that both icons line up to ensure the camera is horizontally level.
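Under the hood, that level indicator relies on the accelerometer reporting the direction of gravity. A minimal sketch of the idea (the function names and the 2-degree tolerance are my own assumptions, not Apple’s or Google’s actual implementation): the tilt is simply the angle between the gravity vector and the phone’s screen axis.

```python
import math

def tilt_degrees(ax, ay, az):
    """Angle (degrees) between the phone's screen normal and gravity.

    (ax, ay, az) is the accelerometer reading at rest. When the phone
    lies perfectly flat, gravity sits entirely on the z axis, so the
    tilt is 0 -- the moment the two + icons line up.
    """
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

def is_level(ax, ay, az, tolerance=2.0):
    """True when the phone is within `tolerance` degrees of flat."""
    return tilt_degrees(ax, ay, az) <= tolerance

print(is_level(0.0, 0.0, 9.81))  # True: flat on the table
print(is_level(0.0, 4.9, 8.5))   # False: tilted roughly 30 degrees
```

The second + icon is essentially a live readout of that angle, drawn offset from center until the tilt drops to zero.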

A close-up of a pink flower.
When the phone is held parallel to the ground, a pair of + icons appears to indicate how level it is. Line them up for a level shot. (iPhone shown here.) Jeff Carlson

How to control the flash & ‘Night’ modes

Both camera systems are great about providing more light in dark situations, whether that’s turning on the built-in flash or activating Night mode (iOS) or Night Sight (Android). The interfaces for controlling those are pretty minimal, though.

On the iPhone, tap the flash icon (the lightning bolt) to toggle between Off and Auto. For more options, tap the caret (^) icon, which replaces the camera modes beneath the preview with buttons for more features. Tap the Flash button to choose between Auto, On, and Off.

On the Pixel, tap the Settings button in the camera app and, under More Light, tap the Flash icon (another lightning bolt).

A dimly lit night scene with an old car.
The crescent moon icon indicates the Pixel 6 Pro is using its Night Sight mode. Jeff Carlson

The Pixel includes its Night Sight mode in the More Light category. When it’s enabled, Night Sight automatically activates in dark situations—you’ll see a crescent moon icon on the shutter button. You can temporarily deactivate this by tapping the Night Sight Auto button that appears to the right of the camera modes.

The iPhone’s Night mode is controlled by a separate button, which looks like a crescent moon with vertical stripes indicating the dark side of the moon. Tap it to turn Night mode on or off. Or, tap the caret (^) icon and then tap the Night mode button to reveal a sliding control that lets you choose an exposure time beyond just Auto (up to 30 seconds in a dark environment when the phone is stabilized, such as on a tripod).

A dimly lit night scene with an old car.
The yellow Night mode button indicates that the current maximum exposure is set for 2 seconds. Jeff Carlson

Put the fun in smartphone fundamentals

As with every camera—smartphone or traditional—there are plenty of features to help you get the best shot. Be sure to explore the app settings and the other buttons (such as setting self-timers or changing the default aspect ratio) so that when the time comes, you know exactly which smartphone camera feature to turn to.

The post How to unlock your smartphone camera’s best hidden features appeared first on Popular Photography.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

The state of AI in your favorite photo editing apps
https://www.popphoto.com/how-to/ai-photo-editing-apps/
Tue, 30 Aug 2022 19:43:41 +0000

From Lightroom to Luminar Neo, we surveyed the field and these are the most-powerful AI-enhanced photo editing platforms.

ON1’s Super Select AI feature automatically detects different subjects
ON1’s forthcoming Super Select AI feature automatically detects various subjects/elements (shown in blue), allowing users to quickly create an editable mask. ON1

Artificial Intelligence (AI) technologies in photography are more widespread than ever before, touching every part of the digital image-making process, from framing to focus to the final edit. But they’re also widespread in the sense of being spread wide, often appearing as separate apps or plug-ins that address specific needs.

That’s starting to change. As AI photo editing tools begin to converge, isolated tasks are being added to larger applications, and in some cases, disparate pieces are merging into new utilities.

This is great for photographers because it gives us improved access to capabilities that used to be more difficult, such as dealing with digital noise. From developers’ perspectives, this consolidation could encourage customers to stick with a single app or ecosystem instead of playing the field.

Let’s look at some examples of AI integration in popular photo editing apps.

ON1 Photo RAW

ON1 currently embodies this approach with ON1 Photo RAW, its all-in-one photo editing app. Included in the package are tools that ON1 also sells as separate utilities and plug-ins, including ON1 NoNoise AI, ON1 Resize AI, and ON1 Portrait AI.

The company recently previewed a trio of new features it’s working on for the next major versions of ON1 Photo RAW and the individual apps. Mask AI analyzes a photo and identifies subjects; in the example ON1 showed, the software picked out a horse, a person, foliage, and natural ground. You can then click a subject and apply an adjustment, which is masked solely to that individual/object.

ai photo editing tools
In this demo of ON1’s Mask AI feature under development, the software has identified subjects such as foliage and the ground. ON1

Related: Edit stronger, faster, better with custom-built AI-powered presets

ON1’s Super Select AI feature works in a similar way, while Tack Sharp AI applies intelligent sharpening and optional noise reduction to enhance detail.

Topaz Photo AI

Topaz Labs currently sells its utilities as separate apps (which also work as plug-ins). That’s great if you just need to de-noise, sharpen, or enlarge your images. In reality, though, many photographers buy the three utilities in a bundle and then bounce between them during editing. But in what order? Is it best to enlarge an image and then remove noise and sharpen it, or do the enlarging at the end?

Topaz is currently working on a new app, Photo AI, that rolls those tools into a single interface. Its Autopilot feature looks for subjects, corrects noise, and applies sharpening in one place, with controls for adjusting those parameters. The app is currently available as a beta for owners of the Image Quality bundle with an active Photo Upgrade plan.

ai photo editing tools
Topaz Photo AI, currently in beta, combines DeNoise AI, Sharpen AI, and Gigapixel AI into a single app. Jeff Carlson

Luminar Neo

Skylum’s Luminar was one of the first products to really embrace AI technologies at its core, albeit with a confusing rollout. Luminar AI was a ground-up rewrite of Luminar 4 to center it on an AI imaging engine. The following year, Skylum released Luminar Neo, another rewrite of the app with a separate, more extensible AI base.

Now, Luminar Neo is adding extensions, taking tasks that have been spread among different apps by other vendors, and incorporating them as add-ons. Skylum recently released an HDR Merge extension for building high dynamic range photos out of several images at different exposures. Coming soon is Noiseless AI for dealing with digital noise, followed in the coming months by Upscale AI for enlarging images and AI Background Removal. In all, Skylum promises to release seven extensions in 2022.
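The general idea behind merging bracketed exposures can be sketched simply. To be clear, this is not Skylum’s algorithm, just a hedged, minimal illustration of exposure fusion: weight each frame per pixel by how well-exposed that pixel is (how close it sits to mid-gray), then blend.

```python
def fuse_exposures(exposures):
    """Sketch of exposure fusion on same-length lists of 0.0-1.0 values.

    For each pixel, weight each bracketed frame by its closeness to
    mid-gray (0.5), then take the weighted average. Well-exposed
    detail dominates in every region; clipped pixels get ~0 weight.
    """
    fused = []
    for pixel_stack in zip(*exposures):
        weights = [1.0 - abs(v - 0.5) * 2.0 for v in pixel_stack]
        total = sum(weights)
        if total == 0:
            fused.append(sum(pixel_stack) / len(pixel_stack))
        else:
            fused.append(sum(w * v for w, v in zip(weights, pixel_stack)) / total)
    return fused

dark = [0.05, 0.40]   # underexposed frame: shadows crushed, highlights safe
light = [0.45, 0.95]  # overexposed frame: shadows visible, highlights blown
print(fuse_exposures([dark, light]))  # both pixels land in a usable midrange
```

Each fused pixel is pulled toward whichever frame exposed it best, which is why the merged result holds detail in both shadows and highlights.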

ai photo editing tools
With the HDR Merge extension installed, Luminar Neo can now blend multiple photos shot at different exposures. Jeff Carlson

Adobe Lightroom & Lightroom Classic

Adobe Lightroom and Lightroom Classic are adding AI tools piecemeal, which fits the platform’s status of being one of the original “big photo apps” (RIP Apple Aperture). The most significant recent AI addition was the revamped Masking tool that detects skies and subjects with a single click. That feature is also incorporated into Lightroom’s adaptive presets.

ai photo editing tools
Lightroom Classic generated this mask of the fencers (highlighted in red) after a single click of the Select Subject mask tool. Jeff Carlson

It’s also worth noting that because Lightroom Classic has been one of the big players in photo editing for some time, it has the advantage of letting developers, like the ones mentioned so far, offer their tools as plug-ins. So, for example, if you primarily use Lightroom Classic but need to sharpen beyond the Detail tool’s capabilities, you can send your image directly to Topaz Sharpen AI and then get the processed version back into your library. (Lightroom desktop, the cloud-focused version, does not have a plug-in architecture.)

What does the consolidation of AI photo editing tools mean for photographers?

As photo editors, we want the latest and greatest editing tools available, even if we don’t use them all. Adding these AI-enhanced tools to larger applications puts them easily at hand for photographers everywhere. You don’t have to export a version or send it to another utility via a plug-in interface. It keeps your focus on the image.

It also helps to build brand loyalty. You may decide to use ON1 Photo RAW instead of other companies’ tools because the features you want are all in one place. (Insert any of the apps above in that scenario.) There are different levels to this, though. From the looks of the Topaz Photo AI beta, it’s not trying to replace Lightroom any time soon. But if you’re an owner of Photo AI, you’ll probably be less inclined to check out ON1’s offerings. And so on.

More subscriptions

Then there’s the cost. It’s noteworthy that companies are starting to offer subscription pricing instead of just single purchases. Adobe went all-in on subscriptions years ago, and a subscription is the only way to get any of its products except Photoshop Elements. Luminar Neo and ON1 Photo RAW offer subscription pricing or one-time purchase options. ON1 also sells standalone versions of its Resize AI, NoNoise AI, and Portrait AI utilities. Topaz sells its utilities outright, but you can optionally pay to activate a photo upgrade plan that renews each year.

ai photo editing tools
AI-enhanced photo editing tools come in many forms, from standalone apps to plugins to built-in features in platforms like Lightroom. Getty Images

Subscription pricing is great for companies because it gives them a more stable revenue stream, and they’re hopefully incentivized to keep improving their products to keep those subscribers over time. And subscriptions also encourage customers to stick with what they’re actively paying for.

For instance, I subscribe to the Adobe Creative Cloud All Apps plan and use Adobe Audition to edit audio for my podcasts. I suspect that Apple’s audio editing platform, Logic Pro, would be a better fit for me, based on my preference for editing video in Final Cut Pro versus Adobe Premiere Pro, but I’m already paying for Audition. My audio-editing needs aren’t sophisticated enough for me to really explore the limits of each app, so Audition is good enough.

Subscribing to a large app likewise grants blanket access to tools, including new AI features, when you need them. Paying $30–$70 for a single focused tool suddenly feels like a lot (even though it means the tool is there for future images that need it).

The wrap

On the other hand, investing in a large application means relying on its continued support and development. If the software stagnates or is retired (again, RIP Aperture), you’re looking at real time and effort to migrate your library and its edits to another platform.

Right now, the tools are still available in several ways, from single-task apps to plug-ins. But AI convergence is also happening quickly.

The post The state of AI in your favorite photo editing apps appeared first on Popular Photography.

Bring on the noise: How to save high ISO files deemed ‘too noisy’
https://www.popphoto.com/how-to/ai-de-noise-software/
Sun, 14 Aug 2022 15:00:00 +0000

Today's AI de-noise software is surprisingly powerful.

An interior photo of a church with lots of visible noise from using a high ISO.
Crank that ISO: AI de-noise software is here to save the day. Shot at ISO 6400 with noise corrections made using ON1 NoNoise AI. Jeff Carlson

In photography, knowing when to put the camera away is a valuable skill. Really dark situations are particularly difficult: without supplemental lighting, you can crank up the ISO to boost the camera’s light sensitivity, but that risks introducing too much distracting digital noise in the shot. Well, that was my thinking until recently. I now make photos I previously wouldn’t have attempted because of de-noise software, fueled by machine learning technology. A high ISO is no longer the compromise that it once was.

That opens up a lot of possibilities for photographers. Perhaps your camera is a few years old and doesn’t deal with noise as well as newer models. Maybe you have images in your library that you wrote off as being too noisy to process. Or you may need extremely fast shutter speeds to capture sports or other action. You can shoot with the knowledge that software will give you extra stops of exposure to play with.

Sensor sensibility

Too frequently, I run into the following circumstances. In a dark situation, I increase the ISO so I can use a fast enough shutter speed to avoid motion blur or camera shake. The higher ISO doesn’t capture more light; it amplifies the signal coming off the image sensor, effectively boosting its light sensitivity. That amplification, however, boosts noise right along with the signal. At higher ISO values—6400 and higher, depending on the camera—the noise can be distracting and hide detail.
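A back-of-the-envelope model shows why cranking the gain can’t rescue a dark scene. This is a deliberately simplified sketch (the read-noise figure of 3 is an assumption for illustration, not a real sensor spec), treating shot noise as the square root of the light collected:

```python
import math

def snr_db(photons, read_noise=3.0):
    """Signal-to-noise ratio of a toy sensor, in decibels.

    Shot noise grows as sqrt(photons); read noise stays constant.
    ISO gain multiplies signal and noise alike, so it can't improve
    SNR -- only collecting more light can.
    """
    total_noise = math.hypot(math.sqrt(photons), read_noise)
    return 20 * math.log10(photons / total_noise)

# A dim exposure is inherently noisier than a bright one, whatever the ISO:
print(round(snr_db(100), 1))    # dim scene
print(round(snr_db(10000), 1))  # 100x the light: far cleaner
```

That fixed gap is the noise the de-noise software is being asked to paper over, which is why machine-learning models that reconstruct plausible detail do so much better than simple smoothing.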

The other common occurrence is when I forget to turn the ISO back down after shooting at night or in the dark. The next day, in broad daylight, I end up with noisy images and unusually fast shutter speeds as the camera compensates for all that unneeded light sensitivity. If I’m not paying attention to the values while shooting, it’s easy to miss the noise by just looking at previews on the camera’s LCD. Has this happened to some of my favorite images? You bet it has.

Incidentally, this is one of those areas where buying newer gear can help. The hardware and software in today’s cameras handle noise better than in the past. My main body is a four-year-old Fujifilm X-T3 that produces perfectly tolerable noise levels at ISO 6400. That has been my ceiling for setting the ISO, but now (depending on the scene of course) I’m comfortable pushing beyond that.

The sound of science

Noise-reduction editing features are not new, but the way we deal with noise has changed a lot in the past few years. In many photo editing apps, the de-noising controls apply algorithms that attempt to smooth out the noise, often resulting in overly soft results.
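To see why those smoothing algorithms soften images, consider the simplest one, a box blur, sketched here in plain Python. Real editors use fancier kernels, but the tradeoff is the same:

```python
def box_blur(values, radius=1):
    """Classic algorithmic noise reduction on a 1D strip of pixels:
    replace each sample with the average of its neighborhood.

    Noise shrinks, but sharp edges smear too -- the "overly soft"
    look of traditional de-noising.
    """
    out = []
    for i in range(len(values)):
        window = values[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

edge = [0, 0, 0, 10, 10, 10]  # a crisp edge between dark and bright
print(box_blur(edge))         # the edge now ramps up over several pixels
```

Averaging can’t tell noise from detail, so both get flattened. Machine-learning de-noisers, by contrast, have seen enough image pairs to estimate which variations are noise and which are texture worth keeping.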

A more effective approach is to use tools built on machine learning models that have processed thousands of noisy images. In “Preprocess Raw files with machine learning for cleaner-looking photos,” I wrote about DxO PureRAW 2, which applies de-noising to raw files when they’re demosaiced.

If you’re working with a JPEG or HEIC file, or a Raw file that’s already gone through that processing phase, apps such as ON1 NoNoise AI (which is available as a stand-alone app/plug-in and also incorporated into ON1 Photo RAW) and Topaz DeNoise AI analyze the image’s noise pattern and use that information to correct it.

Testing various de-noise software

An interior photo of a church with lots of visible noise from using a high ISO.
Viewed as a whole, the noise isn’t terrible. Jeff Carlson

The following image was shot handheld at 1/60 shutter speed and ISO 6400. I’ve adjusted the exposure to brighten the scene, but that’s made the noise more apparent, particularly when I view the image at 200% scale. The noise is especially prominent in the dark areas.

An interior photo of a church with lots of visible noise from using a high ISO.
But zoom in and you can see how noisy the image is. Jeff Carlson

Lightroom

If I apply Lightroom’s Noise Reduction controls, I can remove the noise, but everything gets smudgy (see below).

An interior photo of a church with lots of visible noise from using a high ISO.
Lightroom’s built-in noise reduction isn’t much help here; detail gets smudged along with the noise. Jeff Carlson

ON1 NoNoise AI

When I open the image in ON1 NoNoise AI, the results are striking. The noise is removed from the pews, yet they retain detail and contrast. There’s still a smoothness to them, but not in the same way Lightroom rendered them. This is also the default interpretation, so I could manipulate the Noise Reduction and Sharpening sliders to fine-tune the effect. Keep in mind, too, that we’re pixel-peeping at 200%; the full corrected image looks good.

An interior photo of a church with lots of visible noise from using a high ISO.
Compare the original with the corrected version of the image using the preview slider in ON1 NoNoise AI. Jeff Carlson

Looking at the detail in the center of the photo also reveals how much noise reduction is being applied. Again, this is at 200%, so in this view the statues seem almost plastic. At 100% you can see the noise reduction and the statues look better.

An interior photo of a church with lots of visible noise from using a high ISO.
Detail at the middle of the frame. Jeff Carlson

Topaz DeNoise AI

When I run the same photo through Topaz DeNoise AI, you can see that the software is using what appears to be object recognition to adjust the de-noise correction in separate areas—in this case not as successfully. The cream wall in the back becomes nice and smooth as if it was shot at a low ISO, but the marble at the front is still noisy.

An interior photo of a church with lots of visible noise from using a high ISO.
Topaz DeNoise AI’s default processing on this image ends up fairly scattershot. Jeff Carlson

Bring the noise

As always, your mileage will vary depending on the image, the amount of noise, and other factors. I’m not here to pit these two apps against each other (you can do that yourself—both offer free trial versions that you can test on your own images).

What I want to get across are two things. One, AI is making sizable improvements in how noise reduction is handled. And because AI models are always being fed new data, they tend to improve over time.

But more important is this: Dealing with image noise is no longer the hurdle it once was. A noisy image isn’t automatically headed for the trash bin. Knowing that you can overcome noise easily in software makes you rethink what’s possible when you’re behind the camera. So try capturing a dark scene at a very high ISO, where before you may have just put the camera away.

And don’t be like me: don’t forget to reset the ISO after shooting at high values the night before, even if the software can help you fix the noise.

The post Bring on the noise: How to save high ISO files deemed ‘too noisy’ appeared first on Popular Photography.

Edit stronger, faster, better with custom-built AI-powered presets
https://www.popphoto.com/how-to/ai-powered-presets/
Fri, 29 Jul 2022 01:17:32 +0000

Good old-fashioned presets are more powerful when combined with AI-assisted subject masking.

The post Edit stronger, faster, better with custom-built AI-powered presets appeared first on Popular Photography.

A lighthouse photo with a purple filter applied
The editing platform Luminar Neo offers plenty of AI-powered sky replacement presets. Jeff Carlson

It’s time to confess one of my biases. I’ve traditionally looked down on presets in photo editing software.

I get their utility. With one click you can apply a specific look without messing with sliders or knowing the specifics of how editing controls work. There’s certainly appeal in that, particularly for novice photo editors. And selling presets has become a vector for established photographers to make a little money on the side, or have something to give away in exchange for a newsletter sign-up or other merchandise. (I’m guilty of this too. I made some Luminar 4 presets to go along with a book I wrote years ago.)

I’ve just never seen the value in making my photos look like someone else’s. More often than not, the way I edit a photo depends on what the image itself demands. 

And then I saw the light: presets are not shortcuts, per se, they’re automation. Yes, you can make your photos look like those of your favorite YouTube personality, but a better alternative is to create your own presets that perform repetitive editing actions for you with one click.

For instance, perhaps in all of your portrait photos, you reduce Clarity to soften skin, add a faint vignette around the edges, and boost the shadows. A preset that makes those edits automatically saves you from manipulating a handful of controls to get the same effect each time. In many editing apps, presets affect those sliders directly, so if those shadows end up too bright, you can just knock down the value that the preset applied.
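Thinking of presets as automation, you can picture one as nothing more than a saved bundle of slider moves. Here’s a hypothetical sketch in Python; the slider names and values are invented for illustration, not any app’s actual API:

```python
# Invented slider names/values mirroring the portrait example above.
PORTRAIT_PRESET = {"clarity": -20, "vignette": -10, "shadows": 15}

def apply_preset(edit_state, preset):
    """A preset is just a saved set of slider adjustments applied in
    one go. The sliders stay live afterward, so any value can still
    be tweaked individually."""
    updated = dict(edit_state)
    for slider, amount in preset.items():
        updated[slider] = updated.get(slider, 0) + amount
    return updated

edits = apply_preset({}, PORTRAIT_PRESET)
# Shadows came out too bright? Knock the live slider back down:
edits["shadows"] -= 5
print(edits)  # {'clarity': -20, 'vignette': -10, 'shadows': 10}
```

That last line is the key point from above: because the preset moved real sliders rather than baking in a look, undoing part of it is just another slider move.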

The downside is that a preset affects the entire image. Perhaps you do want to open up the shadows in the background, but not so much that you’re losing detail in the subject’s face. Well, then you’re creating masks for the subject or the background and manipulating those areas independently…and there goes the time you saved by starting with a preset in the first place.

Regular readers of this column no doubt know where this is headed. AI-assisted features that identify the content of images are making their way into presets, allowing you to target different areas automatically. Lightroom Classic and Lightroom desktop recently introduced Adaptive Presets that capitalize on the intelligent masking features in the most recent release. Luminar Neo and Luminar AI incorporate this type of smart selection because they’re both AI-focused at their cores.

Lightroom Adaptive Presets

A photo of a statue against a blue sky
An unedited image. Lightroom’s “Adaptive: Sky” presets let you adjust the look of the sky with a few clicks. And the “Adaptive: Subject” presets do the same for whatever Adobe deems to be the main subject, in this case, the statue. Jeff Carlson

Related: Testing 3 popular AI-powered sky replacement tools

Lightroom Classic and Lightroom desktop include two new groups of presets, “Adaptive: Sky” and “Adaptive: Subject.” When I apply the Sunset sky preset to an unedited photo, the app identifies the sky using its Select Sky mask tool and applies adjustments (specifically to Tint, Clarity, Saturation, and Noise) only to the masked area.
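Conceptually, an adaptive preset is an ordinary adjustment gated by a detected mask. A minimal sketch of that gating (the mask here is hand-written for illustration; in Lightroom it would come from the Select Sky model):

```python
def apply_masked_adjustment(pixels, mask, delta):
    """Apply an adjustment only where the mask is set.

    `pixels` is a strip of brightness values; `mask` holds 1 where the
    detected region (e.g. the sky) is and 0 elsewhere, so unmasked
    pixels pass through untouched.
    """
    return [v + delta * m for v, m in zip(pixels, mask)]

row = [10, 10, 200, 200]   # a strip of pixels: ground, ground, sky, sky
sky_mask = [0, 0, 1, 1]    # hypothetical detector output
print(apply_masked_adjustment(row, sky_mask, -50))  # [10, 10, 150, 150]
```

The sky pixels are darkened while the ground is untouched, which is exactly what distinguishes an adaptive preset from one that adjusts the whole frame.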

A photo of a statue against a purple sky
Only the area that Lightroom Classic identified as the sky is adjusted after choosing the “Adaptive: Sky Sunset” preset. Jeff Carlson

Similarly, if I click the “Adaptive: Subject Pop” preset, the app selects what it thinks is the subject and applies the correction, in this case, small nudges to Exposure, Texture, and Clarity.

A Lightroom mask on a statue.
“Adaptive: Subject Pop” selects what Lightroom believes to be the main subject of an image. Jeff Carlson

Depending on the image, that might be all the edits you want to make. Or you can build on those adjustments.

A Lightroom mask on a statue.
The final image with AI-powered presets applied to both the sky and the statue. Jeff Carlson

Related: ‘Photoshop on the Web’ will soon be free for anyone to use

Now let’s go back to the portrait edits suggested above. I can apply a subtle vignette to the entire image, then switch to the Masking tool and create a new “Select Subject” mask for the people in the shot. With that mask active, I increase the Shadows value a little and reduce Clarity to lightly soften the subjects.

A photo of a couple
Increasing Shadows and bringing down Clarity brightens and softens the subjects’ skin in this portrait. Jeff Carlson

Since this photo is part of a portrait session, I have many similar shots. Instead of selecting the subject every time, I’ll click the “Add New Presets” button in the Presets panel, make sure the Masking option is enabled, give the preset a name, and click Create. For subsequent photos, I can then choose the new preset to apply those edits. Even if the preset applies only to this photo shoot, it can still save a lot of time.

Lightroom presets
Make sure the Masking option is turned on when creating a new adaptive preset; it’s deselected by default.

Luminar Presets

When Luminar Neo and Luminar AI open an image, they scan the photo’s contents, identifying subjects and skies before any edits have been made. When you apply one of the apps’ built-in presets, the edits may include adjustments to those specific areas.

A lighthouse photo
Luminar Neo offers a variety of sky-replacement presets. Jeff Carlson

For an extreme example, in Luminar Neo’s Presets module, the “Sunsets Collection” includes a Toscana preset that applies Color, Details, and Enhance AI settings that affect the entire image. But it also uses the Sky AI tool to swap in an entirely new sky.

The portrait editing tools in Luminar by default fall into this category, because they look for faces and bodies and make adjustments, such as skin smoothing and eye enhancement, to only those areas. Creating a new user preset with one of the AI-based tools targets the applicable sections.

A lighthouse photo with a purple filter applied
The Toscana preset in Luminar Neo is a good example of how a preset can affect a specific area of the image, replacing the sky using the Sky AI tool. Jeff Carlson

Preset Choices

The Luminar and Lightroom apps also use some AI detection to recommend existing presets based on the content of the photo you’re editing, although I find the choices to be hit or miss. Lightroom gathers suggestions based on presets that its users have shared, grouped into categories such as Subtle, Strong, and B&W. They tend to run the gamut of effects and color treatments, and for me that feels more like trying to put my image into someone else’s style.

Instead, I’ll stick to presets’ secret weapon: creating my own to automate edits that would otherwise take much longer to make by hand.

The post Edit stronger, faster, better with custom-built AI-powered presets appeared first on Popular Photography.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Does bird detect AF really work? https://www.popphoto.com/how-to/bird-photography-autofocus/ Wed, 13 Jul 2022 20:07:22 +0000 https://www.popphoto.com/?p=178580
Common Tern.
Common Tern. ISO 1250, 1/2000 sec, f/6.3, @ 600mm. © Marie Read

Flagship cameras like the Nikon Z9 and Sony a1 now include bird-specific autofocus modes, but are they effective? We investigate.

The post Does bird detect AF really work? appeared first on Popular Photography.

]]>

We like to say it’s not about the gear, that a photographer with any camera can create good photos. And in most situations that’s true.

Then there’s bird photography.

To get the highest-quality photos of feathered subjects, you need special equipment: zoom lenses with a lot of reach, plus (ideally) image stabilization to compensate for the camera shake that long focal lengths exaggerate. You also need a camera body with a focusing system that can lock onto such dynamic aerial targets.

Those systems are reinforced with AI technologies. Some high-end cameras now even feature Bird AF (autofocus) modes that identify and track birds, some with priority for focusing on avian eyes. Examples include the Nikon Z9, Sony a1, Fujifilm X-H2S, and OM System OM-1 bodies. Other models, like the Sony a9, include Animal AF modes that look for animals and birds of all types.

These use the same detection principles as Face and Eye AF modes, but with the added complexity that birds and animals move faster than your Uncle George. An eagle yanking fish out of a creek for dinner isn’t going to stop and pose, or come closer to the camera if you ask it nicely (then again, George might not either). But with the right equipment and AI assistance, you can capture that moment with tack-sharp focus.

A golden eagle.
A golden eagle. ISO 500, 1/320 sec, f/4.5, @ 200mm. Daniel Hernanz Ramos/Getty Images

Related: Tips from the pros: 3 keys to taking better bird pictures

How it works

A number of elements go into making Bird AF and Animal AF work well. On top of the core autofocus systems, the camera needs to understand what it’s looking at. The camera manufacturers’ developers feed thousands of photos containing birds and animals (and cars, planes, and other objects on some systems) to the autofocus software, and train it to recognize similar visual patterns.

The software also requires fast hardware to process what the camera sees in real time. The image sensor captures a frame of incoming light, passes it to an image processor that determines whether anything in the frame matches the objects it has been trained on, and then directs the lens mechanisms to adjust focus. All of that happens in milliseconds. Then the sensor delivers a new frame of data and the process repeats, giving you real-time tracking and focus lock for the moment you decide to press the shutter button.
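That capture, detect, refocus cycle can be sketched as a simple control loop. Everything below is a stand-in (a stub detector and an abstract lens position), meant only to show the shape of the process:

```python
def detect_eye(frame):
    """Stand-in for the trained detection model: return the subject's
    position in the frame, or None when nothing is recognized."""
    return frame.get("eye_position")

def autofocus_loop(frames, focus_position=0.0, gain=0.5):
    """For each frame, nudge the (abstract) lens toward the detected subject.
    A gain below 1 mimics damped, 'sticky' tracking between detections."""
    for frame in frames:
        target = detect_eye(frame)
        if target is not None:
            focus_position += gain * (target - focus_position)
        yield focus_position  # where the lens sits after this frame

# Three frames of a moving bird; the middle frame is a missed detection
frames = [{"eye_position": 1.0}, {}, {"eye_position": 2.0}]
positions = list(autofocus_loop(frames))
# → [0.5, 0.5, 1.25]
```

Note how the missed detection simply holds the last focus position rather than hunting, which is roughly the behavior a low "tracking sensitivity" setting encourages.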

Bird AF in the wild

The technology is impressive, but how well does it work in the field? I reached out to two photographers I know to get their perspectives. Hudson Henry shoots all sorts of subjects but recently returned from a workshop in Costa Rica where he hauled a Nikon Z9 and an AF-S NIKKOR 800mm f/5.6E FL ED VR into the jungle to capture birds, monkeys, and other elusive inhabitants. Marie Read is the author of the book Mastering Bird Photography (Rocky Nook).

When I asked Henry about his experience on his trip, he replied by email, “I can tell you [Bird AF] worked just shockingly well, with the Z9 picking up just about every bird’s eye I had reasonably sized in the frame.”

Occasionally, the feature was spoofed by areas that were similar to bird eyes. “There were lizards with big spots on the sides of their faces that fooled it,” he says, “and butterflies that had eye-like markings that wanted to lock as eyes, necessitating single point selection at times. But all-in-all it was just shockingly good on a wide array of birds and wildlife.”

Nikon Z9
In addition to a standard subject tracking mode, the flagship Nikon Z9 also includes a bird tracking mode. Nikon

Read’s experience is with a Sony a9, which offers Animal AF, not specifically Bird AF. But she says the shift to the technology has been substantial.

She writes, “It’s hard for me to tease out the effects of the ‘animal eye’ function from the general increase in the proportion of sharp shots that I experienced after I made the switch to the mirrorless Sony a9 from Canon DSLRs three years ago. I get many more in-focus keepers in a burst of images than I ever could have achieved before the switch. Sony’s tracking AF is astonishing!”

She also points out the significance of Bird AF and Animal AF features for anyone looking to get into bird photography, writing, “Scroll through any online nature photography forum and it will be obvious that there’s been a huge increase in great bird shots, including some amazing action images, in recent years. The downside is now, the bar has been raised so high.” The best way to stand out from the crowd? Become much more creative with compositions and lighting.

Getting the shot

Henry and Read both offer their strategies for using AF tracking, including bird and animal detect features, to capture their targets.

To get the shot, Henry takes full advantage of the Nikon Z9’s customizability, setting up the camera to seamlessly switch between a variety of AF modes. He writes, “I use a hybrid AF method for birds and wildlife that I teach on my YouTube channel. ‘Wide’ or ‘Small’ area AF on the shutter release (kind of like group in the DSLR days) for fast erratic subjects like birds in flight, with a conversion to ‘3D-tracking’ (Nikon’s name for subject tracking) on the back button to follow a subject you pick up all over the frame. I program a front function button that converts the shutter AF to single point AF-C for those subjects where the eye detection is missing and you need to direct the point. But I leave the 3D-tracking on the back button. A press of the Function 3 button flips the shutter button AF between ‘Wide Area’ and ‘Single Point’ that way.”

A parrot.
Scarlet macaw. ISO 140, 1/125 sec, f/6.3, @ 800mm. © Hudson Henry

Related: Best cameras for wildlife photography

As with so much of photography, a varied approach is required depending on the circumstances. Read shares, “As a Sony a9 shooter, for me the important things are selecting the optimal AF area size and whether or not to use tracking. Because my subjects are usually moving, in general, I use ‘Tracking: Flexible Spot Medium.’ I usually start out with the AF area positioned in the center of the screen but then I move it around as necessary for composition. For birds in flight where the flight pattern is extremely fast and erratic (think small terns or swallows), ‘Tracking: Zone’ can work well, but [it works] best if against a clean background. One more thing to fine-tune AF is via ‘Tracking Sensitivity.’ Sony offers settings from 1 (Locked On) to 5 (Responsive). I have mine set to 2.”

Sometimes the tracking isn’t necessarily better than good old-fashioned manual spot-focusing. Read writes, “It’s not the best idea to shoot a bird against a busy background, especially if it is small in the frame, but in that case try an even smaller AF area (i.e. Sony’s ‘Tracking: Flexible Spot Small’). Shooting through vegetation, which can give a lovely vignetted effect if done properly, is another place where you’d want to use the smallest AF area. You might need to turn the tracking function off to avoid the camera focusing back and forth.” 

Good bill hunting

Good bird photography still requires more equipment than your average camera body and kit lens. And of course, you need to put yourself in the position to photograph birds in their habitats. But with Bird AF and Animal AF technologies in the latest camera models, you’re far more likely to end up with more sharp keepers than in the past.



]]>
Excire Foto 2022 can analyze and keyword your entire photo library using AI https://www.popphoto.com/how-to/excire-foto-2022/ Fri, 24 Jun 2022 00:08:53 +0000 https://www.popphoto.com/?p=176189
Photographer at a computer importing photos
Thanks to tools like automatic keywording and duplicate detection, metadata management can take little effort. Getty Images

Tidy up your image database with just a few clicks of the mouse.

The post Excire Foto 2022 can analyze and keyword your entire photo library using AI appeared first on Popular Photography.

]]>

We may earn revenue from the products available on this page and participate in affiliate programs. Learn more ›

I sometimes feel like the odd-photographer-out when it comes to working with my photo library. I’ve always seen the value of tagging images with additional metadata such as keywords and the names of people who appear (I’ve even written a book about the topic). 

However, many people just don’t want to bother. It’s an extra step—an impediment really—before they can review, edit, and share their images. It requires switching into a word-based mindset instead of an image-based mindset. And, well, it’s boring.

And yet, there will come a time when you need to find something in your ever-growing image collection, and as you scroll through thumbnails, trying to remember dates and events from the past, you’ll think, “Maybe I should have done some keywording at some point.”

In an earlier column, I took a high-level look at utilities that use AI technologies to help with this task. One of the standouts was Excire Foto, which has just been updated to version 2.0 (and branded Excire Foto 2022). I was struck by its ability to automatically tag photos, and by the granularity available when searching for images. Let’s take it for a spin.

Related: The best photo editor for every photographer

A few workflow notes

Excire Foto is a standalone app for macOS or Windows, which means it serves as your photo library manager. You point it at existing folders of images; you can also use the Copy and Add command to read images from a camera or memory card and save them to a location of your choice. If you use a managed catalog such as Lightroom or Capture One that tracks metadata in its own way, Excire Foto won’t work as well. A separate product, Excire Search 2, is a plug-in for Lightroom Classic.

Or, Excire Foto could be the first step in your workflow: import images into it, tag and rate them, save the metadata to a disk (more on that just ahead), and then ingest the photos into the managed photo editing app of your choice.

Since the app manages your library, it doesn’t offer any photo editing features. Instead, you can send an image to another app, such as Photoshop, but its edits are not round-tripped back to Excire Foto.

For my testing, I copied 12,574 photos (593 GB) from my main photo storage to an external SSD connected to my 2021 16-inch MacBook Pro, which is configured with an M1 Max processor. Importing them into Excire Foto took about 38 minutes, which entailed adding the images to its database, generating thumbnail previews, and analyzing the photos for content. Performance will depend on hardware, particularly in the analysis stage, but it’s safe to say that adding a large number of photos is a task to run while you’re doing something else, or overnight. Importing a typical day’s worth of 194 images took less than a minute.

Automatic keywording

excire foto 2022
Review and rate photos in Excire Foto 2022. Jeff Carlson

To me, those numbers are pretty impressive, considering the software is using machine learning to identify objects and scenes it recognizes. But still, do you really care how long an app takes to import images? Probably not.

But this is what you will care about: In many other apps, the next step after importing would be to go through your images and tag them with relevant terms to make them easier to find later. In Excire Foto, at this point all the images include automatically generated keywords—much of the work is already done for you. You can then jump to reviewing the photos by assigning star ratings and color labels, and quickly pick out the keepers.

I know I sound like a great big photo nerd about this, but it’s exactly the type of situation where computational photography can make a big difference. To not care about keywords and still get the advantages of tagged photos without any extra work? Magic. 

excire foto 2022
The keywords in blue were created by the app, while keywords in gray were ones I added manually. Jeff Carlson

I find that Excire Foto does a decent-to-good job of identifying objects and characteristics in the photos. It doesn’t catch everything, and occasionally adds keywords that aren’t accurate. That’s where manual intervention comes in. You can manually delete keywords or add new ones to flesh out the metadata with tags you’re likely to search for later. For example, I like to add the season name so I can quickly locate autumn or winter scenes. Tags that the software applies appear with blue outlines, while tags you add show up with gray outlines. It’s also easy to copy and paste keywords among multiple images.

All of the metadata is stored in the app’s database, not with the images themselves, so you’re not cluttering up your image directories with app-specific files (a pet peeve of mine, perhaps because I end up testing so many different ones). If you prefer to keep the data with the files, you can opt to always use sidecar files, which writes the information to standard .XMP text files. Or, you can manually store the metadata in sidecar files for just the images you want.
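Sidecar files are worth a quick look because they are plain XML that any tool can parse. As a rough idea of their shape (a simplified sketch using the standard dc:subject keyword bag, not Excire's exact output), a minimal sidecar writer might look like this:

```python
from pathlib import Path

def write_sidecar(image_path, keywords):
    """Write a minimal XMP sidecar next to the image, storing keywords in
    the standard dc:subject bag. Real sidecars carry many more namespaces
    and fields; this is only the skeleton."""
    items = "\n".join(f"      <rdf:li>{kw}</rdf:li>" for kw in keywords)
    xmp = (
        '<x:xmpmeta xmlns:x="adobe:ns:meta/">\n'
        ' <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"\n'
        '          xmlns:dc="http://purl.org/dc/elements/1.1/">\n'
        '  <rdf:Description rdf:about="">\n'
        '   <dc:subject>\n'
        '    <rdf:Bag>\n'
        f'{items}\n'
        '    </rdf:Bag>\n'
        '   </dc:subject>\n'
        '  </rdf:Description>\n'
        ' </rdf:RDF>\n'
        '</x:xmpmeta>\n'
    )
    sidecar = Path(image_path).with_suffix(".xmp")  # DSCF3161.jpg -> DSCF3161.xmp
    sidecar.write_text(xmp, encoding="utf-8")
    return sidecar

sidecar = write_sidecar("DSCF3161.jpg", ["autumn", "lighthouse"])
```

Because the keywords live in a standard schema, other apps that understand XMP sidecars can pick them up if you later move your library.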

Search that takes search seriously

excire foto 2022
Explore the keyword hierarchy tree to perform specific term searches. Jeff Carlson

The flip side of working with keywords and other metadata is how quickly things can get complicated. Most apps try to keep the search as simple as possible to appeal to the most people, but Excire Foto embraces multiple ways to search for photos.

A keyword search lets you browse the existing tags and group them together; as you build criteria, you can see how many matches are made before running the search. The search results panel also keeps recent searches available for quick access.

excire foto 2022
You can get pretty darn specific with your searches. Jeff Carlson

Or consider the ability to find people in photos. The Find Faces search gives you options for the number of faces that appear, approximate ages, the ratio of male to female, and a preference for smiling or not smiling expressions.

excire foto 2022
The Find Faces interface allows you to search for particular attributes. Jeff Carlson

Curiously, the people search lacks the ability to name individuals. To locate a specific person you must open an image in which they appear, click the Find People button, select the box on the person’s face, and then run the search. You can save that search as a collection (such as “Jeff”), but it’s not dynamically updated. If you add new photos of that person, you need to manually add them to the collection.

excire foto 2022
Search for a person by first opening an image in which they appear and selecting their face identifier. Jeff Carlson

It appears that the software isn’t necessarily built to identify specific people; instead, it looks for shared characteristics based on whichever source image is chosen. Some searches on my face brought up hundreds of results, while others returned far fewer hits.

Identifying Potential Duplicates

New in Excire Foto 2022 is a feature for locating duplicate photos. This is a tricky task because what you and I think of as a duplicate might not match what the software identifies. For instance, in my library, I was surprised that performing a duplicate search set to find exact duplicates brought up only 10 matches.

That’s because this criterion looks for images that are the exact same file, not just visually similar. Those photos turned out to be shots that had been imported twice for some reason (indicated by their file names: DSCF3161.jpg and DSCF3161-2.jpg).
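An "exact same file" check can be done by comparing file contents, typically via a hash, which ignores file names entirely. A minimal sketch of the idea (not necessarily how Excire Foto implements it; the byte strings and the third file name are invented stand-ins for real image data):

```python
import hashlib
from collections import defaultdict

def find_exact_duplicates(files):
    """Group files whose bytes are identical, regardless of file name.
    Takes (name, bytes) pairs; returns only groups with two or more members."""
    groups = defaultdict(list)
    for name, data in files:
        groups[hashlib.sha256(data).hexdigest()].append(name)
    return [names for names in groups.values() if len(names) > 1]

# The same image imported twice under different names, plus a distinct shot
files = [
    ("DSCF3161.jpg", b"image-bytes-A"),
    ("DSCF3161-2.jpg", b"image-bytes-A"),
    ("DSCF3200.jpg", b"image-bytes-B"),
]
dupes = find_exact_duplicates(files)
# → [['DSCF3161.jpg', 'DSCF3161-2.jpg']]
```

Near-duplicate detection is a different problem: it compares visual similarity rather than raw bytes, which is why it can surface bursts and Raw+JPEG pairs.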

excire foto 2022
How duplicates like this get into one’s library will forever be a mystery. Jeff Carlson

When I performed a duplicate search with the criteria set to Near Duplicates: Strict, I got more of what I expected. Among the 1,007 matches, many were groups of burst photos, along with image files where I’d shot in Raw+JPEG mode and both versions were imported. The Duplicate Flagging Assistant can reject non-Raw images, or, in the advanced options, you can drill down and flag photos with more specific criteria, such as JPEGs whose short edge measures less than 1024 pixels.
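Advanced flagging criteria like these amount to simple rules over file attributes. A sketch of the short-edge rule described above (the field names and sample files here are hypothetical):

```python
def flag_small_jpegs(photos, min_short_edge=1024):
    """Flag JPEGs whose short edge falls under the threshold, the kind of
    rule a duplicate-flagging assistant can apply automatically."""
    flagged = []
    for photo in photos:
        short_edge = min(photo["width"], photo["height"])
        if photo["format"] == "JPEG" and short_edge < min_short_edge:
            flagged.append(photo["name"])
    return flagged

photos = [
    {"name": "a.jpg", "format": "JPEG", "width": 800, "height": 600},
    {"name": "b.raf", "format": "RAW", "width": 6000, "height": 4000},
    {"name": "c.jpg", "format": "JPEG", "width": 4000, "height": 3000},
]
flagged = flag_small_jpegs(photos)
# → ['a.jpg']
```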

excire foto 2022
Choose common presets for filtering possible duplicates, or click Advanced Settings to access more specific criteria. Jeff Carlson

As with all duplicate finding features, the software’s job is primarily to present you with the possible matches. It’s up to you to review the results and determine which images should be flagged or tossed.

End Thoughts

It’s always tempting to jump straight to editing images, but ignoring metadata catches us out at some point. When a tool such as Excire Foto can shoulder a large portion of that work, we get to spend more time on editing, which is the more exciting part of the post-production process, anyway.



]]>
Meet Apple’s powerful new M2 MacBook Air https://www.popphoto.com/how-to/apple-wwdc-announcements-2022/ Wed, 08 Jun 2022 21:33:24 +0000 https://www.popphoto.com/?p=174399
Photoshop running on the new MacBook Air.
Photoshop running on the new MacBook Air. Apple

Plus: a first look at macOS 13 Ventura, iOS 16, and more.

The post Meet Apple’s powerful new M2 MacBook Air appeared first on Popular Photography.

]]>

Apple’s Worldwide Developer Conference (WWDC) kicked off this week with the announcement of a new MacBook Air and first looks at macOS 13 Ventura, iOS 16, iPadOS 16, and watchOS 9. It’s a giant stew of features and technologies meant to excite developers and prepare them for the software releases later this year.

But what about photographers? Several photo-related changes are coming, including improvements that take advantage of computational photography. Given this column’s interest in AI and ML technologies, that’s what I’m mostly going to focus on here.

Keep in mind that the operating system releases are currently available only as betas to developers, with full versions likely coming in September or October. As such, it’s possible that some announced features may be delayed or canceled before then. Also, Apple usually saves some details in reserve, particularly regarding the hardware capabilities of new iPhone models.

That said, here are the things that stood out to me.

The M2-Based MacBook Air and MacBook Pro

Photographers’ infamous Gear Acquisition Syndrome isn’t limited to camera bodies and lenses. The redesigned MacBook Air was the noteworthy hardware announcement, specifically because it’s powered by a new M2 processor.

The new MacBook Air uses Apple's M2 chip.
The new MacBook Air uses Apple’s latest M2 chip. Apple

Related: Testing the advantages of Apple’s ProRAW format

In short, the M2 is faster and better than the M1, which itself was a stark improvement over the Intel-based processors Apple had been using before transitioning to its own silicon. A few standout specs will interest photographers. Memory bandwidth is 100 GB/s, 50 percent more than the M1’s, which will speed up operations in general. (The M-series architecture uses a unified pool of memory for CPU and GPU operations instead of discrete chipsets, increasing performance; up to 24 GB of memory is available on the M2.)

The M2’s 20 billion transistors require a larger die than the M1’s
The M2’s 20 billion transistors require a larger die than the M1’s. Apple

Photographers and videographers will also see improvements due to 10 GPU cores, compared to 8 on the M1, and an improved onboard media engine that supports high bandwidth 8K H.264 and HEVC video decoding, a ProRes video engine enabling playback of multiple 8K and 4K video streams, and a new image signal processor (ISP) that offers improved image noise reduction.

In short, the M2 offers more power while also being highly efficient and battery-friendly. (The battery life I get on my 2021 MacBook Pro with M1 Max processor is unreal compared to my 2019 Intel-based model, and I’ve heard the fan spin up only on a handful of occasions over the past 6 months.)

The MacBook Air’s design reflects the new MacBook Pro’s flattened profile—goodbye to the distinctive wedge shape that defined the Air since its introduction—and includes two Thunderbolt ports and a MagSafe charging port. The screen is now a 13.6-inch Liquid Retina display that supports 1 billion colors and can go up to 500 nits of brightness.

The MacBook Air is just as slim as its predecessor and available in four colors.
The MacBook Air is just as slim as its predecessor and available in four colors. Apple

Apple also announced a 13-inch MacBook Pro with an M2 processor in the same older design, which includes a Touch Bar but no MagSafe connector. The slight advantage of this model over the new MacBook Air is the inclusion of a fan for active cooling, which allows for longer sustained processing.

The M2 MacBook Air starts at $1199, and the M2 MacBook Pro starts at $1299. The M1-powered MacBook Air remains available as the $999 entry-level option.

Continuity Camera

Next on my list of interests is the Continuity Camera feature. Continuity refers to technologies that let you pass information between nearby Apple devices, such as copying text on the Mac and pasting it on an iPad. The Continuity Camera lets you use an iPhone 11 or later as a webcam.

Using a phone as a webcam isn’t new; I’ve long used Reincubate Camo software for this (and full disclosure, wrote a few articles for them). Apple brings its Center Stage technology for following subjects in the frame and Portrait Mode for artificially softening the background. It also features a Studio Light setting that boosts the exposure on the subject (you) and darkens the background to simulate external illumination like a ring light. Apple does these things by using machine learning to identify the subject.

But more intriguing is a new Desk View mode: It uses the iPhone’s Ultra-Wide camera and likely some AI technology to apply extreme distortion correction to display what’s on your desk as if you’re looking through a down-facing camera mounted above you. Other participants on the video call still see you in another frame, presumably captured by the normal Wide camera at the same time.

Continuity Camera uses the iPhone’s cameras as webcams and to show a top-down view of the desktop.
Continuity Camera uses the iPhone’s cameras as webcams to show a top-down view of the desktop. Apple

Acting on Photo Content

A few new features take advantage of the software’s ability to identify content within images and act on it.

The iPhone in iOS 16 will have a configurable lock screen with options for changing the typeface of the current time and including widgets for getting quick information at a glance. If the wallpaper image includes depth information, such as a Portrait Mode photo of someone, the screen automatically places the time behind them (a feature introduced in last year’s watchOS 8 update). It can also suggest photos from your library that would work well as lock screen images.

Awareness of subjects in a photo enables the new iOS 16 lock screen to simulate depth by obscuring the time.
Awareness of subjects in a photo enables the new iOS 16 lock screen to simulate depth by obscuring the time. Apple

Another clever bit of subject recognition is the ability to lift a subject from the background. You can touch and hold a subject, which is automatically identified and extracted using machine learning, and then drag or copy it to another app, such as Messages.

Touch to select a subject and then drag it to another app.
Touch to select a subject and then drag it to another app. Apple

The previous iOS and iPadOS updates added Live Text, which lets you select any text that appears in an image. In the next version, you can also pause any frame of video and interact with the text. Developers will be able to add quick actions to do things like convert currency or translate text.

Photos App Improvements

Apple’s Photos app has always occupied an odd space: it’s the default place for saving and organizing images on each platform, but needs to have enough broad appeal that it doesn’t turn off average users who aren’t looking for complexity. I suspect many photographers turn to apps such as Lightroom or Capture One, but we all still rely on Photos as the gatekeeper for iPhone photos.

In the next update, Apple is introducing iCloud Shared Photo Library, a way for people with iCloud family plans to share a separate photo library with up to six members. Each person can share and receive all the photos, bringing photos from family events together in one library without encroaching on individual personal libraries.

An iCloud Shared Library collects photos from every family member.
An iCloud Shared Library collects photos from every family member. Apple

You can populate the library manually, or use person recognition to specify photos where two or more people are together. Or, you can set it up so that when family members are together, photos will automatically be sent to the shared library.

Other Photos improvements include a way to detect duplicates in the Photos app, the ability to copy and paste adjustments between photos or in batches, and more granular undo and redo options while editing.

Reference Mode on iPad Pro

The last thing I want to mention isn’t related to computational photography, but it’s cool nonetheless. Currently, you can use the Sidecar feature in macOS to use an iPad as an additional display, which is great when you need more screen real estate.

In macOS Ventura and iPadOS 16, an iPad Pro can be set up as a reference monitor to view color-consistent photos and videos as you edit. The catch is that according to Apple’s footnotes, only the 12.9-inch iPad Pro with its gorgeous Liquid Retina XDR display will work, and the Mac must have an M1 or M2 processor. (I added “gorgeous” there; it’s not in the footnotes.)

Use the 12.9-inch M1 iPad Pro as a color-accurate reference monitor.
Use the 12.9-inch M1 iPad Pro as a color-accurate reference monitor. Apple

Speaking of screen real estate, iPadOS 16 finally—finally!—enables you to connect a single external display (up to 6K resolution) and use it to extend the iPad desktop, not just mirror the image. Again, that’s limited to models with the M1 processor, which currently includes the iPad Pro and the iPad Air. But if you’re the type who does a lot of work or photo editing on the iPad, external display support will give you more breathing room.

Extend the iPad Pro’s desktop by connecting an external display.
Extend the iPad Pro’s desktop by connecting an external display. Apple

A new feature called Stage Manager breaks apps out of their full-screen modes to enable up to four simultaneous app windows on the iPad and on the external display. If you’ve ever felt constrained running apps like Lightroom and Photoshop side-by-side in Split View on the same iPad screen, Stage Manager should open things up nicely. Another feature, Display Zoom, can also increase the pixel density to reveal more information on the M1-based iPad’s screen.

More to Come

I’ve focused mostly on features that affect photographers, but there are plenty of other new things coming in the fall. If nothing else, the iPad finally has its own Weather app and the Mac has a full Clock app. That may not sound like much, but it helps when you’re huddled in your car wondering if the rain will let up enough to capture dramatic clouds before sundown, or when you want a timer to remind you to get to bed at a respectable hour while you’re lost in editing.

The post Meet Apple’s powerful new M2 MacBook Air appeared first on Popular Photography.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

Turbocharge your wedding edits with the help of AI https://www.popphoto.com/how-to/edit-wedding-photos-faster-ai/ Fri, 27 May 2022 12:00:00 +0000 https://www.popphoto.com/?p=172905
Lightroom photo
Carol Harrold

Here's how AI tools in Lightroom, Photoshop, and Luminar Neo can help speed up the time it takes to edit a wedding gallery.

The post Turbocharge your wedding edits with the help of AI appeared first on Popular Photography.


Photographing someone’s Big Day is a beautiful—and stressful—job, especially if you’re not a seasoned pro. This week, PopPhoto is serving up our best advice for capturing that special kind of joy.

A typical wedding day photoshoot can result in thousands of images. After the photographer has spent hours actively capturing the event, hours of culling and editing still loom ahead of them. In an earlier Smarter Image column, I offered an overview of apps designed to sort and edit your photos faster. For this installment, I want to look at the editing side and how AI tools can shave off some of that time.

Consider this situation: You’ve done your initial sort and now you have a series of photos of the bride. They were made in the same location, but the bride strikes different poses and the framing is slightly different from shot to shot. They could all use some editing, and because they’re all similar they’d get the same edits.

This is where automation comes in. In many apps, you can apply edits to one of the images and then copy or sync those edits to the rest. However, that typically works globally, adjusting the tone and color evenly to each full image. What if the overall photo is fine but you want to increase the exposure on just the bride to make her stand out against the backdrop? Well, then you’re back to editing each image individually.

But not necessarily. The advantage of AI-assisted processing is that the software identifies objects within a scene. When the software can pick out the bride and apply edits only to her—even if she moves within the frame—it can save a lot of time and effort.
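The idea can be sketched in a few lines of Python/NumPy. This is a minimal illustration, not any app's actual pipeline: `detect_subject` is a stand-in stub for the AI model, and the point is that the mask is recomputed for each image while the edit parameters stay the same.

```python
import numpy as np

def detect_subject(image):
    """Stand-in for an AI subject detector: returns a 0..1 mask.
    Here we fake it by treating brighter-than-average pixels as the subject."""
    luma = image.mean(axis=-1)
    return (luma > luma.mean()).astype(float)

def apply_masked_exposure(image, mask, stops=1.0):
    """Brighten only the masked pixels by the given number of stops."""
    gain = 2.0 ** stops
    return np.clip(image * (1 + (gain - 1) * mask[..., None]), 0.0, 1.0)

def batch_edit(images, stops=1.0):
    """Recompute the subject mask per image, then apply the same edit to each."""
    return [apply_masked_exposure(img, detect_subject(img), stops) for img in images]
```

The background pixels pass through untouched; only the detected subject is pushed brighter, no matter where it sits in each frame.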

For this task I’m looking specifically at three apps: Adobe Photoshop, Adobe Lightroom Classic (the same features appear in the cloud-based Lightroom desktop app), and Skylum Luminar Neo. These apps can identify people and make selective edits on them, and batch-apply those edits to other images.

First, let’s look at the example photos I’m working with to identify what they need. Seattle-based photographer Carol Harrold of Carol Harrold Photography graciously allowed me to use a series of photos from a recent wedding shoot. These are Nikon .NEF Raw images straight out of the camera.

An unedited set of six similar photos of the bride. Carol Harrold

The bride is in shadow to avoid harsh highlights on a sunny day, so as a consequence I think she would benefit from additional exposure. Although she’s posing in one spot, she faces two different directions and naturally appears in slightly different positions within each shot. A single mask copied between the images wouldn’t be accurate. For the purposes of this article, I’m only focusing on the exposure on the bride, and not making other adjustments.

Adobe Photoshop

One of Photoshop’s superpowers is the Actions panel, which is where you can automate all sorts of things in the app. And for our purposes, that includes the ability to use the new Select Subject command in an automation.

In this case, I’ve opened the original Raw files, which processes them through the Adobe Camera Raw module; I kept the settings there unchanged. Knowing that I want to apply the same settings to all of the files, I’ll open the Actions panel and click the [+] button to create a new action, name it, and start recording. 

Next, I’ll choose Select > Subject, which selects the bride and adds that as a step in the action.

Selecting the subject while recording an action inserts the Select > Subject command as a step. Carol Harrold

To adjust the exposure within the selection, I’ll create a new Curves adjustment layer. Doing so automatically makes a mask from the selection, and when I adjust the curve’s properties to lighten the bride, the effect applies only in that selection.

I’m using a Curves adjustment to increase exposure on the bride in the first photo, though I could use other tools as well. Carol Harrold
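Under the hood, a Curves adjustment is just a remapping of tonal values, and the layer mask confines that remap to the selection. Here's a rough NumPy sketch of the idea, with made-up control points (this is an illustration, not Photoshop's actual math):

```python
import numpy as np

def apply_curve_masked(image, mask, xs, ys):
    """Remap tones with a piecewise-linear curve, only where mask is 1.
    xs/ys are curve control points in 0..1, like handles in the Curves dialog."""
    curved = np.interp(image, xs, ys)   # remap every channel through the curve
    m = mask[..., None]                 # broadcast the mask across RGB
    return curved * m + image * (1 - m)

# Lift the midtones: input 0.5 maps to 0.65, endpoints stay pinned.
xs, ys = [0.0, 0.5, 1.0], [0.0, 0.65, 1.0]
```

Pixels outside the mask keep their original values, which is exactly what the adjustment layer's mask accomplishes.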

In the interests of keeping things simple for this example, I’ll stick to just that adjustment. In the Actions panel, I’ll click the Stop Recording button. Now I have an action that will select any subject in a photo and increase the exposure using the curve adjustment.

To apply the edits to the set of photos, I’ll choose File > Automate > Batch, and choose the recorded action to run. Since all the images are currently open in Photoshop, I’ll set the Source as Opened Files and the Destination as None, which runs the action on the files without saving them. I could just as easily point it at a folder on disk and create new edited versions.

It’s not exciting looking, but the Batch dialog is what makes the automation possible between images.

When I click OK, the action runs and the bride is brightened in each of the images.

In a few seconds, the batch process applies the edits and lightens the bride in the other photos. Carol Harrold

The results can seem pretty magical when you consider the time saved by not processing each photo individually, but as with any task involving craftsmanship, make sure to check the details. It’s great that Photoshop can detect the subject, but we’re also assuming it’s detecting subjects correctly each time. If we zoom in on one, for example, part of the bride’s shoulder was not selected, leading to a tone mismatch.

Watch for areas the AI tool might have missed, like this section of the bride’s shoulder. Carol Harrold

The upside is that the selection exists as a mask on the Curves layer. All I have to do is select the missed area with the Quick Selection tool and fill it with white to make the adjustment appear there; I could also paint it in with the Brush tool. So expect to apply some touch-ups here and there.

Filling in that portion of the mask fixes the missed selection. Carol Harrold

Lightroom Classic and Lightroom

Photographers who use Lightroom Classic and Lightroom are no doubt familiar with the ability to sync Develop settings among multiple photos—it’s a great way to apply a specific look or LUT to an entire set that could be a signature style or even just a subtle softening effect. The Lightroom apps also incorporate a Select Subject command, making it easy to mask the bride and make our adjustments.

With the bride masked, I can increase the exposure just on her. Carol Harrold

In Lightroom Classic, with one photo edited, I can return to the Library module, select the other similar images, and click the Sync Settings button, or choose Photo > Develop Settings > Sync Settings. (To do the same in Lightroom desktop, select the edited photo in the All Photos view; choose Photo > Copy Edit Settings; select the other images you want to change; and then choose Photo > Paste Edit Settings.)

However, there’s a catch: the Select Subject mask needs to be recomputed before the synced settings will take effect. In Lightroom Classic, when you click Sync Settings, the dialog that appears does not select the Masking option by default, and includes the message “AI-powered selections need to be recomputed on the target photo.”

Lightroom Classic needs to identify the subject in each image that is synced from the original edit. Carol Harrold

That requires an additional step. After selecting the mask(s) in the dialog and clicking Synchronize, I need to open the next image in the Develop module, click the Masking button, and click the Update button in the panel. 

It’s an extra step, but all you have to do is select the mask and click Update. Carol Harrold

Doing so reapplies the mask and the settings I made in the first image. Fortunately, with the filmstrip visible at the bottom of the screen, clicking to the next image keeps the focus in the Masking panel, so I can step through each image and click Update. (The process is similar in the Edit panel in Lightroom desktop.)

As with Photoshop, you’ll need to take another look at each image to ensure the mask was applied correctly, and add or remove portions as needed.

Luminar Neo

I frequently cite Luminar’s image syncing as a great example of how machine learning can do the right thing between images. Using the Face AI and Skin AI tools, you can quickly lighten a face, enhance the eyes, remove dark circles, and apply realistic skin smoothing, and then copy those edits to other photos. From the software’s point of view, you’re not asking it to make a change to a specific area of pixels; it knows that in each photo it should first locate the face, and then apply those edits regardless of where in the frame the face appears.

I can still do that with these photos, but it doesn’t help with the exposure of the bride’s entire body. So instead, I’ll use the Relight AI tool in Luminar Neo and increase the Brightness Near value. The software identifies the bride as the foreground subject, increasing the illumination on her without affecting the background.

Luminar Neo’s Relight AI tool brightens the bride, which it has identified as the foreground object. Carol Harrold

Returning to the Catalog view, we can see the difference in the bride’s exposure in the first photo compared to the others. 

Before syncing in Luminar Neo
Carol Harrold

To apply that edit to the rest, I’ll select them all, making sure the edited version is selected first (indicated by the blue selection outline), and then choose Image > Adjustments > Sync Adjustments. After a few minutes of processing, the other images are updated with the same edit. 

After syncing, the image series now features the lightened bride. Carol Harrold

The results are pretty good, with some caveats. On a couple of the shots, the edges are a bit harsh, requiring a trip back to the Relight AI tool to increase the Dehalo control. I should also point out that the results you see above were from the second attempt; on the first try the app registered that it had applied the edit, but the images remained unchanged. I had to revert the photos to their original states and start over.

The latest update to Luminar Neo adds Masking AI technology, which scans the image and makes the individual areas it finds selectable as masks, such as Human, Flora, and Architecture. I expected this to produce a more precise mask, but when synced to the rest of the set it did the opposite, applying the adjustment to what appears to be the same pixel area as in the source image.

Unfortunately, the Masking AI feature doesn’t work correctly when syncing adjustments between photos. Carol Harrold

The AI Assistant

Wedding photographers often work with one or more assistants, so think of these AI-powered features as one more member of the team. Batch processing shots with software that can target adjustments to the subject lets you turn around a large number of images in a short amount of time.

Testing the advantages of Apple’s ProRAW format https://www.popphoto.com/how-to/apple-proraw-explained/ Thu, 12 May 2022 20:39:05 +0000 https://www.popphoto.com/?p=171727
The US Capitol building.
Captured on an iPhone 13 Pro in Apple ProRAW and processed through Apple Photos. Jeff Carlson

Apple ProRAW is a hybrid file format that combines the flexibility of a traditional Raw file with the benefits of AI-powered image processing.

The post Testing the advantages of Apple’s ProRAW format appeared first on Popular Photography.


Photo technology continually advances, and generally, that’s great for photographers. But let’s be honest: lately, that pace can feel overwhelming. It often seems as if our only options are to embrace the changes wholesale or to reject them outright.

In a recent Smarter Image column, I wrote about how to outsmart your iPhone camera’s overzealous AI. The author of a New Yorker article bemoaned Apple’s computational photography features for creating manipulated images that look “odd and uncanny.” My column pointed out that by using third-party apps, it’s possible to capture photos that don’t use technologies like Deep Fusion or Smart HDR to create these blended images.

Although true, that also feeds into the idea that computational photography is an either/or choice. Don’t like the iPhone’s results? Use something else. But the situation isn’t that reductive: sometimes smart photo features are great, like when you’re shooting in low light. A quick snap of the iPhone (or Google Pixel, or any other computationally-enhanced device) can seize a moment that would otherwise be lost with a regular camera while you’re fiddling with settings to get a well-exposed shot.

How can we take advantage of the advancements without simply accepting what the camera’s smart processing gives us? 

The promise of Raw

This isn’t a new question in digital photography. When you capture a photo using most cameras, even the simplest point-and-shoot models, the JPEG that’s created is still a highly processed version of the scene based on algorithms that make their own assumptions. Data is then thrown out to make the file size smaller, limiting what you can do during editing.

One answer is to shoot in Raw formats, which don’t bake those assumptions into the image file. All the data from the sensor is there, which editing software can use to tease out shadow detail or work with a range of colors that would otherwise be discarded by JPEG processing.

If you’ve photographed difficult scenes, though, you know that shooting Raw isn’t a magic bullet. Very dark areas can turn muddy and noisy when brightened, and there’s just no way back from an overexposed sky made up of all-white pixels.

The ProRAW compromise

This swings us back to computational photography. Ideally, we want the exposure blending features to get an overall better shot: color and detail in the sky and also plenty of shadow detail in the foreground. And yet we also want the color range and flexibility of editing in Raw for when we need to push those values further.

(News flash: We’re photographers, we want it all, and preferably right now, thank you.)

Apple’s ProRAW format attempts to do both. It analyzes a scene using machine learning technology, identifying objects/subjects and adjusting exposure and color selectively within the frame to create a well-balanced composition. At the same time, it also saves the original Raw sensor data for expanded editing.

There’s a contradiction here, though. As I mentioned in Reprocess raw files with machine learning for cleaner-looking photos, a Raw file is still just unfiltered data from the sensor. It doesn’t specify that, say, a certain range of pixels is a sky and should be rendered with more blue hues. Until software interprets the file through the demosaicing process, the image doesn’t even have any pixels.
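Demosaicing is that interpretation step: each photosite records only one color, and software interpolates the missing channels to produce RGB pixels. A toy NumPy version for an RGGB Bayer pattern shows the shape of the problem; real converters interpolate far more cleverly than this block-averaging sketch.

```python
import numpy as np

def demosaic_rggb(bayer):
    """Toy demosaic: collapse each 2x2 RGGB block into a single RGB pixel.
    bayer is a 2D array of sensor values; both dimensions must be even."""
    r  = bayer[0::2, 0::2]        # red photosites (top-left of each block)
    g1 = bayer[0::2, 1::2]        # green (top-right)
    g2 = bayer[1::2, 0::2]        # green (bottom-left)
    b  = bayer[1::2, 1::2]        # blue (bottom-right)
    g = (g1 + g2) / 2.0           # average the two green samples
    return np.stack([r, g, b], axis=-1)  # half-resolution RGB image
```

Until a step like this runs, "the image" is only a grid of single-channel sensor readings, which is why a Raw file alone carries no notion of sky, subject, or anything else.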

Apple’s ProRAW solution is to create a hybrid file that actually does include that type of range-specific information. ProRAW files are saved in Adobe’s DNG (digital negative) format, which was designed to be a format that any photo editing software could work with (versus the still-proprietary Raw formats that most camera manufacturers roll with). It’s important to point out that ProRAW is available only on the iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, and iPhone 13 Pro Max models.

To incorporate the image fusion information, Apple worked with Adobe to add ProRAW-specific data to the DNG specification. If an editing app understands that additional information, the image appears as it does when you open it on the iPhone, with editing control over those characteristics. If an app has not been updated to recognize the revised spec, the ProRAW data is ignored and the photo opens as just another Raw image, interpreting only the bare sensor data.

So how can we take advantage of this?

Editing ProRAW photos

In my experience, ProRAW does pretty well with interpreting a scene. Then again, sometimes it just doesn’t. A reader pointed out that the photos from his iPhone 12 Pro Max tend to be “candy-colored.” Editing always depends on the specific photo, of course, but reducing the Vibrance or Saturation values will help; the Photographic Styles feature in the iPhone 13 and iPhone 13 Pro models can also help somewhat, although the attributes it lets you change are tone and warmth, not saturation. And, of course, that feature is only on the latest phones.

With the iPhone 13 Pro, my most common complaint is that sometimes ProRAW images can appear too bright—not due to exposure, but because the image processor is filling in shadows where I’d prefer it to maintain darks and contrast.

Let’s take a look at an example.

Editing ProRAW files in Apple Photos

In this ProRAW photo shot a few weeks ago with my iPhone 13 Pro, Apple’s processing is working on a few separate areas. There’s a lot of contrast in the cloudy sky, nice detail and contrast on the building itself, and plenty of detail on the dark flagpole base in the foreground.

The photo is straight out of an iPhone 13 Pro. Jeff Carlson

Want to see the computational photography features at work? When I adjust the Brilliance slider in Apple Photos, those three areas react separately.


Moving the Brilliance slider in Apple Photos adjusts the foreground, building, and sky separately.

However, I think this is an instance where the processing feels too aggressive. Yes, it’s nice to see the detail on the flagpole, but it’s fighting with the building. Reducing Brilliance and Shadows makes the image more balanced to my eyes.

Reducing the brilliance and shadows results in a pleasing image. Jeff Carlson

The thing about the Photos app is that it uses the same editing tools for every image; Brilliance can have a dramatic effect on ProRAW files, but it’s not specifically targeting the ProRAW characteristics.

Editing ProRAW files in Lightroom

So let’s turn our attention to Lightroom and Lightroom Classic.

Here’s what the photo looks like when opened in Lightroom. The app recognizes the format and applies the Apple ProRaw profile. Jeff Carlson

Adobe’s solution for working with that data is to use a separate Apple ProRaw profile. If we switch to another profile, such as the default Adobe Color, the Apple-specific information is ignored and we get a more washed-out image. That can be corrected using Lightroom’s adjustment tools, of course, because the detail, such as the clouds, is all in the file.

With the Adobe Color profile applied, much of the contrast and dark values are lost. Jeff Carlson

With the Apple ProRaw profile applied, though, we can adjust the profile’s Amount slider to increase or reduce the computational processing. Reducing it to about 45, in this case, looks like a good balance.

Adjusting the profile amount creates an image with better tones. Jeff Carlson

Editing ProRAW files in RAW Power

The app RAW Power takes a similar approach, but with more granularity in how it processes Raw files. For ProRAW photos, a Local Tone Map slider appears. Initially, it’s set to its maximum amount, but reducing the value brings more contrast and dark tones to the flagpole.

RAW Power controls the ProRAW areas using a separate Local Tone Map slider. Jeff Carlson
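Sliders like this conceptually blend between the plain Raw rendering and the fully processed result. Here's a hypothetical linear mix in NumPy; it illustrates the idea only and is not RAW Power's (or Lightroom's) actual formula:

```python
import numpy as np

def blend_rendering(base, processed, amount):
    """amount = 0.0 -> plain Raw render; amount = 1.0 -> full computational
    processing; values in between dial the effect back proportionally."""
    return (1.0 - amount) * base + amount * processed
```

Setting the control to roughly half, as I did in Lightroom, keeps some of the shadow recovery while restoring the darker, contrastier look of the unprocessed render.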

This is just one example image, but hopefully, you understand my point. Although it seems as if computational processing at the creation stage is unavoidable, I’m glad Apple (and I suspect other manufacturers in the future) are working to make these new technologies more editable. 

Testing 3 popular AI-powered sky replacement tools https://www.popphoto.com/how-to/use-ai-to-replace-a-sky/ Wed, 27 Apr 2022 19:30:27 +0000 https://www.popphoto.com/?p=169784
A reflection in Mono Lake at sunset.
The sky and water reflection in this image were both replaced using tools in ON1 Photo RAW 2022. Jeff Carlson

In this week's Smarter Image column, we're looking at sky replacement features in Adobe Photoshop, Luminar Neo, and On1 Photo RAW.

The post Testing 3 popular AI-powered sky replacement tools appeared first on Popular Photography.

]]>

AI-assisted photo technologies mostly exist to save you time while editing or to improve image quality, whether compensating for small sensors or enhancing images during processing. But sometimes they can radically change your photos, as is the case with sky replacement features.

Swapping a sky in a photo was initially a head-scratcher for me. One of the appeals of landscape photography, for instance, is to get out amid nature and experience the colors and wonder of a sunrise or sunset. Doing that takes work: planning the shoot, determining the best time to arrive and set up, checking weather forecasts, picking a composition, and sometimes standing around in cold weather waiting for the show to begin.

But with AI sky replacement, you could theoretically show up at any time, hang your camera out the car window, snap a shot, and then add someone else’s spectacular sky using your computer later. It feels like cheating and reinforces the feeling of many photographers that AI technologies are marginalizing craft and hard work. 

That’s an awfully traditional mindset, though, and I had to remember that photography encompasses a larger spectrum than my experience. Sky replacement is useful in real estate photography, where it’s rarely possible to wait around a house for ideal conditions, especially if you’re shooting three houses that day. Or you may need a better sky for an online advertisement.

Or you might be a landscape photographer who did put in the work, got skunked by a flat sky in a location you can’t easily return to, and want to make a creative composition anyway. We forget that most photography is art, and doesn’t need to hew to journalistic expectations of accuracy.

AI Sky Replacement

Replacing skies isn’t new. With patience, you could use software that supports layers to define a mask for the sky and put another sky image in its place. That takes time, particularly if the sky is interrupted by objects such as tree branches or a complicated skyline.

The goal of a successful sky swap is, of course, to make it appear as if the new sky had been there all along. But a convincing swap involves several pieces:

  • The sky should have a clean edge, taking into account interruptions. This is usually the most difficult part because the software must determine which areas belong to the sky and which belong to the foreground.
  • The non-sky elements of the image need to match the exposure and coloring that the new sky would cast over the scene. A sky isn’t just background—it’s the light source and filter for everything we see.
  • There needs to be a way for you to fine-tune the mask and the color in areas the software didn’t catch.
  • The tool should take into account reflections. Nothing ruins the illusion like a new sky with the original sky reflected in the water below.

And let’s not forget the obvious, which is the responsibility of the editor: Make sure light sources match and shadows are cast in the correct direction. After all, the goal is to present the illusion of a natural sky, and those are obvious flags that can ruin the effect.
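Stripped of the AI, the compositing core of these tools reduces to blending through a soft mask and then tinting the foreground toward the new sky's light. Here's a minimal NumPy sketch of that idea; the `warmth` knob is invented for illustration and isn't a control in any of these apps.

```python
import numpy as np

def replace_sky(image, new_sky, sky_mask, warmth=0.0):
    """Composite new_sky through a soft mask (1 = sky, 0 = foreground),
    then nudge the foreground's color toward the sky's warmth."""
    m = sky_mask[..., None]                    # broadcast mask over RGB
    out = new_sky * m + image * (1 - m)        # the swap itself
    cast = np.array([warmth, 0.0, -warmth])    # crude warm/cool color cast
    return np.clip(out + cast * (1 - m), 0.0, 1.0)
```

The hard part the AI handles is producing `sky_mask` in the first place, with clean edges around branches and skylines; the apps differ mainly in how well they build that mask and how they relight the foreground.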

Several photo editing apps include sky replacement features, each of them taking slightly different approaches. For this article, I’m looking at Adobe Photoshop, Skylum Luminar Neo, and ON1 Photo RAW 2022. I’m also applying sky images that are included in each app. You can add your own images to each one, too.

Below are the two test images we’ll use.

A hospital under cloudy skies will be the first test image. Jeff Carlson
The second test is this photo of Mono Lake. Jeff Carlson

Photoshop Sky Replacement

You could say Photoshop is the original sky replacement utility since its layers and selection tools were what you needed to use. Now, Adobe includes a specific Sky Replacement tool: Choose Edit > Sky Replacement.

In my first test image, the ruins of a hospital, the feature right away has done a good job of replacing the sky, including in the windows where the sky shows through. The edges are clean, including the tree branches that have grown up beyond the top of the wall.

Photoshop’s Sky Replacement looks convincing from the start. Jeff Carlson

It includes controls for shifting and fading the mask edge, adjusting the brightness and temperature of the sky, and moving the sky image itself, both using a Scale slider and by dragging with the Move tool.

Switching to a sunset image also shows that the foreground lighting is adapting to the new sky, with options for adjusting the blend mode and lighting intensity. The Sky Brush tool allows some manipulation of the edges.

The sunset image in Photoshop adjusts the lighting on the foreground. Jeff Carlson

And typical of Photoshop, the default output option is to create new layers that include all the pieces: a masked sky image, a foreground lighting layer with its own mask, and adjustment layers for the colors. It’s nicely editable.

You say you love layers? Photoshop outputs all of its sky components into their own layers. Jeff Carlson

Notably missing, though, is recognition of reflective areas. When I apply a sky to an image of Mono Lake in Photoshop, the sky is changed but the glassy lake remains the same.

Something’s missing here in Photoshop. Jeff Carlson

Luminar Neo Sky AI

When I open the first image in Luminar Neo and choose an image from the Sky AI tool, the initial replacement is also pretty good. It has detected the top-right window, but not the openings in the center. And it’s unsure about the branches sticking up from the top of the wall, mostly catching their detail but also revealing an obvious halo and some of the original gray clouds.

Luminar Neo also does a good job, with a few hiccups. Jeff Carlson
Looking at the branches close up reveals areas of the original image coming through.

To handle these discrepancies, Luminar uses a trio of Mask Refinement controls—Global, Close Gaps, and Fix Details—which to be honest are best used by sliding them and seeing what happens. In this case, increasing Global and reducing Close Gaps helps with the branches.

Adjusting the Mask Refinement controls improves the treatment of the branches. Jeff Carlson

However, none of the controls can coax the sky into the windows at the bottom. That’s because the algorithm that detects the sky has decided they’re not part of the mask, and there’s nothing I can do to convince it otherwise. The Sky AI tool includes a manual Mask tool (as do most of Luminar’s tools), but in this case the brush can only roughly expose or hide portions of the mask the AI has already generated.

The Scene Relighting controls do a pretty good job of adapting the exposure and color and even include a “Relight Human” slider to adjust the appearance of the sky’s color when people are detected in the scene. I also appreciate the Sky Adjustments controls that help you match the sky to the rest of the image, such as defocusing it or adding atmospheric haze. However, note that the lighting isn’t really the problem here; with the sun setting behind the structure, more of the foreground would naturally be in shadow, illustrating the importance of the editor choosing appropriate imagery.

A late sunset image casts a darker hue to the foreground. Jeff Carlson

Where Sky AI excels over Photoshop is its reflection detection, which in the Mono Lake image has created a convincing sky and reflection. I can adjust the opacity of the reflected image and also apply “water blur” to it.

Luminar Neo’s reflection looks natural in this photo. Jeff Carlson

ON1 Photo RAW 2022

In ON1 Photo RAW 2022, the swapped sky has its pluses and minuses. It’s identified all the window openings correctly and handled the intruding branches pretty well. However, there’s obvious haloing around the top edges of the building, a telltale sign of a swapped sky.

At the start, the new sky in ON1 Photo RAW 2022 has obviously been added. Jeff Carlson
The branches look fine, but the glow around the walls makes them seem otherworldly. Jeff Carlson

That can be mitigated using the Fade Edge and Shift Edge controls, but not entirely. Increasing the fade can sometimes make the edit less noticeable. Also, note that some sky choices blend more convincingly than others.

Fading and shifting the edge of the mask helps, but it’s still noticeable. Jeff Carlson

The foreground lighting controls let me adjust not only the amount and blend mode of the effect, but also the color itself using an eyedropper tool, which provides more control.

With this sunset image and the foreground coloring, the entire shot looks more natural. Jeff Carlson

ON1 Photo RAW 2022 does include reflection awareness, with controls for setting the opacity of the image and the blend mode.

Now that’s what I was hoping to see when I went to Mono Lake, but I was only able to be there in the middle of the afternoon. I’d also need to do more work to reduce the exposure in the foreground due to the light source being low in the sky. Jeff Carlson

Skies Wide Open

As you can see, replacing a sky is a tricky feat. It can be made easier using AI technologies, but there’s still more work involved. In both Luminar Neo and ON1 Photo RAW, it’s possible you’d do part of the work there and then clean up the image in Photoshop.
