How to create distraction-free vacation memories with Apple’s Photo Cleanup tool


By Jeff Carlson



When you’re taking holiday photos, one of the first things you’ll probably consider is whether your subject is framed in the viewfinder and whether anything unwanted has crept in. Nothing can ruin a great photo like something you don’t want to see in it, such as an uninvited guest or an awkward situation in the background.

But if it does happen, you can breathe a sigh of relief: Apple Intelligence has you covered. All you have to do is use the Clean Up feature in the Photos app in iOS 18 and MacOS Sequoia, and you’ll be positively golden.

Clean Up analyzes the image, suggests elements you might want to remove, such as people or vehicles in the background, and then fills in the deleted area. Sometimes the fix is invisible to most viewers — and sometimes the results are laughably bad. After running many types of images through the tool, I’ve come up with some general guidelines to help you get the best clean images.

Two photos of a brick building along an uphill street. In the first, a series of vertical traffic posts is distracting. In the second, the posts have been removed.

The Clean Up tool can remove distractions.

Jeff Carlson/CNET

Surprisingly, the Photos app on iPhone and iPad has never had a tool like Clean Up to remove small distractions. The Mac version includes a basic retouching tool that can fix some areas, which is replaced by the Clean Up tool on compatible Macs.

But it’s important to remember that Clean Up is an Apple Intelligence feature, so you’ll only see it if you’re using a compatible device and have opted into the Apple Intelligence beta. That includes iPhones running iOS 18.1, iPads with M-series processors (and the iPad mini with the A17 Pro chip) running iPadOS 18.1, and Macs with M-series processors running MacOS Sequoia 15.1.

To learn more about Apple Intelligence, see which features I think you’ll use the most and where its notifications need improvement.

How is Clean Up different from other retouching tools?

The repair tools in most photo editing applications work by copying nearby or similar pixels to fill the space you’re repairing. They’re great for removing lens flares or dust spots in the sky, for example.
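To make the distinction concrete, here’s a minimal sketch of that classic copy-nearby-pixels approach. This is a hypothetical illustration in plain Python (a grayscale image as a list of lists, with helper names of my own), not any app’s actual implementation:

```python
def clone_fill(image, hole, source_offset):
    """Fill each pixel in `hole` by copying the pixel at a fixed offset,
    the way a simple clone/heal brush samples nearby texture."""
    dy, dx = source_offset
    for y, x in hole:
        image[y][x] = image[y + dy][x + dx]
    return image

# A 4x4 grayscale "sky" with one dark dust spot (value 0) at row 1, column 1.
sky = [
    [200, 200, 200, 200],
    [200,   0, 200, 200],
    [200, 200, 200, 200],
    [200, 200, 200, 200],
]
fixed = clone_fill(sky, hole=[(1, 1)], source_offset=(1, 0))
print(fixed[1][1])  # the spot now matches the sky below it: 200
```

This works well when the surroundings are uniform (sky, sand), which is exactly why such tools excel at dust spots but fall apart on complex scenes.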

The Clean Up tool uses generative AI, which analyzes the entire scene and guesses what should fill the area you’ve selected. If you want to remove a dog standing in front of a tree, for example, generative AI creates a replacement based on what it knows about the texture of the tree and the foliage in the background, and it also takes into account the level and direction of the lighting in the image.


The “generative” part of generative AI comes from the way it creates the image. The pixels that fill the area literally come from nothing: the program starts with a random pattern of dots and quickly iterates toward what it determines should appear in that space.

Keep in mind that retouching tools that use generative AI are the ultimate YMMV, or “your mileage may vary.” I got good results in difficult compositions and terrible results in areas that I thought would be easy to handle with the application.

Watch this: Apple Intelligence Impressions: Don’t expect a radical change

How to remove distractions with Apple’s Clean Up tool

The Clean Up tool takes two approaches to repairing images. Using machine learning, it suggests elements such as people or vehicles in the background as potential items to remove. Or you can drag over what you want to remove and direct Photos to work on that area. The process goes like this:

1. Open an image and tap the Edit button. (In MacOS, click the Edit button or press the Return key.)

2. Tap Clean Up. The first time you use the tool, the Photos app needs to download the Clean Up resources, which takes a minute or so depending on your internet connection. The Photos app then analyzes the image and highlights potential items to remove with a shimmering glow.

Two iPhone screenshots of a bearded man taking a selfie. In the background are pedestrians and cars. The screenshot on the right shows the Photos app's Clean Up interface with arrows identifying highlighted items.

Open the photo editing interface, then tap Clean Up. The Photos app offers suggestions on what to remove.

Screenshot by Jeff Carlson/CNET

3. To remove a suggested item, tap it. Or draw a circle around any item that isn’t highlighted.

4. Don’t be surprised if the area isn’t completely cleaned on the first try; you may need to paint over the remaining areas for further removal. If you’re not satisfied with the fix, tap the Undo button.

Close-up of people being removed in the background behind a man taking a selfie. In the image on the left, the people are highlighted except for one person's legs. On the right, the same image with a selection made to clean up the legs.

If Clean Up doesn’t highlight everything — note that the person’s legs in the photo on the left are not highlighted — use the tool again to continue cleaning the area.

Screenshot by Jeff Carlson/CNET

5. When finished, tap Done. As with all photo edits, you can go back to the original if you want to start over: tap the More (…) button and select Revert to Original.

An unexpected and cool feature: the safety filter

Primarily, you’ll use the Clean Up tool to get rid of distracting elements in a scene, but it has another trick up its sleeve: you can mask the identity of someone in the photo.

Draw a circle around the person’s face. You don’t have to fill it in; a general pass will do the job. Photos applies a blocky mosaic pattern in place of the person’s face to hide it.
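The blocky mosaic effect itself is a simple image operation. Here’s a rough sketch of the idea (plain Python, grayscale image as a list of lists, helper names of my own; Apple’s actual filter is not documented):

```python
def pixelate(image, block=2):
    """Replace each block x block tile with its average value,
    producing the blocky mosaic look used to mask a face."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

face = [
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [90, 100, 110, 120],
    [130, 140, 150, 160],
]
mosaic = pixelate(face, block=2)
print(mosaic[0])  # [35, 35, 55, 55]: fine detail collapsed into blocks
```

Because every pixel in a tile gets the same averaged value, fine facial detail is unrecoverable from the result, which is the point of the mask.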

Two pictures of a man taking a selfie. On the left is a circular outline around his face. On the right, the face has been replaced with a mosaic grid.

The safety filter is a smart use of the Clean Up tool.

Screenshot by Jeff Carlson/CNET

Where you’ll see the most success with Clean Up

Some scenes and areas work better with Clean Up than others, so it’s good to know where to focus your efforts.

In my testing, I found the most success in these general categories of fixes:

  • Small distractions. Items like trash on the ground or dust and threads on people’s clothing are always good candidates.
  • Background textures. Areas such as tree leaves, grass or stone can be replicated well.
  • Lens flare. Caused by light bouncing between the camera’s lens elements, lens flare can be removed as long as it’s not too large.
  • Pedestrians or vehicles in the background that don’t take up much space.
  • Areas with sparse detail or background.

Examples of Clean Up in action: removing lens flare from a photo of a ship in harbor at sunset; removing a bag next to two people sitting on a giant pumpkin; removing an out-of-focus dog in the background behind a close-up of a flower.

Sometimes Clean Up works well: the originals are on top, the edited versions on the bottom.

Jeff Carlson/CNET

In general, when dragging over an area, make sure to capture any reflections or shadows cast by the item you want to remove. Fortunately, Photos often detects these elements and will include them in its selection.

Three iPhone screens using the Clean Up feature in the Photos app. A couple takes a photo in front of a rock wall painted in rainbow colors at a Stockholm subway station. A distraction is circled; the app selects it; it's removed.

Make sure shadows and reflections (left) are included in the selection. Clean Up detects what needs to be removed based on the broad selection (center). A little bit of reflection remains (right), but it can be cleaned up with one more pass of the tool.

Screenshot by Jeff Carlson/CNET

Areas to avoid when using Clean Up

Some removal targets will frustrate you when you try to erase them. For example:

  • Very large areas. If the selection is too large, the Photos app will reject it and ask you to mark a smaller area; otherwise, it will make a mess of the region. It’s also inconsistent at conjuring what could reasonably appear in such a large space.
  • Busy areas with clearly defined features. Foliage from distant trees generally works well, but not when recognizable structures or elements are involved. For example, removing a protruding leaf from a pile of papers, or removing people from in front of recognizable landmarks, doesn’t go well.

Two photos of a woman and a child in an outdoor market. The child sits next to an orange traffic cone. The woman stands away from the camera. In the photo on the right, the attempt to remove the woman resulted in visual chaos.

Removing large objects in the frame results in a patchwork of pixels.

Screenshot by Jeff Carlson/CNET

Where Clean Up needs more work

Remember, Clean Up and other Apple Intelligence features are still technically in beta, though they’re available to anyone with a compatible device who signs up for the beta program. (I have some thoughts about installing beta software in general.)

And while you can get some good results, there are still areas I look forward to Apple improving in future releases. Namely, the quality of the replaced areas is spotty, sometimes looking no better than the output of non-AI repair tools. I expected Apple’s algorithms to do a better job of identifying what’s in the scene and building replacement areas.

In terms of user experience, if you don’t like the replacement Clean Up comes up with, your only options are to undo or revert the edit. If you undo and then try again, you get the same result that was already processed. By contrast, Adobe Lightroom offers three variations for each fix, with the option to generate another set if you don’t like what came out.

Three screenshots of Lightroom removing a bag next to a giant pumpkin. Each screen displays a different replacement option.

Lightroom (the iPhone app is shown here) gives you three options for the removed area.

Screenshot by Jeff Carlson/CNET

Clean Up — and other similar AI-based removal tools — also suffers from inflated expectations: we’ve seen what it can do at its best, which raises the bar for what we think every edit should look like. So when the tool chokes and presents a mess of mismatched pixels, we expect it to do better. Maybe in future releases.

To learn more about what Apple Intelligence offers for your Apple devices, take a peek at Visual Intelligence.





