I like the look of mood.camera’s high-contrast black & white preset (the ‘Noir 800-C’ film stock), and would like to use it in other photo apps. Unlike most photo editors or camera apps with customizable looks, mood.camera doesn’t support LUT imports:
Why can't I import custom presets, instead of manually dialing in the settings?
This was actually an intentional choice. I didn't want people to just blindly import a bunch of presets, I wanted them to understand the settings. I also didn't want users to be overwhelmed with choice, like the 1000s of Lightroom presets out there; it's so hard to settle on the "right one".
Instead, there are twenty built-in ‘emulations’ (the default iPhone colors plus 19 film stock simulations), along with various sliders, allowing users to choose a look they like without having total control over the details. I was curious how this was implemented, and whether it had anything I could use.
My starting point was IPATool, which allowed me to download the application.
brew install ipatool
ipatool auth login --email '[email protected]'
ipatool search mood.camera \
--format json \
| jq \
--raw-output \
'.apps[]
| select(.name | contains("mood.camera"))
| [.bundleID, .name]
| join(": ")'
#=> com.alexfox.camx: mood.camera
ipatool download \
--bundle-identifier com.alexfox.camx \
  --output mood.camera.ipa
This results in a local mood.camera.ipa file. .ipa files are just ZIP files, so the unzip tool can process them.
unzip -l mood.camera.ipa | head
#=> Archive: mood.camera.ipa
# Length Date Time Name
# --------- ---------- ----- ----
# 0 01-01-1980 00:00 META-INF/
# 381 01-01-1980 00:00 META-INF/com.apple.ZipMetadata.plist
# 23 01-01-1980 00:00 META-INF/com.apple.FixedZipMetadata.bin
# 0 12-10-2025 21:46 Payload/
# 0 12-10-2025 15:28 Payload/CamX.app/
# 0 12-10-2025 21:46 Payload/CamX.app/_CodeSignature/
# 85065 12-10-2025 15:28 Payload/CamX.app/_CodeSignature/CodeResources
unzip \
-j \
mood.camera.ipa \
"Payload/CamX.app/*.png" \
-x 'Payload/CamX.app/**/*.png' 'Payload/CamX.app/AppIcon*' Payload/CamX.app/playstore.png
#=> Archive: mood.camera.ipa
# inflating: arizona.png
# inflating: skinTones.png
# inflating: halationGradient.png
# inflating: toneHalide.png
# extracting: grain-texture-superfine.png
# inflating: taiga.png
# inflating: vista.png
# inflating: portra.png
# inflating: toneHoli.png
# inflating: stock.png
# inflating: toneCyano.png
# inflating: metro.png
# extracting: grain-texture-coarse.png
# inflating: cine.png
# inflating: nord.png
# extracting: grain-texture-medium.png
# inflating: toneAutumn.png
# inflating: toneVenus.png
# inflating: tungsten.png
# inflating: toneSepia.png
# inflating: chrome.png
# inflating: toneOmbre.png
# inflating: toneCandy.png
# inflating: xpro.png
# inflating: tonePurple.png
# inflating: apollo.png
# inflating: noir.png
# extracting: grain-texture-fine.png
# inflating: xenon.png
# inflating: mono.png
# inflating: calypso.png
# inflating: luminosity.png
# inflating: prologue.png
# inflating: analog.png
#    inflating: gold.png
This gives us useful information on how the film emulation works. The files with names that match the various emulations (like Apollo, Calypso, Xenon) are Hald CLUTs that transform the colors in a captured image into the film emulation’s ‘look’. For example, here’s the XPro CLUT:
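To make the role of these files concrete, here’s a rough sketch (not the app’s actual code) of how a lookup into a standard square level-8 Hald CLUT works: each input color is quantized to a 64-step cube, and that cube index picks the pixel in the 512×512 CLUT image whose color becomes the output. As described below, the app’s files need some massaging before they match this layout, and the filename in the usage comment is only illustrative.
from PIL import Image

def hald_lookup(clut_path, rgb):
    # Look up one RGB color in a square level-8 Hald CLUT (512x512 pixels).
    # Entries are ordered red-fastest, then green, then blue, 64 steps each.
    clut = Image.open(clut_path).convert("RGB")
    assert clut.size == (512, 512), "expects a square Hald-8 CLUT"
    r, g, b = (round(c / 255 * 63) for c in rgb)  # quantize 0..255 -> 0..63
    index = r + 64 * g + 64 * 64 * b              # flat index into 64^3 entries
    return clut.getpixel((index % 512, index // 512))

# e.g. hald_lookup("some-normalized-clut.png", (180, 120, 90)) -> the graded color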
Trying to load these CLUTs into other applications (for example, a tool to convert Hald CLUTs to a .cube file) doesn’t work as expected. The ‘stock’ emulation (default iPhone colors) produces bizarre green output:
We need to make two changes to the Hald CLUTs:
- The files are in Apple’s custom CgBI format, which puts pixels in BGRA order instead of the standard RGBA. The easiest way to fix this is the sips utility, telling it to convert one of these CgBI files into a standard PNG file (see the sketch after this list).
- The Hald CLUTs are structured with 8 columns and 64 rows, but at least some tools expect a square grid, e.g. 8×8. This can be fixed by mapping the CLUT’s data to the square structure.
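Here’s a rough sketch of the first step (assuming macOS, where sips ships with the system, and using xpro.png as the example file). The second step, the square remapping, depends on the app’s exact layout, which is what the attached normalize-hald-clut.py implements.
import subprocess
from PIL import Image

# Have sips rewrite the CgBI file as a standard PNG; Pillow and most other
# tools can't read Apple's iOS-optimized PNG variant directly.
subprocess.run(
    ["sips", "-s", "format", "png", "xpro.png", "--out", "xpro-decoded.png"],
    check=True,
)
# The decoded file still uses the 8-column-by-64-row layout, so it isn't a
# drop-in square Hald CLUT yet; that's what the remapping step is for.
print(Image.open("xpro-decoded.png").size)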
The attached normalize-hald-cluts.sh and normalize-hald-clut.py files automatically perform these changes on each of the app’s Hald CLUTs and produce 8×8 CLUTs that work as expected. Download those two files, then run
pip install pillow
chmod +x normalize-hald-cluts.sh
./normalize-hald-cluts.sh
Here’s the normalized XPro CLUT the scripts produce, and how it looks in the LUT converter tool.
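If you’d rather skip a separate converter, a normalized CLUT can be serialized to a .cube file in a few lines, since both formats store entries with red varying fastest, then green, then blue. This is a minimal sketch (assuming the normalized CLUT is a standard 512×512 Hald-8 image), not the converter tool mentioned above, and the filenames in the usage comment are illustrative.
from PIL import Image

def hald_to_cube(hald_path, cube_path, title="mood.camera preset"):
    # Re-serialize a normalized 512x512 Hald-8 CLUT as a .cube 3D LUT.
    clut = Image.open(hald_path).convert("RGB")
    assert clut.size == (512, 512), "expects a normalized square Hald-8 CLUT"
    with open(cube_path, "w") as f:
        f.write(f'TITLE "{title}"\n')
        f.write("LUT_3D_SIZE 64\n")
        # Row-major pixel order of a square Hald CLUT is already the
        # red-fastest ordering that .cube expects.
        for r, g, b in clut.getdata():
            f.write(f"{r / 255:.6f} {g / 255:.6f} {b / 255:.6f}\n")

# e.g. hald_to_cube("xpro-normalized.png", "xpro.cube")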
The unzip command extracted some additional images that explain how other parts of the app’s preset system work:
- Grain is applied by mixing the image with grain-texture-coarse.png, grain-texture-medium.png, grain-texture-fine.png, or grain-texture-superfine.png.
- Halation uses a one-dimensional LUT (halationGradient.png), perhaps to map brightness to the intensity of the flare.
- The Mono and Noir emulations enable an additional ‘mono tint’ wheel with options that correspond to the tone*.png files (toneAutumn.png, etc); these files are one-dimensional LUTs that map the black-to-white luminosity range to a custom color palette (see the sketch after this list).
- There’s a skinTones.png that presumably does some kind of tone mapping, but I don’t know how it works.
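As a guess at how those tone files could be applied (my reading of the mechanism, not the app’s code, using Pillow and NumPy, and assuming the tone file has already been converted out of CgBI with sips as above): convert the photo to luminance, then use each pixel’s luminance to pick a color from the gradient.
import numpy as np
from PIL import Image

def apply_tone_lut(photo_path, tone_lut_path, out_path):
    # Read the tone file as a left-to-right gradient of 256 colors.
    lut = Image.open(tone_lut_path).convert("RGB").resize((256, 1))
    lut = np.asarray(lut)[0]                        # shape (256, 3)
    # Luminance of the source photo, one byte per pixel.
    luma = np.asarray(Image.open(photo_path).convert("L"))
    # Index the gradient by luminance to tint the image.
    tinted = lut[luma]                              # shape (H, W, 3)
    Image.fromarray(tinted.astype(np.uint8)).save(out_path)

# e.g. apply_tone_lut("photo.jpg", "toneSepia.png", "photo-sepia.jpg")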
Most of the preset controls (saturation, hue, aberration, shadow/midtone/highlight curves, etc) are implemented in code instead of using LUTs. This means there are significant aspects of the presets that can’t be imported into other apps via the Hald CLUTs alone.