Category: Tricks And Tips

  • Community Projects: Part 2

    DiskEncrypter – a shared shell script helps the world

    A Community Code Evolution Story: From Simple Utility to Security Tool

    Introduction

    What started as a straightforward 515-line bash script in October 2022 has evolved into a, uh, slightly more complicated, 1,125-line “enterprise-grade” encryption enforcement system.

    This is the story of how a simple external drive encryption script was shared with the community and provided a starting point for others to create their own “enhanced” utility which can now be one piece of a security solution that protects users from data loss, encourages users to encrypt external drives, and provides a framework to enforce an encryption policy.

    This DiskEncrypter_Enhanced.sh script is also a journey that spans nine major versions, each addressing real-world problems discovered through testing and user feedback. Along the way, I learned valuable lessons about user experience, data safety, and the importance of preventing accidental data loss—especially when it comes to irreplaceable photos and videos on camera cards.

    Chapter 1: The Beginning – DiskEncrypter.sh v1.0 (October 2022)

    The Original Vision

    Created by Thijs Xhaflaire, the original `DiskEncrypter.sh` script had a clear mission: automatically detect unencrypted external drives and prompt users to encrypt them. It was designed for macOS enterprise environments where data security compliance required all removable media to be encrypted.

    What It Did Well

    The original script handled the basics competently:

    – Detected external APFS, HFS+, and ExFAT/FAT volumes

    – Prompted users with a swiftDialog interface

    – Encrypted APFS volumes directly

    – Converted HFS+ volumes to APFS before encryption

    – Erased and reformatted ExFAT/FAT volumes as encrypted APFS

    – Offered a “mount as read-only” option for users who didn’t want encryption

    The Hidden Limitations

    But as deployments grew, limitations became apparent:

    1. **Single-Volume Processing**

    If you inserted a disk with three partitions, the script would only process the first one. The other two? Invisible to the system.

    ```bash
    # Original approach
    if [[ $StorageType =~ "Apple_APFS" ]]; then
        : # Process APFS
    elif [[ $StorageType =~ "Apple_HFS" ]]; then
        : # Process HFS+
    elif [[ $StorageType =~ "Microsoft Basic Data" ]]; then
        : # Process ExFAT
    fi
    # Only ONE branch executed, then the script exits!
    ```

    2. **No Logging Infrastructure**

    Troubleshooting was a nightmare. Basic `echo` statements went nowhere when run as a LaunchDaemon.

    3. **No Testing Capability**

    Want to test the script? Hope you don’t mind encrypting real drives, because there was no dry-run mode.

    4. **Generic Error Messages**

    “FileVault is disabled on disk4s1” – Great, but which drive is that? What’s it called?

    5. **The Re-Prompt Problem**

    User mounts a drive as read-only. Five minutes later, they insert another drive. The script triggers again and asks about BOTH drives, including the one already mounted read-only. Every. Single. Time.

    Chapter 2: The First Major Overhaul – v2.0 Enhanced (December 2025)

    ### Modernization for macOS 15+ Sequoia

    When macOS 15 (Sequoia) shipped, several compatibility issues emerged. This sparked a complete reimagining of the script’s architecture.

    ### The Two-Phase Architecture

    The biggest change was moving from a linear “process-and-exit” model to a sophisticated two-phase system:

    **Phase 1: Discovery**

    ```bash
    # Scan ALL drives
    # Detect ALL partitions
    # Check ALL volume types independently
    # Build a queue of unencrypted volumes
    ```

    **Phase 2: Processing**

    ```bash
    # Process each queued volume sequentially
    # Track what was encrypted
    # Show comprehensive summary at end
    ```

    This solved the multi-volume problem instantly. Now a disk with three APFS volumes, two HFS+ partitions, and an ExFAT partition would have ALL of them discovered and processed.
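    As a rough illustration of the idea (a sketch, not the script’s actual code, with the encryption check simplified to APFS volumes), Phase 1 might collect identifiers into a bash array that Phase 2 then walks:

    ```bash
    # Phase 1: queue every external volume that reports as unencrypted.
    queue=()
    while IFS= read -r vol; do
        # "FileVault: No" is how diskutil reports an unencrypted APFS volume;
        # the real script also handles the HFS+ and ExFAT/FAT cases.
        if diskutil info "$vol" | grep -q "FileVault: *No"; then
            queue+=("$vol")
        fi
    done < <(diskutil list external physical | awk '/disk[0-9]+s[0-9]+/ {print $NF}')

    # Phase 2: process everything that was queued, then summarize.
    for vol in "${queue[@]}"; do
        echo "Would encrypt or convert: $vol"
    done
    echo "Summary: ${#queue[@]} volume(s) queued in this session"
    ```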

    ### The Multi-Volume Victory

    Here’s a real example of the difference:

    **Original v1.0 (Single partition found):**

    ```
    Processing /dev/disk4
    Found: disk4s1 (APFS volume "Old")
    Encrypted: disk4s1
    Exit.
    ```

    **Enhanced v2.0 (All partitions found):**

    ```
    Discovery Phase:
    – disk29s1 "Old" (APFS, unencrypted) → Added to queue
    – disk29s2 "New" (APFS, unencrypted) → Added to queue
    – disk29s3 "Lucky" (APFS, unencrypted) → Added to queue
    – disk28s3 "Untitled 2" (HFS+, unencrypted) → Added to queue

    Processing Phase:
    [1/4] Encrypting "Old" (disk29s1)… Done
    [2/4] Encrypting "New" (disk29s2)… Done
    [3/4] Encrypting "Lucky" (disk29s3)… Done
    [4/4] Converting and encrypting "Untitled 2" (disk28s3)… Done

    Summary: 4 volumes encrypted in this session
    ```

    Volume Names Everywhere

    Every dialog, every log message, every notification now showed friendly volume names alongside technical IDs:

    ```
    Before: "Processing disk4s1"
    After:  "Processing 'MyBackups' (disk4s1)"
    ```

    Users could finally understand what was happening to which drive.

    The LaunchDaemon Dialog Fix

    A critical bug emerged: when run as a LaunchDaemon (root), swiftDialog couldn’t accept keyboard input. Password fields were useless. The fix used `launchctl asuser` to run dialogs in the user’s GUI session context:
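    A minimal sketch of that pattern looks like this (paths and dialog options are illustrative, not the script’s exact invocation):

    ```bash
    # Find the logged-in console user and their UID, then run swiftDialog inside
    # that user's GUI session even though the daemon itself runs as root.
    consoleUser=$(/usr/bin/stat -f%Su /dev/console)
    consoleUID=$(/usr/bin/id -u "$consoleUser")

    /bin/launchctl asuser "$consoleUID" /usr/bin/sudo -u "$consoleUser" \
        /usr/local/bin/dialog --title "Encrypt this disk?" \
        --message "An unencrypted external volume was detected."
    ```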

    Now dialogs appeared correctly with proper focus and keyboard interaction.

    Chapter 3: Production Readiness – v2.1 (December 2025)

    ### The Logging Revolution

    v2.1 introduced a professional four-level logging system:

    | Level | Output | Use Case |
    |-------|--------|----------|
    | **0 – Minimal** | Errors only | Production (silent success) |
    | **1 – Normal** | Errors + key operations | Standard deployment |
    | **2 – Verbose** | + detailed progress | Troubleshooting |
    | **3 – Debug** | Everything | Development/diagnosis |

    **Dual-destination logging** sent output to both console and macOS unified logging:

    ```bash
    log_info() {
        if [[ $LOG_LEVEL -ge 1 ]]; then
            echo "[$(get_timestamp)] INFO: $*"
            logger -p user.info "DiskEncrypter [INFO]: $*"
        fi
    }
    ```

    Now you could watch logs in real-time:

    ```bash
    log stream --predicate 'eventMessage CONTAINS[c] "DiskEncrypter"' --info
    ```

    ### Dry-Run Mode: Test Without Fear

    ```bash
    sudo ./DiskEncrypter_Enhanced.sh --dry-run --log-level 3
    ```

    This became invaluable for:

    – Testing in production environments

    – Training new IT staff

    – Validating configuration changes

    – Demonstrations

    Every disk operation was logged but not executed:

    ```
    [DRY RUN] Would execute: diskutil apfs encryptVolume disk4s1 -user disk
    INFO: DRY RUN: Encryption would start for disk4s1
    ```
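    One common way to build this kind of dry-run switch (a sketch, not necessarily how the script implements it) is to route every destructive command through a small wrapper that consults the flag, reusing the `log_info` helper shown earlier:

    ```bash
    # If DRY_RUN is set, log the command instead of running it.
    run_disk_cmd() {
        if [[ "${DRY_RUN:-false}" == "true" ]]; then
            log_info "DRY RUN: would execute: $*"
        else
            "$@"
        fi
    }

    # Example call site:
    run_disk_cmd diskutil apfs encryptVolume disk4s1 -user disk
    ```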

    ### Command-Line Arguments

    For the first time, you could override plist settings from the command line:

    ```bash
    # Debug logging without changing plist
    ./DiskEncrypter_Enhanced.sh -l 3

    # Dry-run with verbose logging
    ./DiskEncrypter_Enhanced.sh --dry-run -l 2
    ```

    Priority: **Command-line → Plist → Default**
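    A rough sketch of that precedence chain (the plist domain and key names here are placeholders, not the script’s real ones):

    ```bash
    # Start with defaults, let a managed plist override them, then let flags win.
    LOG_LEVEL=1
    DRY_RUN="false"

    plistLevel=$(defaults read /Library/Preferences/com.example.diskencrypter LogLevel 2>/dev/null)
    [[ -n "$plistLevel" ]] && LOG_LEVEL="$plistLevel"

    while [[ $# -gt 0 ]]; do
        case "$1" in
            -l|--log-level) LOG_LEVEL="$2"; shift 2 ;;
            --dry-run)      DRY_RUN="true"; shift ;;
            *)              shift ;;
        esac
    done
    ```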

    Chapter 4: The Bug Hunt – v2.2 (December 2025)

    ### The Critical Read-Only Bug

    A critical bug was discovered: the script couldn’t detect read-only volumes, causing users to be re-prompted endlessly.

    **The Bug:**

    ```bash
    # Wrong field name!
    volumeMountInfo=$(diskutil info "$VolumeID" | grep "Read-Only Volume:")
    if [[ "$volumeMountInfo" == "Yes" ]]; then
        : # This NEVER executed!
    fi
    ```

    **Why It Failed:**

    ```bash
    $ diskutil info disk4s2 | grep "Read-Only Volume:"
    # (no output – field doesn't exist!)

    $ diskutil info disk4s2 | grep "Volume Read-Only:"
       Volume Read-Only:          Yes (read-only mount flag set)
    # ^^^ The actual field name!
    ```

    **The Fix:**

    ```bash
    volumeMountInfo=$(diskutil info "$VolumeID" | grep "Volume Read-Only:")
    if [[ "$volumeMountInfo" =~ Yes ]]; then
        log_info "Volume $VolumeID ($volumeName) is mounted read-only, skipping"
        continue
    fi
    ```

    This simple fix eliminated the annoying re-prompt problem.

    ### NTFS Support Added

    Windows NTFS volumes were now recognized and handled:

    ```bash
    if [[ $StorageInfo =~ "Microsoft Basic Data" ]] ||
       [[ $StorageInfo =~ "Windows_FAT" ]] ||
       [[ $StorageInfo =~ "DOS_FAT" ]] ||
       [[ $StorageInfo =~ "Windows_NTFS" ]]; then  # NEW!
    ```

    Chapter 5: The Safety Revolution – v2.3 (December 2025)

    ### The Security Gap

    A realization hit: unencrypted drives were mounted read/write while waiting for user decision. This created a window where data could accidentally be written to unencrypted media.

    ### Auto Read-Only Mounting

    v2.3’s solution was elegant: automatically mount ALL unencrypted volumes as read-only immediately upon detection, before showing any dialog.

    **New Function:**

    ```bash
    mountReadOnly() {
        local VolumeID=$1
        local volumeName=$2

        log_info "Auto-mounting volume as read-only: $VolumeID ($volumeName)"

        # Unmount first
        diskutil unmountDisk "$VolumeID" 2>/dev/null

        # Mount as read-only
        diskutil mount readOnly "$VolumeID"

        return 0
    }
    ```

    **Updated Workflow:**

    ```
    v2.2:
    1. Drive detected → Mounted read/write ⚠️
    2. Dialog shown
    3. If "Mount read-only" → Remount

    v2.3:
    1. Drive detected → Auto-mounted read-only ✅
    2. Dialog shown (drive already safe)
    3. If "Keep Read-Only" → Already done!
    ```

    ### Updated User Messages

    Dialogs now explicitly stated the protection status:

    ```
    This volume has been mounted as read-only for your protection.
    To write files, you must encrypt the disk.
    ```

    Button labels changed:

    – **Before:** “Mount as read-only”

    – **After:** “Keep Read-Only” (more accurate)

    ### Comprehensive User Documentation

    v2.3 introduced `USER_GUIDE.md`, a 22KB end-user safety manual with:

    – Clear explanation of the three button options

    – Decision guide chart by drive type

    – Step-by-step backup workflows

    – Critical warnings for ExFAT/FAT32 (data loss risk)

    – Password management best practices

    **Example Safe Workflow:**

    ```
    WRONG:
    ❌ Insert ExFAT drive with photos → Click "Encrypt" → ALL PHOTOS DELETED

    CORRECT:
    ✅ Click "Keep Read-Only"
    ✅ Copy all photos to Mac
    ✅ Verify photos copied correctly
    ✅ Eject drive
    ✅ Re-insert drive
    ✅ NOW click "Encrypt"
    ✅ Photos safe on Mac, drive encrypted
    ```

    Chapter 6: The Camera Card Protection – v2.4 (December 2025)

    ### The Data Loss Catastrophe Waiting to Happen

    v2.3 was safer, but a critical vulnerability remained: users could still choose to “Erase and Encrypt” ExFAT/FAT32 drives, which contained camera cards with irreplaceable wedding photos, vacation videos, and professional photography.

    **The Risk Scenario:**

    ```
    1. Wedding photographer inserts SD card (ExFAT) with 500 photos
    2. Dialog appears: "Erase and Encrypt" option available
    3. Photographer thinks "encrypt" = "protect my photos"
    4. Clicks button, enters password
    5. ❌ ALL 500 WEDDING PHOTOS DELETED FOREVER
    ```

    This was a bad situation waiting to happen.

    ### The Bold Decision: Remove the Erase Option

    v2.4 made a controversial choice: **completely remove the encrypt option for ExFAT/FAT/NTFS volumes.**

    Instead, users now see an explanation:

    ```
    This volume cannot be encrypted without erasing all data.

    To protect your data from accidental loss, encryption is not
    offered for this disk type (ExFAT/FAT/NTFS).

    Why encryption is not offered:
    • Encrypting this volume type requires complete erasure
    • All existing data would be permanently lost
    • This protection prevents accidental data loss on camera cards,
      USB drives, and other portable media

    To encrypt this drive:
    1. Back up all data to a secure location
    2. Use Disk Utility to erase and format as APFS
    3. Then encryption can be applied without data loss
    ```

    Chapter 7: The UX Polish – v2.4.1 (December 2025)

    ### The Scrolling Problem

    v2.4’s dialogs had too much text—users had to scroll to see everything. This defeated the purpose of clear communication.

    **The Problem:**

    ```
    Main Dialog:
    [15+ lines of text explaining everything]
            ▼ SCROLL REQUIRED ▼
    ```

    ### The Infobox Solution

    v2.4.1 leveraged swiftDialog’s `--infobox` parameter to split content:

    ```bash
    # Concise main message (5 lines)
    customMessage="Non-encryptable volume: **\"$volumeName\"** ($VolumeID)

    File System: **$fsType**

    $subTitleNonEncryptable"

    # Detailed infobox (collapsible)
    infoboxMessage="### Why Encryption Is Not Offered

    • Encrypting this volume type requires **complete erasure**
    • All existing data would be **permanently lost**
    • This protection prevents accidental data loss

    ### To Encrypt This Drive (If Needed)

    **Step 1:** Back up all data to a secure location
    **Step 2:** Open Disk Utility and erase the drive
    **Step 3:** Format as **APFS** (Mac only)
    **Step 4:** Re-insert for automatic encryption

    ⚠️ **Warning:** Only proceed if you have backed up all data!"
    ```

    Chapter 8: The Feedback Loop Fix – v2.4.3

    ### The Duplicate Dialog Bug

    Just when everything seemed perfect, a new bug emerged in production:

    **The Problem:**

    ```
    1. User inserts USB drive
    2. Script displays dialog
    3. User clicks "Eject"
    4. Script executes: diskutil unmountDisk disk4
    5. ❌ Unmount event triggers LaunchDaemon AGAIN
    6. ❌ Script runs concurrently with first instance
    7. ❌ Dialog appears TWICE
    8. ❌ User must click "Eject" again
    ```

    The LaunchDaemon was monitoring both mount AND unmount events, creating a feedback loop.

    The Two-Pronged Solution

    v2.4.3 implemented two complementary mechanisms:

    **1. Lock File Mechanism (Prevents concurrent execution)**

    ```bash
    LOCK_FILE="/var/run/diskencrypter.lock"
    ```

    **2. Processed Volumes Tracking (Prevents re-processing)**

    ```bash
    PROCESSED_VOLUMES_FILE="/var/tmp/diskencrypter_processed.txt"
    COOLDOWN_SECONDS=30
    ```
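    Sketched together, the two guards could look something like this (an illustrative sketch rather than the script’s exact implementation):

    ```bash
    # Guard 1: exit quietly if another live instance already holds the lock file.
    if [[ -f "$LOCK_FILE" ]] && kill -0 "$(cat "$LOCK_FILE")" 2>/dev/null; then
        exit 0
    fi
    echo $$ > "$LOCK_FILE"
    trap 'rm -f "$LOCK_FILE"' EXIT

    # Guard 2: skip volumes that were handled within the cooldown window.
    recently_processed() {
        local vol="$1" now last
        now=$(date +%s)
        last=$(awk -v v="$vol" '$1 == v {print $2}' "$PROCESSED_VOLUMES_FILE" 2>/dev/null | tail -1)
        [[ -n "$last" ]] || return 1
        (( now - last < COOLDOWN_SECONDS ))
    }

    mark_processed() {
        echo "$1 $(date +%s)" >> "$PROCESSED_VOLUMES_FILE"
    }
    ```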

    Chapter 9: The Spaces In Between – v2.4.5

    A few more fixes landed in v2.4.4 and v2.4.5, this time addressing spaces not being allowed in the password field and the password hint field. Suffice it to say that regex was the issue, and more testing was required. Then came a second fix for the first fix, when the installer was discovered to be setting an old default in the management preferences. Ooops!
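    For the curious, the class of bug looked roughly like this: a validation pattern whose character class left out spaces. The patterns below are a hypothetical illustration, not the script’s actual regex.

    ```bash
    # Hypothetical before/after: the first class has no space, so a passphrase
    # with spaces is rejected; the second adds [:blank:] and accepts it.
    pass="my pass phrase"

    [[ "$pass" =~ ^[[:alnum:][:punct:]]+$ ]] \
        && echo "old pattern: accepted" || echo "old pattern: rejected"
    [[ "$pass" =~ ^[[:alnum:][:punct:][:blank:]]+$ ]] \
        && echo "new pattern: accepted" || echo "new pattern: rejected"
    ```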

    Chapter 10: Lessons Learned

    ### 1. Users Don’t Understand Technical Terms

    “Erase and encrypt” sounds like “protect my data” to non-technical users. The v2.4 solution of removing the option entirely was controversial but necessary.

    **Key Learning:** When data loss is possible, don’t rely on warnings—remove the dangerous option.

    ### 2. Edge Cases Are Real Cases

    The read-only field name bug (v2.2) seemed minor until users reported constant re-prompts. The duplicate dialog bug (v2.4.3) only appeared in production when LaunchDaemon unmount events triggered re-execution.

    **Key Learning:** Real-world testing reveals issues that synthetic tests miss.

    ### 3. Good UX Is Iterative

    v2.4’s dialogs had too much text. v2.4.1’s infobox solution reduced reading time by 66%. Sometimes the best improvement is reduction, not addition.

    **Key Learning:** Watch real users interact with your interface. Simplify ruthlessly.

    ### 4. Logging Saves Lives (and Debugging Time)

    The v2.1 four-level logging system with dual output (console + system logger) paid dividends in every subsequent version. Debug output that would have taken hours to add per-bug was already there.

    **Key Learning:** Invest in logging infrastructure early. Future you will thank present you.

    ### 5. Backward Compatibility Matters

    Every version maintained 100% backward compatibility with configuration files and user workflows. APFS/HFS+ encryption worked identically across all versions.

    **Key Learning:** Add features, don’t break existing deployments.

    ### 6. Concurrent Execution Is Harder Than It Looks

    The feedback loop bug (v2.4.3) required two complementary solutions: lock files for mutual exclusion AND processed volume tracking for cooldowns.

    **Key Learning:** Concurrent execution requires multiple layers of protection.

    Chapter 11: The Final Product

    ### What Disk Encrypter Enhanced v2.4.5 Delivers

    **For IT Administrators:**

    – Comprehensive logging with 4 verbosity levels

    – Dry-run mode for safe testing

    – Command-line arguments for flexibility

    – Automatic log rotation (30-day retention)

    – Lock file protection against race conditions

    – Processed volume tracking with configurable cooldown

    **For End Users:**

    – Auto read-only mounting (immediate protection)

    – Clear, friendly volume names in all dialogs

    – No risk of accidental data loss on camera cards

    – Clean, professional dialog layout (no scrolling)

    – One dialog per user decision (no duplicates)

    – Educational content explaining technical concepts

    **For Compliance Officers:**

    – Complete audit trail in system logs

    – Encryption enforcement for compatible volumes

    – Safe handling of non-compatible media

    – User acknowledgment tracking

    – Session-based encryption reporting

    Chapter 12: Looking Forward

    ### What’s Next?

    The evolution from v1.0 to v2.4.5 represents maturity, but there’s always room for improvement:

    **Potential Future Enhancements:**

    – Web-based dashboard for monitoring encryption across enterprise

    – Centralized reporting to MDM systems

    – Email notifications for IT admins

    – Custom encryption policies per volume type

    – Integration with company password managers

    – Support for FileVault-encrypted APFS containers on external drives

    ### The Philosophy Going Forward

    The journey from v1.0 to v2.4.5 taught us three guiding principles:

    1. **User safety over feature completeness** – Removing the ExFAT erase option protected users from themselves

    2. **Comprehensive logging over simplicity** – The debugging investment paid dividends

    3. **Iterative refinement over big-bang releases** – Each version solved real problems discovered in the field

    ## Conclusion

    What started as a 515-line utility script in October 2022 has evolved into a 1,325-line enterprise-grade security solution. Along the way, we learned that good software is never “done”—it evolves through real-world use, user feedback, and a commitment to continuous improvement.

    The DiskEncrypter journey demonstrates that the best solutions emerge from:

    – Listening to users (the read-only re-prompt problem)

    – Protecting users from themselves (the camera card protection)

    – Obsessive attention to detail (the dialog UX refinement)

    – Comprehensive testing (the feedback loop bug discovery)

    Today, DiskEncrypter_Enhanced.sh v2.4.5 stands as a testament to what iterative refinement can achieve: a tool that not only enforces security policy but actively prevents data loss, provides excellent user experience, and operates reliably in production environments.

    The code journey continues.

    **Technical Stats:**

    – **Total Development Time:** ~12 days of Xmas coding (December 3-15, 2025)

    – **Lines of Code:** 515 → 1,125 (+157%)

    – **Functions:** 2 → 25+ (+1,150%)

    – **Major Versions:** 9 (v2.0, v2.1, v2.2, v2.3, v2.4, v2.4.1, v2.4.3, v2.4.4, v2.4.5)

    – **Bug Fixes:** 4 critical issues resolved

    – **Compatibility:** macOS 15+ (Sequoia) and macOS 26+

    **Documentation:**

    – Evolution Guide: 1,844 lines

    – User Guide: 650+ lines

    **Acknowledgments:**

    – Original script by Thijs Xhaflaire (October 2022)

    – swiftDialog by Bart Reardon

    – Testing and feedback from the MacAdmin community

    **License:**

    The DiskEncrypter_Enhanced.sh script is distributed for enterprise use. See individual script files for license details.

  • Community Projects: Part 1

    SetDefaultAppsX – A Community-Driven Evolution

    From Enterprise Lock-In to Universal macOS Tool

    When Scott Kendall released his SetDefaultApps script in December 2025, it solved a real problem: giving users a friendly GUI with swiftDialog to set their default applications using scriptingOSX’s utiluti for file types, URLs, and protocols on macOS. It worked beautifully—but only if you had Jamf Pro.

    That’s where the community stepped in.

    The MDM-specific Problem

    Scott’s original script was tightly coupled to Jamf Pro’s infrastructure. It relied on policy triggers for installing dependencies:


    jamf policy -trigger install_SwiftDialog
    jamf policy -trigger install_SymFiles
    jamf policy -trigger install_utiluti

    For enterprise Mac administrators already using Jamf, this was perfect. For everyone else—small businesses, education labs, home users, or shops using different MDM solutions—it was a non-starter.

    The X Factor: SetDefaultAppsX

    I took Scott’s excellent foundation and worked to make it a truly standalone tool. The “X” represents both the removal of dependencies and the cross-platform (MDM-agnostic) nature of the new version.

    Major Transformations

    1. Self-Contained Installation

    Instead of calling out to Jamf policies, SetDefaultAppsX now downloads swiftDialog directly from GitHub, verifies the package signature against Bart Reardon’s Team ID, and installs it automatically:


    expectedDialogTeamID="PWA5E9TQ59"
    LOCATION=$(curl -s https://api.github.com/repos/bartreardon/swiftDialog/releases/latest | awk -F '"' '/browser_download_url/ {print $4}')
    curl -L "$LOCATION" -o /tmp/swiftDialog.pkg
    # Verify signature before installation
    teamID=$(/usr/sbin/spctl -a -vv -t install "/tmp/swiftDialog.pkg" 2>&1 | awk '/origin=/ {print $NF}' | tr -d '()')

    No MDM required. No manual downloads. Just works.
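    For completeness, a plausible continuation of that check (my sketch of the idea, not necessarily the exact code in the script) compares the extracted Team ID to the pinned value before handing the package to the installer:

    # Only install if the package signature's Team ID matches the pinned value.
    # Note: installer needs root, so this step assumes the script runs with sudo
    # or via a management tool.
    if [[ "$teamID" == "$expectedDialogTeamID" ]]; then
        /usr/sbin/installer -pkg /tmp/swiftDialog.pkg -target /
    else
        echo "ERROR: swiftDialog package Team ID mismatch: got '$teamID'" >&2
        exit 1
    fi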

    (Real time update: Writing this blog post made me add a feature to do the same with ScriptingOSX’s utiluti. Now the script checks and downloads both.)

    2. Hardware Detection That Actually Works

    Scott’s original used system_profiler, which sounds reasonable—until you run it in certain contexts where it returns “Unknown” due to, uh, issues. The system_profiler CLI is powerful, but it can time out and doesn’t always return the values I wanted. It could also have been some script shenanigans on my part, but in any case a change was needed.

    For my scripting needs the fix was simple but crucial: switch to sysctl queries that always work:


    # CPU Detection - Direct kernel query
    MAC_CPU=$(/usr/sbin/sysctl -n machdep.cpu.brand_string 2>/dev/null)

    # RAM Detection - Also via kernel
    MAC_RAM=$(/usr/sbin/sysctl -n hw.memsize 2>/dev/null | awk '{printf "%.0f GB", $1/1024/1024/1024}')

    The result? Instead of seeing “Unknown” or generic “chip_type”, I would see “Apple M3” on my MacBook Air or the full Intel CPU model for an Intel Mac. And it’s 20-30x faster.

    3. No Sudo Required

    The original required administrators to create directories in `/Library/Application Support` before users could run the script. Either management tools created this beforehand, or users ran a script with sudo, which was not ideal or even workable for non-admins. So SetDefaultAppsX includes an automatic fallback:


    if [[ ! -d "${SUPPORT_DIR}" ]]; then
        echo "WARNING: Application Support directory not found"
        echo "Falling back to /Users/Shared/SetDefaultAppsX"
        SUPPORT_DIR="/Users/Shared/SetDefaultAppsX"
        # Automatically create writable directories
        /bin/mkdir -p "${SUPPORT_DIR}"
    fi

    Users can run the script immediately without any preparation. For enterprise deployments, there’s an optional `PrepareSetDefaultAppsX.sh` that sets up system-wide directories, but it’s truly optional.

    4. Modern Icon System

    Again, in my hacking of Scott’s perfectly working script, I ran into some required banner images. So instead of relying on file system icon resources that might not exist, SetDefaultAppsX uses SF Symbols:


    OVERLAY_ICON="SF=xmark.circle,weight=medium,colour1=#000000,colour2=#ffffff"

    Always available, always renders perfectly, and customizable.

    The X2 Portable Edition: Platypus-Ready

    But I didn’t stop there. I had an idea: what if users could simply run an app? SetDefaultAppsX2 takes portability even further—it’s designed for packaging as a standalone application using Platypus.

    The key difference is local binary detection:


    # Get the directory where the script is located
    SCRIPT_DIR="${0:a:h}"

    # Check for binaries in script directory first
    if [[ -x "${SCRIPT_DIR}/utiluti" ]]; then
        UTI_COMMAND="${SCRIPT_DIR}/utiluti"
    elif [[ -x "/usr/local/bin/utiluti" ]]; then
        UTI_COMMAND="/usr/local/bin/utiluti"
    fi

    This means you can package the script along with the `dialog` and `utiluti` binaries into a single app bundle with Platypus. Users double-click the app, and everything just works—no installation, no command line, no dependencies.

    Perfect for:

    • Quick distribution to non-technical users
    • Testing environments
    • Portable USB installations
    • Labs where users can’t install software system-wide

    Community Contributions Flow Both Ways

    The best part? Scott has been incorporating some of these improvements back into his Jamf-specific version. The hardware detection fixes and error handling enhancements benefit both the enterprise and standalone versions.

    This is open-source collaboration at its finest: Scott provided the excellent foundation and deep integration expertise, the community contributed cross-platform portability, and both versions improve together.

    The Technical Wins

    Let’s talk numbers:

    • Performance: 3-4x faster startup (sysctl vs system_profiler)
    • Reliability: 100% success rate for hardware detection (up from ~60%)
    • Portability: Works on any Mac, any MDM, or no MDM
    • Security: Package signature verification via Team ID
    • User Experience: No sudo required, automatic fallback, clear error messages

    What You Get

    Three versions for different needs:

    1. SetDefaultApps.sh – Scott’s original Jamf-integrated version
    2. SetDefaultAppsX.sh – MDM-agnostic standalone version
    3. SetDefaultAppsX2.sh – Portable version ready for Platypus packaging

    All three share the same excellent user interface powered by swiftDialog, the same UTI handling via utiluti, and the same goal: make setting default apps friendly and accessible.

    Getting Started

    The simplest possible workflow:

    # 1. Run the script
    ./SetDefaultAppsX.sh

    That’s it. The script downloads swiftDialog if needed, creates directories automatically, and presents users with a beautiful interface to set their default apps.

    For Platypus app building with X2:

    – Include dialog and utiluti binaries in your app bundle
    – Point Platypus to SetDefaultAppsX2.sh
    – Users get a double-clickable app with zero dependencies

    Credits Where Due

    – **Scott Kendall**: Original script author, Jamf integration expert
    – **Bart Reardon**: swiftDialog creator (the UI magic behind it all)
    – **scriptingOSX**: utiluti tool for UTI management
    – **The Community**: Testing, feedback, and collaborative improvements

    The Open Source Philosophy

    This is what makes the Mac admin community special. Scott could have kept his script locked down or enterprise-only. Instead, he shared it, accepted community modifications, and even pulled improvements back into his version.

    The result? Better tools for everyone—whether you’re managing 10,000 Macs with Jamf or helping your family set up their MacBooks.

    What’s Next?

    The scripts are stable and production-ready, but there’s always room for improvement:

    – Auto-installation of utiluti from GitHub releases (Done!)
    – Built-in default banner images
    – Dark mode support
    – Multi-language interface
    – Configuration file support for organizations

    But the foundation is solid: a truly portable, MDM-agnostic tool for one of macOS’s most user-requested features.

    Try it yourself: The full source code, documentation, and evolution guide are available in the project repository. Whether you need the Jamf version, the standalone version, or the portable Platypus version, there’s a SetDefaultApps that fits your workflow.

    Because good tools should be accessible to everyone, not just those with enterprise MDM budgets.

    *Special thanks to Scott Kendall for creating the original script and being open to community contributions, to ScriptingOSX (Armin Briegel) for utiluti, and to Bart Reardon for swiftDialog—the best thing to happen to Mac admin UIs in years.*

  • From Bash Script to Native macOS App: The Evolution of Simple Security Check

    Why build an app to check macOS updates?

    Managing a fleet of macOS devices through SimpleMDM often requires constant vigilance over security updates, encryption status, and OS versions. What started as a practical shell script for checking device security status evolved into a full-featured native macOS application. This is the cold/flu season inspired adventure of a crazy idea that a simple shell script could become a Swift app and live in the Mac App Store.

    With enough help from friends and current AI tools those fever dreams can become real. Join us on a long detailed rant from a 278-line Bash script to a modern SwiftUI app with secure credential management, intelligent caching, and a semi-decent and mostly functional user interface.

    Simple Security Check app with test data

    The Beginning: A Shell Script Solution

    The original tool was born from a simple need: cross-reference SimpleMDM device data against the SOFA (Simple Organized Feed for Apple Software Updates) macOS security feed to identify which devices needed macOS updates. The shell script was straightforward but capable enough to export a spreadsheet for clients to review in a simple presentation:

    ```bash
    #!/usr/bin/env bash
    set -euo pipefail

    # Fetch devices from SimpleMDM
    # Compare against SOFA feed
    # Export CSV reports
    ```

    What the Shell Script Did Well

    The shell script handled several complex tasks more or less efficiently:

    1. **API Pagination**: Properly implemented cursor-based pagination for SimpleMDM’s API, handling potentially thousands of devices across multiple pages with retry logic and exponential backoff (see the sketch after this list). Note: the very first version I posted didn’t do this at all, but thanks to a reminder from a helpful MacAdmin I remembered I needed to implement pagination and do it properly. Thanks!

    2. **Smart Caching**: Cached both SimpleMDM device lists and SOFA feed data for 24 hours, reducing API calls and improving performance.

    3. **Comprehensive Security Tracking**: Monitored FileVault encryption, System Integrity Protection (SIP), firewall status, and OS version compliance.

    4. **Flexible Exports**: Generated three types of CSV reports and full JSON exports with timestamps, automatically opening them in the default applications.

    5. **Version Intelligence**: Compared devices against both their current major OS version’s latest release and the maximum compatible OS version for their hardware model.
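    Here is a rough bash sketch of that pagination loop (the endpoint, parameters, and response fields reflect my reading of SimpleMDM’s API docs and should be verified; the backoff is simplified):

    ```bash
    #!/usr/bin/env bash
    set -euo pipefail

    api_key="YOUR_API_KEY"   # placeholder
    base="https://a.simplemdm.com/api/v1/devices"
    cursor=""
    : > devices.ndjson

    while :; do
        url="${base}?limit=100${cursor:+&starting_after=${cursor}}"

        # Retry with exponential backoff on transient failures.
        attempt=0
        until page=$(curl -sf -u "${api_key}:" "$url"); do
            attempt=$((attempt + 1))
            if [[ $attempt -ge 5 ]]; then
                echo "Giving up on $url" >&2
                exit 1
            fi
            sleep $((2 ** attempt))
        done

        echo "$page" | jq -c '.data[]' >> devices.ndjson

        [[ "$(echo "$page" | jq -r '.has_more')" == "true" ]] || break
        cursor=$(echo "$page" | jq -r '.data[-1].id')
    done
    ```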

    The Pain Points

    However, the shell script approach had limitations:

    – **API Key Management**: The API key had to be entered each time or set as an environment variable—no secure storage mechanism.

    – **Single Account**: No support for managing multiple SimpleMDM accounts or environments.

    – **Limited Search**: Finding specific devices required opening CSVs and using spreadsheet search.

    – **No Visual Interface**: Everything was command-line based, requiring users comfortable with terminal operations.

    – **Manual Execution**: I had to remember to run it periodically.

    The script even had a TODO comment acknowledging its destiny:

    ```bash
    
    # to do: make into a native swift/swiftUI app for macOS
    
    # with better UX saving multiple API key entries into
    
    # the keychain with a regular alias

    ```

    The Transformation: Building a Native macOS App

    The decision to create a native macOS application wasn’t about abandoning what worked—it was about preserving that core functionality while addressing its limitations. And most importantly, being nerd-sniped by a colleague saying why not make it into a Swift app using current AI tools. I thought I could try it. How hard could it be? haha. What do I know about Swift, and what do I know about what is possible? Let’s see. The goal was clear: maintain 100% feature parity with the shell script while adding the convenience users expect from modern macOS software. And simplicity. I wanted a simple app to use to make all our lives easier. At least, this one part.

    Architecture Decisions

    The app was built using SwiftUI with a clear separation of concerns:

    **AppState.swift** – The Brain

    ```swift
    
    @MainActor
    
    class AppState: ObservableObject {
    
        @Published var apiKeys: [APIKeyEntry] = []
    
        @Published var devices: [SimpleMDMDevice] = []
    
        @Published var sofaFeed: SOFAFeed?
    
        @Published var searchText = ""
    
        @Published var showOnlyNeedingUpdate = false
    
    }
    
    ```
    
    

    This centralized state manager coordinates all data operations, making the UI reactive and keeping business logic separate from presentation.

    **KeychainManager.swift** - Secure Storage
    
    ```swift
    
    class KeychainManager {
    
        func saveAPIKey(_ key: String, for alias: String) throws {
    
            // Store in macOS Keychain with kSecAttrAccessibleWhenUnlocked
    
        }
    
    }
    
    ```

    One of the shell script’s biggest weaknesses became one of the app’s strongest features. API keys are now stored securely in macOS Keychain, never exposed in plain text, and protected by the system’s security model.

    **DatabaseManager.swift** - Intelligent Caching
    
    ```swift
    
    class DatabaseManager {
    
        func getCachedDevices(forAPIKey alias: String) -> [SimpleMDMDevice]? {
    
            // Query SQLite with 24-hour cache validation
    
            // Indexed for fast search
    
        }
    
    }
    
    ```

    The file-based JSON caching from the shell script evolved into a SQLite database with indexed search capabilities. Each API key gets its own cached dataset, and the 24-hour cache duration from the original script was preserved.

    **APIService.swift** - Network Layer
    
    ```swift
    
    class APIService {
    
        func fetchAllDevices(apiKey: String,
    
                            apiKeyAlias: String,
    
                            forceRefresh: Bool) async throws -> [SimpleMDMDevice] {
    
            // Same pagination logic as shell script
    
            // Same retry mechanism with exponential backoff
    
            // Same User-Agent header pattern
    
        }
    
    }

    ```

    The API fetching logic was ported almost line-for-line from the shell script. The same pagination handling, the same retry logic, even the same User-Agent pattern. If it worked in Bash, it works in Swift. And the User-Agent pattern came from a helpful Issue submitted in GitHub about making the shell script a better part of the SOFA ecosystem. Thanks again!

    What Got Better

    **Multiple API Key Support**

    The single biggest improvement was supporting multiple SimpleMDM accounts. IT administrators often manage multiple clients or environments. The app now stores unlimited API keys with custom aliases:

    – “Production” for your main environment

    – “Testing” for sandbox testing

    – “Client A“, “Client B” for MSPs managing multiple organizations

    Each API key appears as a tab in the interface, with separately cached data for instant switching.

    **Real-Time Search and Filtering**

    The shell script required exporting to CSV and searching in a spreadsheet. The app provides instant, full-text search across all device attributes:

    ```swift
    
    var filteredDevices: [SimpleMDMDevice] {
    
        var result = devices
    
    
    
    
        if showOnlyNeedingUpdate {
    
            result = result.filter { $0.needsUpdate }
    
        }
    
    
    
    
        if !searchText.isEmpty {
    
            result = result.filter { device in
    
                device.name.localizedCaseInsensitiveContains(searchText) ||

                device.deviceName.localizedCaseInsensitiveContains(searchText) ||

                device.serial.localizedCaseInsensitiveContains(searchText) ||
    
                // ... and more fields
    
            }
    
        }
    
    
    
    
        return result
    
    }
    
    ```

    Type a serial number, see the device instantly. Toggle “Needs Update” to focus on out-of-date machines. Sort by most columns with a click. Note: I did run into a limitation with the number of sortable columns in the Swift code; after many iterations and trials I eventually found something that worked. Yeah, Swift!

    **Automatic Refresh with Progress**

    The shell script required manual execution. The app handles refresh automatically:

    – Background refresh respects the 24-hour cache

    – Progress indicators show API fetch status

    – Force refresh option bypasses cache when needed

    – Errors display in-app with clear messaging

    What Stayed the Same (Intentionally)

    Certain aspects of the shell script were functional and useful, so they were carried over into the app:

    **Export Format Compatibility**

    The CSV exports use the exact same format as the shell script:

    ```csv
    
    "name","device_name","serial","os_version","latest_major_os",
    
    "needs_update","product_name","filevault_status",
    
    "filevault_recovery_key","sip_enabled","firewall_enabled",
    
    "latest_compatible_os","latest_compatible_os_version","last_seen_at"
    
    ```

    Users who had automated workflows processing these CSVs didn’t need to change anything.

    **Output Directory Structure**

    Files still export to `/Users/Shared/simpleMDM_export/` with the same naming convention:

    ```
    simplemdm_devices_full_2025-12-11_1430.csv
    simplemdm_devices_needing_update_2025-12-11_1430.csv
    simplemdm_supported_macos_models_2025-12-11_1430.csv
    simplemdm_all_devices_2025-12-11_1430.json
    ```

    **Cache Duration**

    The 24-hour cache validity period was retained. It’s a sensible balance between API rate limiting and data freshness for device management.

    **SOFA Integration Logic**

    The algorithm for matching devices against SOFA feed data remained identical:

    1. Build a lookup table of latest OS versions by major version

    2. Match each device’s hardware model against SOFA’s compatibility data

    3. Determine both “latest for current major” and “latest compatible overall”

    This dual-version approach is valuable for planning: devices might be current on macOS 13.x but capable of running macOS 15 or macOS 26. It’s good to know.
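    Since the app ports this logic straight from the shell script, step 1 can be sketched in bash with jq (the SOFA field names here are from memory, so verify them against the live feed):

    ```bash
    # Build a "major OS -> latest ProductVersion" lookup from the SOFA macOS feed.
    curl -s "https://sofafeed.macadmins.io/v1/macos_data_feed.json" |
      jq -r '.OSVersions[] | "\(.OSVersion)\t\(.Latest.ProductVersion)"'
    ```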

     Technical Highlights

     Security Model

    The app runs fully sandboxed with carefully scoped entitlements:

    ```xml
    
    <key>com.apple.security.app-sandbox</key>
    
    <true/>
    
    <key>com.apple.security.network.client</key>
    
    <true/>
    
    <key>com.apple.security.files.user-selected.read-write</key>
    
    <true/>
    
    ```

    API keys use Keychain with `kSecAttrAccessibleWhenUnlocked`, meaning they’re protected when the Mac is locked.

     Data Flow

    1. **Launch**: Load API key metadata from UserDefaults, actual keys from Keychain

    2. **Refresh**: Check SQLite cache validity, fetch from APIs if needed

    3. **Process**: Merge SimpleMDM and SOFA data using the same algorithm as the shell script

    4. **Cache**: Store in SQLite with timestamp and API key association

    5. **Display**: Render in SwiftUI Table with reactive filtering

    Test Mode Feature

    A unique addition not in the shell script: a test mode that generates dummy devices for demonstrations and screenshots:

    ```swift
    
    func toggleTestMode() {
    
        testModeEnabled.toggle()
    
    
    
    
        if testModeEnabled {
    
            let demoEntry = DummyDataGenerator.createDemoAPIKeyEntry()
    
            apiKeys.insert(demoEntry, at: 0)
    
    
    
    
            let dummyDevices = DummyDataGenerator.generateDummyDevices(count: 15)
    
            // ... process with real SOFA data
    
        }
    
    }
    
    ```
    
    
    

    This allows testing the full UI without a SimpleMDM account—perfect for App Store screenshots or demos. And it turned out to be a requirement for the App Store review process, since the alternative was giving reviewers API keys to real data to test with, which I could not do, of course, and which brought us to generating test data. A perfect plan.

     Lessons Learned

    What Worked

    **Preserve the Core Logic**: The shell script’s API handling, caching strategy, and data processing were good enough and worked. Porting them to Swift rather than redesigning saved time and avoided regressions.

    **Prioritize Security from Day One**: Building Keychain integration first made everything else easier. API keys are sensitive, and getting that right early prevented technical debt.

    **SwiftUI for Rapid UI Development**: Building the table view, settings panel, and navigation in SwiftUI was dramatically faster than AppKit would have been. And since my prior experience was using an app like Platypus for simple app creation, SwiftUI was definitely more flexible, and possible with help from current tools.

    What Was Challenging

    **Async/Await Migration**: The shell script’s sequential curl calls had to become proper async Swift code with structured concurrency.

    **SQLite in Swift**: While more powerful than file caching, setting up proper SQLite bindings and schema management added complexity. And app sandbox rules moved the location of the cache, which added a wrinkle in testing.

    **Tab-Based Multi-Account UI**: The shell script only handled one API key. Designing an intuitive interface for switching between multiple accounts required several iterations.

     Performance Comparison

    **Shell Script**:
    
    - Initial fetch: ~8-12 seconds for 100 devices
    
    - Subsequent runs (cached): ~2-3 seconds
    
    - Search: N/A (requires opening CSV)
    
    
    
    
    **Swift App**:
    
    - Initial fetch: ~8-12 seconds for 100 devices (same API calls)
    
    - Subsequent launches: <1 second (SQLite cache)
    
    - Search: Real-time (indexed database queries)
    
    - Switching API keys: Instant (cached data)

     The Result

    The final application preserves everything that made the shell script valuable while transforming the user experience:

    – **Same data, better access**: All the security metrics, none of the manual CSV searching

    – **Same exports, more secure**: Identical CSV format, Keychain-protected credentials

    – **Same caching, faster searches**: 24-hour cache retained, SQLite indexed queries added

    – **One account to many**: Support for unlimited SimpleMDM accounts. Good for testing.

    – **Terminal to GUI**: From command-line to native macOS interface

    The app isn’t just a shell script wrapped in a window—it’s a giant leap into Swift app production, one that challenged me enormously when it came to troubleshooting and app testing. This app is a small step in the code adventures that await us all when we want to take an idea from shell code to Mac app.

     Future Enhancements

    While the current version achieves feature parity and then some, there’s room to grow:

    – **Scheduled Auto-Refresh**: Background fetching on a schedule

    – **Push Notifications**: Alerts when devices fall out of compliance

    – **Export Automation**: Scheduled exports to specific directories

    – **Custom Filters**: Save filter configurations for different report types

    – **Device Groups**: Tag and organize devices into custom categories

    – **Trend Analysis**: Historical tracking of fleet compliance over time

    This is not the end

    The journey from shell script to native app demonstrates that nerd-sniping does work and we can be pushed to try new things. The shell script’s core logic—its API handling, caching strategy, and data processing—was already ok, somewhat decent, and at least functional. The leap to all Swift was about making that functionality more accessible, more secure, while making testing and troubleshooting more difficult and confusing, but also a valuable learning opportunity. Xcode 26.1 has some basic code fixing abilities that we tested many times. It helped!

    For IT administrators managing Mac fleets, the app delivers what the script did (device security monitoring and reporting) with what the script couldn’t (multi-account support, instant search, secure credential storage, and a native interface).

    The script’s final TODO comment has been fulfilled:

    ```bash
    
    # to do: make into a native swift/swiftUI app for macOS
    
    # with better UX saving multiple API key entries into
    
    # the keychain with a regular alias
    
    ```

    ✅ Done.

    **Simple Security Check** is available for macOS 15.0 (Sequoia) and later from the Mac App Store. The original bash source code and architecture documentation can be found in the project GitHub repository.

    *Built with SwiftUI, powered by the same bad logic that served IT admins well in its shell script form, now with the wild woodland scent of a native macOS application.*

  • Don’t stop!

    Using a REST API to code a bunch of useful apps

    Background: I released some scripts on code.matx.ca to help users manage Archiware P5 servers, with a first blog post as background on the scripting approaches of CLI vs REST API, and then I discussed my cute Platypus-built apps and my first Swift/SwiftUI Mac apps… so of course now I’m going to discuss two new Mac apps that use a REST API to explore a P5 Archive.

    I’m building tools to help me and my clients work with Archiware’s amazing and awesome P5 Archive product. It’s great. It archives, it restores, it has a great web UI, and it’s always getting better. So why build apps? Sometimes you want something different; in my case, my clients wanted spreadsheets. Yup. Data in a sheet. To look at. So I poked at P5’s databases via the CLI (see P5 Archive Manager) and with the REST API. Here are the results: two new apps, P5 Archive Overview and P5 Archive Search.

    https://code.matx.ca/ is code on GitHub + Mac apps that help manage data in Archiware P5

    Why API? And not cli

    Usually the CLI (command line interface) is perfect for working in Terminal and in shell scripts or other programming languages. An API, or application programming interface, allows different software applications to communicate and share data with each other. Instead of CLI commands and arguments, programs make requests using specific methods like GET or POST to retrieve or send data. See: API on Wikipedia

    Using an API for my Mac apps means they could use a protocol like HTTP, and make a request using GET (to retrieve data).

    The magic parts of an API

    • API Client: The application making the request.
    • Endpoints: Specific URLs where the API can be accessed.

    For the P5 Archive Overview app we need to use the specific API endpoint for the archive overview, detailed by Archiware here, which lucky for us is a simple call for data that our app can display, save as JSON, format as CSV (for the spreadsheet!!), and stash in a SQLite database for historical searches.

    However, the P5 Archive Search app has to make many calls to walk through the index tree which is an inventory by path of the files archived. So we ask the user to name the storage they want to search and we make a breadcrumb through the storage, storing everything in SQLite as well as saving json and csv snippets of what we find. A lot more API calls but perfect for an app to do in the background.

    The P5 Archive Search app queries the P5 Archive Index Inventory REST API:

    ```
    GET http://{server}:{port}/rest/v1/archive/indexes/{archive-index}/inventory/{path}
    ```

    Example:

    ```
    http://192.168.1.100:8000/rest/v1/archive/indexes/Default-Archive/inventory/Volumes/BigStorageSMB
    ```
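    For example, one level of that walk could be fetched with curl along these lines (illustrative only: authentication and the exact response fields depend on the P5 server’s REST setup and are omitted here):

    ```bash
    server="192.168.1.100"; port="8000"
    index="Default-Archive"
    path="Volumes/BigStorageSMB"

    # Fetch one inventory level and stash the raw JSON with a timestamped name.
    curl -s "http://${server}:${port}/rest/v1/archive/indexes/${index}/inventory/${path}" \
         -o "p5_inventory_$(date +%Y-%m-%d_%H%M).json"
    ```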

    Made for Macs, macOS 14 and up. Bring your own jq

    When running macOS 15 and up, jq is installed for free and can help make the CSV files from the JSON, but if you’re running macOS 14 then you need to install it with Homebrew, MacPorts, or on your own.

    One funny story from early testing: I had to fix the jq detection in my first version because it totally missed reading the Unix path correctly, and yeah, thanks to a friend on the MacAdmins Slack who hit the issue when trying it out, I was able to go back and fix that. Friends help friends make better code.

    I’m not done making apps, and I’ll keep tweaking the three I have so far and making new ones from the scripts I have. The goal is to manage data, get a better understanding of the data in the archives, and let the clients and owners of the data know what they have.

    Stay tuned.

  • Swiftly make an app

    This is a blog post about an app; there are many like it, but this one is mine. How I made a Swift app in Xcode but took the long way ’round to get there. And an even longer post to tell the story.

    P5 Archive Manager, an app I created with AI tools to help check files archived with Archiware P5

    I recently posted about some scripts I released on GitHub to help other admins using Archiware P5 archive software to manage their servers and the data in the archive vault.

    Claude: “How can I help you today?”

    Using free credits in various AI tools will get you pretty far, but how far, really? Can you make an app? A useful app? Maybe, yes. Encouraged to transform some simple shell script projects into a “proper” Swift/SwiftUI based Mac app, I started using Claude as a test. The result is two apps for checking Archiware P5’s archive, drag-and-drop tools that verify files against local or remote servers. Code and apps on GitHub.

    # Path to nsdchat
    chatcmd="/usr/local/aw/bin/nsdchat -c"

    Working with Microsoft’s Visual Studio Code app with Copilot (tied to my GitHub account), and occasional ChatGPT coding sessions when Copilot ran out of free credits, I was surprised that I made a lot of progress on some more complicated scripts.

    ChatGPT: “Ask anything”

    It all started with shell scripts to help me automate some tasks with managing data and eventually I wanted to create a Mac app (or two) for my clients and me to use. I like scripts, but I have so many. And solving a problem in terminal and sifting through all those amazing scripts became more and more complicated, so maybe I needed a nice app that you can click and launch and be done with it to do that one thing. Single minded apps for single purpose objectives.

    Platypus created app. Check if files are archived by drag and drop

    I asked my friendly AI super tool to suggest a way to make an app from a shell script, to see if it had any ideas, and it did have a few, although Mac-specific ones were not plentiful. Looking through a list of possible methods, it suggested I use the excellent and awesome Platypus app. Now that’s a name I haven’t heard in a while.

    Platypus app UI. Pick a script, choose an interface and create an app.
    #!/bin/sh
    
    alias nuke="/Applications/Nuke5.2v1/Nuke5.2v1.app/Nuke5.2v1"
    export NUKE_PATH="/Volumes/XSAN/District9/Foundry/Nuke/"
    export OFX_PLUGIN_PATH="/Volumes/XSAN/District9/Foundry/OFX/"
    
    /Applications/Nuke5.2v1/Nuke5.2v1.app/Nuke5.2v1

    I used it many years back in VFX when working with Nuke and other pipeline tools, but now I really needed it. Lucky for me it is still around and still works great. So I built a bunch of small Mac apps wrapped around simple shell scripts. Many of my scripts for P5 acted on a path (i.e., what folder of files do you need to examine), and making a drag-and-drop app in Platypus was incredibly easy. Add swiftDialog in the mix and you get nice messages to communicate progress.

    Platypus app with embedded files, including swiftDialog.

    These Platypus-born simple apps worked well, but I wanted to sign and notarize them, and honestly that’s always fun, in part because Apple, like every company, changes the way to do it all the time. So I often stumbled at this step and looked for more helper apps and guidance.

    I previously used SD Notary Tool and while I had success in the past I got stuck somewhere in the process and couldn’t figure it out. Then I remembered someone had posted in the MacAdmins Slack about a cool new app they built called Signaro to help sign apps. I had looked at it and was initially confused (lots of buttons and options!) but now I needed it. So I tried again and it really helped.

    Signaro app for signing Mac apps, and notarizing and the whole app distribution workflow

    At first I was stuck at the same step, and it was the app-specific password that was my first main issue. I couldn’t figure it out, so I re-created it with Apple and still nothing, then one more time and for some reason it worked. Success!

    Small side story: I did find a small bug in Signaro when entering the AppleID and App password that I reported and it was fixed right away. In the process of submitting the bug I chatted with the author and I learned about the keychain profile option to make this step easier so thanks again for the helpers in this community.

    xcrun notarytool store-credentials MacVFX
    
    This process stores your credentials securely in the Keychain. You reference these credentials later using a profile name.
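    With the profile saved, submitting and stapling from the command line looks roughly like this (the app and archive names are placeholders):

    # Submit the zipped app for notarization, wait for the result, then staple
    # the ticket to the app bundle.
    xcrun notarytool submit MyApp.zip --keychain-profile MacVFX --wait
    xcrun stapler staple MyApp.app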

    Another confusing part of signing and notarizing is that there are so many certificates you need, so I was super pleased to have Signaro identify that I was using an incorrect cert and suggest how to fix the issue. The step-by-step process and the clear messaging are a big win in this app. It’s a Swiss army knife for signing, notarizing, making DMGs, and all the steps for preparing your app for distribution. Huge shout out to the author.

    Signaro app certificate check.

    So I’m pretty set up: I have all the free AI coding tools to smash on my bash shell scripts, I have Platypus for creating the Mac app with bundled swiftDialog and other assets, and lastly a smooth signing and notarizing workflow with Signaro to distribute the apps. What else do I need? What about a native Swift app built in Xcode?

    Again talking with the friendly author of Signaro, he suggested: why not make a Swift Xcode app with some AI super tool? Hmm, I thought. OK. And 3 hours later I’d built something that didn’t work and my “free credits” had run out. Surprisingly, Xcode has its own AI tool helper built into macOS 26, and I found it quite useful for fixing small issues.

    Xcode has built in AI tools to fix small mistakes.

    When I ran into my first coding road block and free AI credit exhaustion, I also realized I needed lunch and some fresh air, even maybe a break from coding at my desk. As I took a short break I realized I could troubleshoot the old fashioned way and figure out what was not working while I waited for more “free credits” to re-materialize (or I could sign up!). After food and rest it only took me a few minutes to realize that my app did in fact work correctly, but I was testing it incorrectly!

    So maybe you need fresh air and a break from the keyboard occasionally. Truth time. I finally did sign up for a month to test Claude out some more and it is great, but it also gets tired and reaches a maximum length of a conversation. Everyone needs fresh air.

    Even with a simple app, and before running out of credits, you can reach the maximum conversation length.

  • Archiware P5 scripts

    Background: I manage a few backup and archive servers for clients, and these run the Archiware P5 suite of software (archive, backup and sync). To help manage these servers over the years I’ve written some P5 monitoring tools for Watchman and MunkiReport, as well as helper scripts using the cli tools or, more recently, the API.

    In an effort to share some simple examples of what is possible, I have organized a few samples from my GitHub repos on the code.matx.ca page with some useful descriptions and text about the usage and purpose of the scripts. They are hosted in a repo here:

    https://github.com/macvfx/Archiware

    I have more scripts in my repo of general and specific interest to P5 users or anyone managing files. See this post on find scripts.

    The P5 code toolbox

    The following P5 scripts are just a few examples, and I have more to share if there’s interest. I created most of these simple tools to run on the P5 server directly, but I have since created versions for my clients which run from anywhere. Also, in some cases, a few scripts have now been built into easy-to-use Mac applications where it makes sense. If you want some help, or you want to hire me to help with these things, please reach out.

    The scripts are in three categories: 

    1) P5 archive intelligence (all archive jobs from Db exported as a spreadsheet, or get recent archive jobs via REST API),

    2) P5 housekeeping (make all full tapes read only, show all appendable archive tapes), and 

    3) P5 info backup (export all volumes into one csv, and export all volumes inventory as TSV with barcode as the name)

    P5 Archive Intelligence

    What do I mean by “archive intelligence”? Simply, I want to know about everything I’ve archived. One should consider the P5 Archive server as the ultimate source for all things archived but in some cases my clients don’t use the P5 server directly, or they want the information organized differently, like in a spreadsheet. And while the Archiware P5 suite of software is ever evolving, growing and adding features (even some lovely visual dashboards in v.7.4) I have been attempting to solve the perennial question of “what do we have archived?” in better and more useful ways.

    P5 Information Backup

    Related to archive intelligence is knowing what is in my P5 archive system entirely. I modified a shell script provided in the Archiware cli manual to output a csv listing all P5 volumes in the tape library (aka jukebox), so that I know what is in the system at all times; even if my P5 server is not running, I have a record of every tape. This is one of the scripts I run with my periodic and backup workflows, but more on my own special P5 backups (backups of the P5 Db and other metadata) in another post. The Archiware-provided P5 volume list script inspired my own script to list full and appendable archive tapes, which I have as a one-click desktop app for my clients. When they want to restore something, P5 will tell them what to put in the tape library, but if you have a lot of tapes maybe you also want to know what to remove, so I give them a list of candidates (i.e. take out the full tapes, and leave the appendable archive tapes). Helpful, yes.

    P5 Inventory

    There is a P5 cli command to export the complete inventory, and depending on how long you’ve been archiving and how much is in your archive, this tool can take a long time to export a list of every file ever. And because of my mostly non-advanced super skills, sometimes I’d find the process would time out. (There are ways around this, but that’s another post.) Basically: too much archive! When it didn’t error out I had a big file… so one day a friend of mine suggested we use a Jupyter notebook, and yes, Python!, to do some data analysis. A really fun project, great tool, but this is a hard problem to solve. We made a thing, it worked for a while, then I wanted to find a better way. People liked my bar graphs and total amount archived, but they also wanted spreadsheets. So let’s give them what they want.

    Two (or three) approaches:

    • cli
    • api
    • db

    I like lowercase acronyms, but let’s explore further.

    cli

    Using the cli (command line interface), usually in a shell script (but also in clever Mac apps), typically means running the script with the Archiware P5 cli (nsdchat) locally on the P5 server, and certainly this is what I did when I was testing scripts and various tools. It makes sense if you’re administering a server: you remote in (ssh or screen share) and that’s where you start. After a while I discovered the trick to make these cli-based scripts run from anywhere, which was handy if I wanted to connect to all my P5 servers at once in a script, or have my client use an app on their desktop which talks to the P5 server. More on awsock in another post.

    As an example of a cli script using nsdchat, I have a script that takes the inventory contents of each archive tape (LTO) and writes it to a TSV file (tab separated values), which is like a CSV (comma separated values).

    nsdchat Volume names

    Give me a list of P5 volumes (i.e. tapes), tell me which ones are archive tapes, which are read-only, and what the barcode is.
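
    A minimal sketch of that idea, run on the P5 server itself. The Volume names command is the one mentioned above; the per-volume attribute queries (barcode, usage) and the install path are from memory, so treat them as assumptions and check the Archiware CLI manual for the exact method names on your P5 version:

    #!/bin/bash
    # Sketch: list P5 volumes and a couple of attributes as TSV.
    # Assumes it runs locally on the P5 server; the attribute method names
    # (barcode, usage) should be verified against the Archiware CLI manual.
    NSDCHAT="/usr/local/aw/bin/nsdchat"   # typical install path; adjust if different

    printf "volume\tbarcode\tusage\n"
    for vol in $($NSDCHAT -c Volume names); do
        barcode=$($NSDCHAT -c Volume "$vol" barcode)
        usage=$($NSDCHAT -c Volume "$vol" usage)
        printf "%s\t%s\t%s\n" "$vol" "$barcode" "$usage"
    done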

    api

    Ok, instead of a cli command dependent on the nsdchat binary installed somewhere, we are doing http (web) magic with the API (application programming interface): a set of known commands in a path, based on GET, PUT, POST, DELETE. The API has a different way of doing things than the cli, but you can ask a lot of the same basic questions.

    In my api-archive-overview script I am sending one command and then using jq to select elements and organize the info into a csv (spreadsheet). This example is set to run locally, but it is easily modifiable to run from anywhere. For one client I have a script that talks to every P5 server, each in a different city, asks them all what they’ve been archiving, and then organizes it all into one spreadsheet. It’s fun, and it’s helpful.
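
    The pattern looks roughly like this. The endpoint path, port, and JSON field names below are placeholders, not the real P5 API routes (check the Archiware API documentation for those), but the GET-then-jq-to-CSV shape is the whole trick:

    #!/bin/bash
    # Sketch only: the endpoint path, port, and field names are placeholders,
    # not the actual P5 API; consult the Archiware API docs for real routes.
    P5_HOST="localhost"
    P5_PORT="8000"
    OUT="/Users/Shared/p5-archive-jobs.csv"

    echo "job_id,label,size,completed" > "$OUT"
    curl -s "http://${P5_HOST}:${P5_PORT}/api/example/archive/jobs" \
      | jq -r '.jobs[] | [.id, .label, .size, .completed] | @csv' >> "$OUT"

    echo "Wrote $OUT"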

    db

    The best for last. I mentioned above my attempts to use the inventory command, which itself goes to all the relevant databases and gathers all the requested data about every archived file, its size, when it was archived, and so on. Yeah, that’s one way to do it. I’ve shown two examples above for the cli and api, but a third is to just talk to the database directly. This is an advanced technique and should only be attempted by an expert. Ok, I’m kidding. As long as you’re not writing to the database and only reading from it, this is pretty safe. What you do with the info is another magic trick, and one which I’ve been working on. Two db examples are: dump all jobs to a csv file, and dump only archive jobs. I’ve got a more advanced script which takes the data and uses sql commands to organize it into a csv of how much data was archived per day, per week, per month, per year, plus totals, which is nice for people who like spreadsheets and want to know about everything ever done. Caution: once you look into the Db you’ll see a lot of things, and sorting through it takes time. I found when making more advanced and selective scripts that the cli jobs used by the very old P5 Archive app (by Andre Aulich), for example, showed up as system jobs, not archive jobs, so you have to be careful if you want to include those. Have fun.

    P5 Housekeeping

    Finally, some housekeeping scripts are included in the example repo, like the script to mark all archive tapes that are “full” as “read only”, which is handy if you also have a script that only exports the contents of archive tapes marked read-only. So many little scripts to do little things.

    P5 Archive Prep scripts

    There’s another category of scripts which I haven’t elaborated on, but I do have a few examples in my repo: scripts to prepare or examine files and folders before they are archived to LTO with P5 Archive. These scripts do various things like check for trailing spaces in names or check file name length, but maybe the most important ones take the path of the archived projects and create html maps, file-size directory listings, and spreadsheets (again!) of the exif data of all files to keep for the future. Clients do refer to the archive stub files (p5c), but they also find it handy to see the directory map and the file size of archived items without going into the P5 server. I’m not trying to replicate the P5 server, or replace it, but this falls under p5 housekeeping and p5 information backup. A rough sketch of the pre-archive report idea follows.
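
    Here is that rough sketch, assuming exiftool is installed (that is one option for the exif spreadsheet; the real scripts may use something else) and a placeholder project path:

    #!/bin/bash
    # Sketch: build a simple pre-archive report for a project folder.
    # exiftool is an assumption (any metadata extractor would do);
    # the project path is a placeholder.
    PROJECT="/Volumes/Projects/ClientA_Job42"
    REPORT_DIR="/Users/Shared/archive-reports/$(basename "$PROJECT")"
    mkdir -p "$REPORT_DIR"

    # File-size directory listing (sizes in KB, largest first)
    du -k "$PROJECT" | sort -rn > "$REPORT_DIR/folder-sizes.txt"

    # Plain file listing with sizes, for the spreadsheet crowd
    find "$PROJECT" -type f -exec ls -lh {} + > "$REPORT_DIR/file-listing.txt"

    # EXIF metadata of all files as CSV (only if exiftool is present)
    if command -v exiftool >/dev/null 2>&1; then
        exiftool -csv -r "$PROJECT" > "$REPORT_DIR/exif-data.csv"
    fi

    echo "Report written to $REPORT_DIR"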

    That’s enough for now. If you’ve been reading and following along, let me know if you have any questions or want any help with P5 or other related projects. If you have better ways to do these things, feel free to share. My scripts are always evolving and I love to learn.

    Reference:

    For more info on Archiware P5 scripting and building code to interact with it, I’d recommend checking out the main P5 manual, the CLI (command line interface) manual, the API documentation, and the knowledge base (support), as well as the sample scripts, the Archiware blog, and the video series generally.

  • Add header here

    Or remove it, up to you.

    Had some fun creating a longer script to add a text header to some shell scripts; then, because I wrote the wrong thing to all my shell scripts, I had some more fun tweaking my script to find and remove the header. I’ve added it to my GitHub repo with a couple of other scripts based on the find command, one of my favourite unix tools since it is so handy.

    The script that should be a Unix one-liner: add (or remove) a header

    Some of the other example scripts based on find might be of interest, such as the following.

    File Name Check

    Especially important with certain filesystems (certain encrypted filesystems) that have file name “length limits”. So why not check for these files, zip them up, and put them aside for safe keeping. In practice, the only files which push this limit are downloaded (purchased) from stock photo sites, which write every keyword into the file name. Nice, but why can’t we have standard metadata handling these days? (I can dream!) A quick sketch of the check is below.
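
    A minimal sketch of that check, with a placeholder limit and search path (adjust both for the filesystem you actually care about):

    #!/bin/bash
    # Sketch: report files whose names exceed a length limit.
    # The limit and search path are placeholders; adjust for your filesystem.
    LIMIT=200
    SEARCH_DIR="${1:-/Volumes/Projects}"

    find "$SEARCH_DIR" -type f | while IFS= read -r path; do
        name=$(basename "$path")
        if [ "${#name}" -gt "$LIMIT" ]; then
            echo "${#name} chars: $path"
        fi
    done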

    Archiware P5

    The last two scripts I made with Archiware P5 in mind, as I manage many servers for clients with P5 Archive and I really do love this software and the team. More Archiware P5 inspired scripts are in other repos here or on my main P5 code site.

    Find A Trailing Space

    In this case, besides it just being nice to clean up folder names with invisible trailing spaces before the archive job, it was also necessary when using the P5 Companion (desktop) app, which will not archive a top-level folder with a space at the end of its name. A sketch of the cleanup follows.
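
    Something like this, as a minimal sketch with a placeholder path; I would leave the rename commented out until the echoed results look right:

    #!/bin/bash
    # Sketch: find top-level folders with a trailing space in the name and rename them.
    # The path is a placeholder; uncomment the mv once you trust the output.
    ARCHIVE_ROOT="/Volumes/ToArchive"

    find "$ARCHIVE_ROOT" -maxdepth 1 -type d -name "* " | while IFS= read -r dir; do
        trimmed=$(printf '%s' "$dir" | sed 's/[[:space:]]*$//')
        echo "Would rename: '$dir' -> '$trimmed'"
        # mv "$dir" "$trimmed"
    done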

    Make (Only) One Thumbnail

    This script makes only one image thumbnail per RDC folder, since RDC folders normally contain a lot of R3D files which are all part of the same shot. I don’t want a lot of duplicates; one is enough.

    And while technically P5 Archive can make thumbnails and proxy videos while it is archiving (and I do use this feature), making proxies of RED files is an intensive process for older computers, which means taking a long time, so pre-processing these R3D files ahead of time on faster computers can make the final archive job quicker. Part of the pre-processing before archiving to LTO (or wherever) is making sure formats like R3D (aka RED) files have a thumbnail, which then ends up in the Archive index that P5 Archive creates. The selection logic looks roughly like the sketch below.
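
    A sketch of the “one per RDC folder” selection, with a placeholder source path; the actual thumbnail conversion depends on whatever R3D-capable tool you have, so it is left as a commented placeholder:

    #!/bin/bash
    # Sketch: pick the first R3D file in each .RDC folder and hand it to a converter.
    # The source path is a placeholder; the converter call is intentionally left out.
    SOURCE="${1:-/Volumes/Footage}"

    find "$SOURCE" -type d -name "*.RDC" | while IFS= read -r rdc; do
        first_r3d=$(find "$rdc" -maxdepth 1 -type f -iname "*.R3D" | sort | head -n 1)
        [ -z "$first_r3d" ] && continue
        thumb="${rdc}/$(basename "${first_r3d%.*}").jpg"
        echo "Would create thumbnail: $thumb from $first_r3d"
        # Replace with your actual R3D-capable converter:
        # your_converter "$first_r3d" "$thumb"
    done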

  • Steal This Idea

    check all your Macs at once with SOFA feed

    Note: this blog post relates to the previous one, where I introduce the scripts that check SimpleMDM devices and compare them with the latest version info in the SOFA feed: Use the SOFA feed to check if SimpleMDM devices need updates.

    Ok, please steal this idea. The idea? To check all your Macs at one time, instead of each device, on device, one at a time.

    What do I mean? Well, when I first heard about the SOFA feed, which contains all the latest versions, I honestly didn’t know what to do with it, but soon after I realized that my clever script for checking the XProtect version, which I made into a custom attribute in SimpleMDM and added to the dashboard, was an incomplete idea.

    Ok, I’m smart: I got the XProtect version on each Mac by running a script, and then I got SimpleMDM to display it in a dashboard. But what’s missing? Context. Is it the latest version or not? So I added a SOFA check to the script, then made SimpleMDM display both the local version and the latest version so I’d know whether it was current. Great, right? Well, maybe.

    The problem, I realized, is that I wanted to do this for the macOS version too, because I wanted to share info with a client/manager etc. and realized the list of devices and macOS versions, for example, lacked the context of whether each was the latest and whether we should take action. That’s the point, right? Collect info, then do something about it if action is required. Update your macOS now.

    And then I wondered why I’m getting every Mac to ask itself what its macOS or XProtect version is, etc., when SimpleMDM was already asking a lot of those questions and putting the answers in a dashboard, accessible via API…

    Then it happened, the idea that should be stolen by SimpleMDM and all other management tools. Don’t just display info about a Mac’s macOS version; show the latest version next to it, because I want to know if it should be updated. And also show the latest macOS that Mac can upgrade to. Maybe it’s running macOS 13.6: is that the latest, or is it 13.7.7, no wait it changed again, it’s 13.7.8? And by the way the latest compatible upgrade is 15.6.1. Now that’s useful info.

    product_name | os_version | latest_major_os | needs_update | latest_compatible_os | latest_compatible_os_version
    Mac13,1 | 14.7.4 | 14.7.8 | yes | Sequoia 15 | 15.6.1
    MacBookPro17,1 | 14.6.1 | 14.7.8 | yes | Sequoia 15 | 15.6.1
    Mac13,2 | 15.6 | 15.6.1 | yes | Sequoia 15 | 15.6.1
    iMac21,1 | 15.5 | 15.6.1 | yes | Sequoia 15 | 15.6.1
    MacBookPro17,1 | 13.6 | 13.7.8 | yes | Sequoia 15 | 15.6.1

    References:

    Check SimpleMDM device list and compare macOS version vs SOFA feed latest

    XProtect check version compared to latest SOFA

  • Use the SOFA feed to check if SimpleMDM devices need updates

    I wrote a “simple” bash script to check the SimpleMDM device list via the API and see whether any devices need updates and/or are compatible with the latest macOS. Of course, it will output some CSVs for fun and profit. Send them to clients, managers, and security professionals, and be well.

    Note: It was a quick hack, and for reasons I made 3 output CSVs to test various presentations of the data; they combine the full SimpleMDM device list and match each macOS against available updates and max supported versions. There may be errors or omissions. Please test. Use and modify. I know I will. This is a test. Just a test.

    The script is in my GitHub repo

    Fetching SimpleMDM device list...
    Downloading SOFA feed...
    ✅ Exported:
      → Full device CSV: /Users/Shared/simplemdm_devices_full_2025-07-30.csv
      → Outdated devices CSV: /Users/Shared/simplemdm_devices_needing_update_2025-07-30.csv
      → Supported macOS per model: /Users/Shared/simplemdm_supported_macos_models_2025-07-30.csv
    ✅ Export complete.
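
    The core of the comparison is small. Here’s a minimal sketch, assuming the public SOFA macOS feed URL and the SimpleMDM devices endpoint; the JSON key paths are from memory, so verify them against the feed and the API docs (the full script in the repo handles pagination, the CSV output, and edge cases):

    #!/bin/bash
    # Minimal sketch: compare each SimpleMDM device's reported macOS version
    # with the newest version in the SOFA feed. Key paths are assumptions from
    # memory; check the SOFA feed and SimpleMDM API docs before relying on them.
    SIMPLEMDM_API_KEY="YOUR_API_KEY"   # placeholder
    SOFA_FEED="https://sofafeed.macadmins.io/v1/macos_data_feed.json"

    # Newest macOS version listed in the SOFA feed (assumed key path)
    latest=$(curl -s "$SOFA_FEED" | jq -r '.OSVersions[0].Latest.ProductVersion')

    # SimpleMDM device list (first page only; the real script would paginate)
    curl -s -u "${SIMPLEMDM_API_KEY}:" "https://a.simplemdm.com/api/v1/devices?limit=100" \
      | jq -r '.data[] | [.attributes.name, .attributes.os_version] | @tsv' \
      | while IFS=$'\t' read -r name os_version; do
          if [ "$os_version" = "$latest" ]; then
              echo "$name is up to date ($os_version)"
          else
              echo "$name needs attention: $os_version (latest is $latest)"
          fi
        done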
    

    References:

    SOFA MacAdmins Getting Started

    https://sofa.macadmins.io/getting-started.html

    https://github.com/macadmins/sofa/tree/main/tool-scripts

    SimpleMDM API docs

    https://api.simplemdm.com/v1#retrieve-one-dep-device

    squirke1977 / simpleMDM_API

    https://github.com/squirke1977/simpleMDM_API/blob/master/device_details.py

  • Dynamic Groups – SimpleMDM tricks and tips part2

    When we last left our hero, the big news was the discovery of custom attributes and running scripts to test for certain conditions in SimpleMDM, like “is the firewall on”, and posting the results in the main dashboard. That was all the excitement then; this year we present “dynamic groups”, which in combination with custom attributes, or by itself, ups the game to the next level. Keep up!

    What if we wanted to know the current version of XProtect across the Mac fleet? And what if this wasn’t collected by default by the MDM tool, in my case SimpleMDM? Well, I can write a script to collect this info; for my purposes I’ve chosen to use silnite from Howard Oakley of Eclectic Light Co fame and write the version number to a custom attribute. The next step is to use SimpleMDM’s new dynamic groups (in preview, at the time of this blog post), and then I can watch the results filter in with a special group watching for “is matching this version” or the opposite, “is not this version”. It just depends on what you want to act on or how you want to see the information. The new dynamic groups are the exciting part. I’m sooo excited.

    The custom attribute

    Screenshot

    Setting up a custom attribute of “XProtectV” with a default value of “Version Unknown” should be done before the script runs. If I see the default value, the script didn’t run or failed for some other reason.

    The code

    #!/bin/bash
    # Collect the local XProtect version with silnite and echo it so SimpleMDM
    # can capture it as the custom attribute value. A copy of the silnite JSON
    # output and a small log are kept in /Users/Shared for troubleshooting.
    LOG_DIR="/Users/Shared"
    DATE=$(date +"%Y-%m-%d_%H-%M-%S")
    LOG_FILE="$LOG_DIR/silcheck-log-$DATE.txt"
    /usr/local/bin/silnite aj > "/Users/Shared/silnite-xprotectv-$DATE.json"
    XPROTECTV=$(/usr/bin/plutil -extract XProtectV raw "/Users/Shared/silnite-xprotectv-$DATE.json")
    echo "$XPROTECTV" | tee -a "$LOG_FILE"
    

    The simple script writes a log into /Users/Shared just because I want to, uses the silnite binary to write out the XProtect info, and uses plutil to extract the version from the json. Note: you could also use jq, which ships with macOS 15, but plutil is more compatible across macOS versions for now. The XProtect version ends up as the custom attribute value, which SimpleMDM picks up and reports back to base.
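
    For what it’s worth, the jq variant of that extraction would look something like this (same json path as in the script above, and it assumes the jq that ships with macOS 15):

    XPROTECTV=$(/usr/bin/jq -r '.XProtectV' "/Users/Shared/silnite-xprotectv-$DATE.json")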

    The dynamic group

    Screenshot

    The filter headings are a little cut off in the screenshot, but it basically says: choose from all devices, refer to the custom attribute I set (XProtectV), make sure the value equals the latest version (5297 at the time of writing), and further filter results to devices last seen in the last day. If I switched it to “not equal to” version 5297, I would see all the devices that are not up to date. And it’s easy to change on the fly. It’s easier than refreshing the main device dashboard page to see these results, as I was trying to do previously, and that method made it hard to filter further.

    The exciting part

    Yes, the best part is to set up a job in SimpleMDM that runs the script on the devices to refresh the XProtect value (I have it set to recurring as well) and then watch the results roll into a dynamic group, whose members populate as the script runs and results report back. Easy peasy.

    Screenshot

    Addendum:

    Adding an example screenshot to show how you can change the filter from matching an exact value of XProtect, in this example, to “not equal to”, to see all the devices that haven’t upgraded yet. It’s as easy as changing the filter and clicking the “staging filter changes” button. Et voilà!

    Updated: May 16, 2025 – 19h00 local time