Apple will pay up to $1 million to anyone who finds a privacy flaw inside Apple Intelligence

Apple Intelligence logo on iPhone
(Image credit: Shutterstock)

Apple made a very big deal about Apple Intelligence's privacy credentials when it launched the AI suite earlier this year. There has been some skepticism about those claims, especially from people like Elon Musk, who took particular offense at Apple's partnership with OpenAI to bring ChatGPT to its devices. But now Apple is putting its money where its mouth is, launching the first Apple Intelligence bug bounty.

Specifically, Apple is inviting hackers to investigate the Private Cloud Compute (PCC) feature. While on-device AI is inherently more private because all the data stays on the phone, cloud computing is a different matter. PCC is Apple's attempt to fix that issue and offer cloud-based AI processing without compromising data security and user privacy.

In the case of PCC, Apple is offering various rewards depending on the issue reported, but the maximum has now been increased to $1 million. That sum is only available for “Arbitrary code execution with arbitrary entitlements” during a “remote attack on request data.” That should tell you how seriously Apple is taking this, or how confident it is that PCC is secure.


Tom Pritchard
UK Phones Editor

Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He's usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.