Apple’s “Blockchain-ish” Approach to AI and User Privacy
Apple is taking a deep dive into AI with its new Apple Intelligence suite of features across iPhones, iPads, and Macs. While some requests will, with user permission, be routed to OpenAI’s ChatGPT, a top exec says Apple’s own AI services rely on a “blockchain-ish” model to ensure user privacy.
What Does It Mean?
Following Apple’s keynote presentation, its Senior Vice President of Software Engineering Craig Federighi and Senior Vice President of Machine Learning and AI Strategy John Giannandrea took the stage for a press interview about the consumer tech giant’s big AI push, moderated by content creator Justine “iJustine” Ezarik.
Ensuring User Privacy
Asked how Apple would keep customer information private once it leaves users’ devices, Federighi explained that requests sent to Apple servers are anonymized: IP addresses are masked, and the server itself is prevented from keeping a log of the information it handles. On top of that, an image of the server software will be shared publicly so independent security researchers can audit it, and user devices will only interact with servers running that auditable software.
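To make the anonymization step concrete, here is a minimal Swift sketch of the general split-trust pattern behind IP masking: the payload is encrypted end-to-end for the compute server and sent through a relay, so the relay can see the sender’s address but not the content, while the server can see the content but never the address. Everything here (the `relayURL` and `serverPublicKey` parameters, the key-agreement details) is an illustrative assumption, not Apple’s actual protocol.

```swift
import Foundation
import CryptoKit

// Sketch of the split-trust idea behind IP masking: the relay sees
// the client's IP but only ciphertext; the server sees plaintext but
// never the client's address. Hypothetical names, not Apple's API.
func anonymizedRequest(for plaintext: Data,
                       serverPublicKey: Curve25519.KeyAgreement.PublicKey,
                       relayURL: URL) throws -> URLRequest {
    // Encrypt end-to-end for the compute server with an ephemeral key,
    // so the relay cannot read the request contents.
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let sharedSecret = try ephemeral.sharedSecretFromKeyAgreement(with: serverPublicKey)
    let key = sharedSecret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                   salt: Data(),
                                                   sharedInfo: Data(),
                                                   outputByteCount: 32)
    let sealed = try ChaChaPoly.seal(plaintext, using: key)

    // The request goes to the relay, which forwards the opaque blob;
    // the compute server only ever sees the relay's address.
    var request = URLRequest(url: relayURL)
    request.httpMethod = "POST"
    request.httpBody = ephemeral.publicKey.rawRepresentation + sealed.combined
    return request
}
```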
A “Blockchain-ish” Model
“It’s a clever kind of blockchain-ish attestation log to make sure the iPhone will only trust the software that’s been publicly put out there,” Federighi said, adding that Apple will soon issue a white paper about its security model. “It’s a really extraordinary step up in terms of the level of trust you can place in server computing.”
What It Means for Users
As the company dives deeper into AI and the personal data it uses to deliver its services, he added, “it’s essential that you can know that no one—not Apple, not anyone else—would have access to any of the information used to process your request.”
Private Cloud Compute (PCC)
In an extensive security blog post on Monday, Apple shared details about Private Cloud Compute (PCC), the server system it designed to handle Apple Intelligence requests. As Federighi noted, PCC anonymizes user requests and then processes them on servers running publicly auditable software.
Verifiable Transparency
Apple calls that last point “Verifiable Transparency.” Sound familiar?
In essence, Apple will publish its server software images so that outside researchers can verify its security claims. It will then build a cryptographic check into the system, ensuring that devices only interact with servers running software that has been shared publicly for auditing.
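As a rough illustration of that check, the sketch below assumes the published audit material boils down to a set of allowed software measurements (SHA-256 digests) and shows a device-side policy that simply refuses any server whose attested measurement isn’t in that set. The names are hypothetical, and Apple’s real attestation flow is hardware-backed and considerably more involved.

```swift
import Foundation
import CryptoKit

// Hypothetical device-side gate: only trust servers whose attested
// software measurement appears in the publicly published set.
struct AttestationPolicy {
    let allowedMeasurements: Set<SHA256.Digest>

    /// The device refuses to send data unless the server's attested
    /// measurement matches software that was published for audit.
    func shouldTrust(attestedMeasurement: SHA256.Digest) -> Bool {
        allowedMeasurements.contains(attestedMeasurement)
    }
}

// Example: a log that has published exactly one software image.
let publishedImage = Data("pcc-image-v1".utf8)
let policy = AttestationPolicy(allowedMeasurements: [SHA256.hash(data: publishedImage)])

print(policy.shouldTrust(attestedMeasurement: SHA256.hash(data: publishedImage)))     // true
print(policy.shouldTrust(attestedMeasurement: SHA256.hash(data: Data("rogue".utf8)))) // false
```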
Public Keys and Nodes
“This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software,” the post reads, with a bullet point further emphasizing that Apple will be “publishing the measurements of all code running on PCC in an append-only and cryptographically tamper-proof transparency log.”
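The “append-only and cryptographically tamper-proof” property is the part that most resembles a blockchain, and it can be sketched in a few lines. The toy Swift hash chain below binds each entry to everything before it, so silently rewriting an old measurement breaks verification of every later entry. Production transparency logs (Certificate Transparency, for instance) use Merkle trees for the same guarantee; none of this should be read as Apple’s implementation.

```swift
import Foundation
import CryptoKit

// Toy append-only log: each entry's hash covers the previous entry's
// hash, so history can only be extended, never quietly rewritten.
struct TransparencyLog {
    private(set) var entries: [(measurement: Data, chainHash: SHA256.Digest)] = []

    // Appending binds the new measurement to all prior history.
    mutating func append(measurement: Data) {
        var material = Data()
        if let last = entries.last {
            material.append(contentsOf: last.chainHash)
        }
        material.append(measurement)
        entries.append((measurement: measurement, chainHash: SHA256.hash(data: material)))
    }

    /// Recomputes the whole chain; any retroactive edit surfaces here.
    func verify() -> Bool {
        var previous: SHA256.Digest?
        for entry in entries {
            var material = Data()
            if let previous { material.append(contentsOf: previous) }
            material.append(entry.measurement)
            guard SHA256.hash(data: material) == entry.chainHash else { return false }
            previous = entry.chainHash
        }
        return true
    }
}
```

An auditor who re-verifies the chain from the first entry can detect any tampering, which is exactly the property that lets a device trust that the log of published measurements hasn’t been altered after the fact.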
What It Means for Crypto-Natives
Public keys? Nodes? A “cryptographically tamper-proof transparency log,” of all things? It’s not difficult to understand why Federighi would call it a “blockchain-ish” approach. It’s also easy to see why crypto-natives are taking shots at Apple on social media for stopping short of calling it a blockchain when it apparently has so much in common with one.
Conclusion
Apple’s approach to AI and user privacy may not be a traditional blockchain, but the company is clearly drawing inspiration from the technology. By anonymizing user requests and running only publicly auditable software, Apple is attempting to build a system that is transparent and trustworthy. Whether that is enough to allay concerns about user privacy remains to be seen.
FAQs
Q: What is Apple’s “blockchain-ish” approach?
A: It’s a system that anonymizes user requests and runs only publicly auditable software on the servers that process them. It’s designed to ensure user privacy and raise the level of trust that can be placed in server computing.
Q: Is Apple’s approach a traditional blockchain?
A: No. While it shares some similarities with blockchain technology, it is a centralized system controlled by Apple.
Q: What is Private Cloud Compute (PCC)?
A: PCC is the server system Apple designed to handle Apple Intelligence requests while preserving user privacy. It anonymizes requests and processes them only on servers running publicly auditable software.
Q: What is Verifiable Transparency?
A: Verifiable Transparency is Apple’s commitment to publish the software images running on PCC so that independent researchers can audit them, backed by a cryptographic check ensuring that devices only interact with servers running that published software.
Q: Is Apple’s approach sufficient to alleviate concerns about user privacy?
A: That remains to be seen. The system is designed to be transparent and trustworthy, but some may still have concerns about the degree of control Apple retains over it.