Tim Cook privacy speech to the IAPP sticks to generalities

As promised, Apple CEO Tim Cook delivered the headline address to the International Association of Privacy Professionals (IAPP), describing the fight for privacy as “one of the most essential battles of our time.”

Cook kept his remarks general, talking only about the importance of privacy and Apple’s own commitment to it – together with a lot of plugs for the company’s own privacy features. There was no update on the company’s CSAM scanning plans, which were put on hold due to privacy concerns.

Cook said that technology had the potential for both good and harm, and that it was up to all of us to choose the right one of the two possible realities in front of us.

Cook argued that privacy is not just a matter of safety and respect, but it’s also essential to allow people to be who they are, and to take risks.

Cook said Apple was fighting hard for privacy on multiple fronts. In a spin on the military-industrial complex expression – describing the way that governments and arms suppliers have a symbiotic relationship at the expense of humanity – the CEO described the threat from what he called the ‘data-industrial complex.’

He said that we would never accept such surveillance levels if it were visible – someone physically following us, watching us, photographing us – and called for a federal privacy law.

This was the only point in his speech interrupted by applause – and its impact was somewhat dulled when he followed it by repeating Apple’s objections to sideloading and to competing app stores.

You can watch the entire 14-minute speech and read the full transcript below.

Thank you. Good morning. It’s a privilege to be here with all of you in the nation’s capital. And I must say, it’s so nice to finally gather in person.

I want to thank the IAPP for bringing us together and for the vital work you do every day. And thank you, especially, to Trevor for your leadership.

The fight to protect privacy is not an easy one. But it is one of the most essential battles of our time. And we at Apple are proud to stand alongside all those who are working to advance privacy rights around the world.

As a company, we are profoundly inspired by what technology can make possible. But we know, too, that technology is neither inherently good nor inherently bad. It is what we make of it. It is a mirror that reflects the ambitions and intentions of the people who use it, the people who build it, and the people who regulate it.

Out of this paradigm have grown two disparate, coinciding realities: one where technology unlocks humanity’s full creative potential and ushers in a new era of possibility, the other where technology is exploited to rob humanity of that which is foundational: our privacy itself.

And that is a loss we cannot accept.

Because it is our privacy that gives us the freedom to explore different ideas. To indulge our curiosity. To dream big and take chances and make mistakes.

It is privacy that lets us be — and become — ourselves without being afraid that our every move will be seen, recorded, or leaked.

A world without privacy is less imaginative, less empathetic, less innovative, less human.

At Apple, that is not the world we want to live in.

We believe that privacy is a fundamental human right, one that is essential to our vision of a world where technology enriches people’s lives. And to help create that world, we are fighting for privacy in multiple areas of our work.

The first area is a familiar one. It is our commitment to protecting people from a data industrial complex built on a foundation of surveillance.

At this very moment, companies are mining data about the details of our lives. The shops and restaurants we frequent. The causes we support. The websites we choose to read.

These companies defend their actions as pure of intention, as the work of better serving us with more targeted experiences.

But they don’t believe we should have a real choice in the matter. They don’t believe that they should need our permission to peer so deeply into our personal lives.

Who would stand for such a thing if it were unfolding in the physical world?

Imagine a stranger following you as you take your child to school, holding a camera outside the driver’s side window, recording everything you do. Imagine you open your computer and the stranger is suddenly watching your every keystroke. You wouldn’t call that a service. You would call it an emergency.

In the digital world, it is one too.

So we’ve given our users the features they need to have more control over their private information.

We’ve given them the simple but revolutionary ability to decide for themselves whether apps can track their activity across other companies’ apps and websites.

We’ve given them the tools to shield their locations and hide their email addresses.

And we’ve given them greater peace of mind knowing that apps they download from the App Store are held to our strong privacy standards.

The second area is our battle against an array of dangerous actors — from sophisticated hackers and ransomware gangs to the everyday con artists who pervade our digital world.

We’ve long said that security is the foundation of privacy — because there’s no privacy in a world where your private data can be stolen with impunity.

Never before has this threat been more profound, or its consequences more visible.

From scams and social engineering attacks, to massive data breaches and targeted disinformation, the dangers we face do more than compromise our data. They compromise our freedom to be human. And there is nothing we take more seriously than safeguarding our users from the threat these attacks represent.

That’s why we minimize the amount of data we collect and work to maximize how much is processed directly on people’s devices. Because we know that centralized, readable data is vulnerable data — and we want to reduce the risk to our users.

It’s why personal data on iPhone is encrypted by default, and why health data, passwords, and home security camera recordings that people store on iCloud are end-to-end encrypted, so that not even Apple can look at them. That’s why we continue to stand up for encryption without backdoors — because we know that if you install a backdoor, anyone can use it.

And that’s why we’ve built such rigorous security protections into the App Store from the beginning, so that people can be confident they aren’t downloading malware onto their devices.

But I fear that we could soon lose the ability to provide some of those protections.

And that brings me to our third area of concern: regulations that could put our privacy and security at risk.

To be clear, Apple is in favor of privacy regulation. We have long been supporters of the GDPR and we applaud the many countries that have enacted privacy laws of their own. We also continue to call for a strong comprehensive privacy law in the United States. And we are grateful to all the global leaders who are working to advance privacy rights, including the rights of children in particular.

But we are deeply concerned about regulations that would undermine privacy and security in the service of some other aim.

Here in Washington and elsewhere, policymakers are taking steps, in the name of competition, that would force Apple to let apps onto the iPhone that circumvent the App Store through a process called sideloading.

That means data-hungry companies would be able to avoid our privacy rules, and once again track our users against their will.

It would also potentially give bad actors a way around the comprehensive security protections we’ve put in place, putting them in direct contact with our users. And we have already seen the vulnerability that creates on other companies’ devices.

Early in the pandemic, for example, there were reports of people downloading what appeared to be legitimate COVID tracing apps, only to have their devices infected with ransomware.

But these victims weren’t iPhone users.

Because the scheme directly targeted those who could install apps from websites that lack the App Store’s defenses.

Proponents of these regulations argue that no harm would be done by simply giving people a choice. But taking away a more secure option will leave users with less choice, not more. And when companies decide to leave the App Store because they want to exploit user data, it could put significant pressure on people to engage with alternate app stores. App stores where their privacy and security may not be protected.

Now, I want to make something very clear to all of you: Apple believes in competition. We value its role in driving innovation and pushing us all forward. And we appreciate that supporters of these ideas have good intentions.

But if we are forced to let unvetted apps onto iPhone, the unintended consequences will be profound.

And when we see that, we feel an obligation to speak up — and to ask policymakers to work with us to advance goals that I truly believe we share, without undermining privacy in the process.

We will continue to make our voices heard on this issue.

We will continue to advocate on behalf of our users and what they deserve.

And we hope all of you in the privacy community will join our efforts to make sure that regulations are crafted, interpreted, and implemented in a manner that protects people’s fundamental rights.

Because as much as we all stand to lose in a world without privacy, I also know how much we stand to gain if we get this right.

Today, the promise and potential of technology have never been greater. The innovation landscape across the globe has never been more exciting. And within our sight is a future where technology enables humanity to flourish like never before.

At Apple, we envision a future where technology inspires people to be healthier and more creative, where it opens up new avenues for learning and opportunity, and where it helps all of us connect more deeply with the people we love and the world that surrounds us.

It is a future where technology empowers people — without intruding into their lives — and serves as a unifying force for good.

And it is a future that, together, I believe we have the power to achieve.

As you may know, this year marks the 50th anniversary of privacy luminary Alan Westin’s landmark study, “Databanks in a Free Society.”

Westin concluded that while the erosion of privacy was a legitimate fear, it was not an inevitable consequence of technology.

“What is collected, for what purposes, with whom information is shared,” he wrote, “are all matters of policy choice, not technological determinism.” He said that, “Man cannot escape his social or moral responsibilities by murmuring feebly that ‘the machine made me do it.’”

In so many ways, our world today bears little resemblance to the world of a half century ago. But those words strike me as more relevant now than ever before.

This is a pivotal moment in the battle for privacy.

And as we look to the future, it is clear that technology will continue to shape our world.

But the impact that technology makes on society is not predetermined.

The loss of privacy is not inevitable.

And those of us who create technology and make the rules that govern it have a profound responsibility to the people we serve.

Let us embrace that responsibility.

Let us protect our data and secure our digital world.

And let us declare that privacy cannot and will not become a relic of the past.

Thank you so much for having me this morning. Thank you.
