iOS works hand-in-hand with iTunes to back up the contents of an iPhone, iPad, or iPod. By connecting an iOS device to the USB port of a PC or Mac running iTunes, it is possible to make a (nearly) complete backup of the files stored on that device. I say nearly because Apple wisely prevents backing up passwords and private health-related data unless the archive is encrypted.
Upon connecting an iOS device to a PC to which the device has not been previously synced, both the device and iTunes will require confirmation that you do in fact want each to trust the other.
That's a great idea: "trust" implies a lot. A trusted computer is allowed to copy information to and from the iOS device - uploading pictures, downloading music, backing up apps, syncing contacts.
What is missing from this picture though?
Enabling the iOS device to trust a new computer is a one-click operation - never was I asked to provide credentials to approve the trust relationship. As long as the iOS device is logged in and not screen locked, one click is enough to tell the iPhone or iPad that this computer can be trusted.
Is this a big deal?
Not really, but I can think of a fairly common scenario in which this model could be abused.
Have you ever lent your phone to a friend so they could make a brief phone call?
A decade or so ago, a cell phone did little more than make phone calls. If I borrowed a friend's phone, I might be able to make an unexpected toll call, or perhaps swipe a private phone number from their contacts, but that was about it. Modern smartphones, though, are treasure troves of private information - photos, email, SMS history, browser history, cached login information, apps with direct access to financial accounts - the list goes on.
If I borrow your iPhone under the guise of making a phone call, in 5 to 10 minutes I can access some data on the phone but am limited in how deeply I can search.
However, in 5 to 10 minutes I can USB tether to my computer, trust it, and make a full device backup which I can search at length later. Or in just a few seconds I can establish that device trust now, and later slip it off your desk to make a backup of the locked iPhone.
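To make the attack concrete, here is a sketch of that flow using the open-source libimobiledevice command-line tools rather than iTunes itself (assuming they are installed and a device is connected; the tool names are real, but the exact flags may vary by version):

```shell
# Step 1 (takes seconds, device must be unlocked): pair with the
# device. On the iPhone this triggers the one-tap "Trust" prompt
# described above - no passcode required to approve it.
idevicepair pair

# Step 2 (later, at leisure - the pairing record persists): pull a
# full backup of the device into a local directory for offline
# searching. Depending on iOS version, this can work even while
# the screen is locked, because the trust was already granted.
idevicebackup2 backup --full ~/device-backup
```

The point of the sketch is the asymmetry: step 1 is the only moment the attacker needs the unlocked phone in hand; step 2 needs only the USB cable and the previously established trust.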
A simple mitigating feature would be to require re-authenticating via password, PIN, or fingerprint before trusting a new computer. Trusting a new computer does not occur frequently - in fact, with any given device, you likely only do it once or twice over the lifetime of that device - so the extra step is very little inconvenience while providing an extra degree of privacy protection.
For a company bent on improving its security reputation, this would be an easy win. You have the option of requiring the PIN or password to make purchases in the App and iTunes Stores; why not have the same option before allowing your entire digital life to be copied off the device?
In the grand scheme of things, the ability to make a covert backup of another's iPhone isn't at the top of my list of worries. It requires physical access to an unlocked device, meaning I'd have to unlock my phone and let someone borrow it - not something I'm likely to do for someone I don't know and trust.
Still, it pays to understand how your trust can be abused. Keep this in mind the next time a friend asks "can I use your iPhone to make a call?"