In March, the organizers of a computer-security conference called CanSecWest challenged attendees to break into any one of five smartphones, among them Apple's popular iPhone. The perceived difficulty of the task, especially breaking into the iPhone, meant that few researchers made any attempt to hack the devices, and none succeeded.
Now two researchers hope to make things considerably easier for would-be iPhone hackers. Next month, Charles Miller, a principal analyst at Independent Security Evaluators, and Vincenzo Iozzo, a student at the University of Milan, in Italy, will present a way to run nonapproved code on Apple's mobile device at the Black Hat Security Conference, in Las Vegas.
Researchers have previously found vulnerabilities in the security of the iPhone; Apple disclosed and issued a patch for a dozen such security holes in the device last November. But even after such a flaw has been exploited, running a nonapproved program remains tricky. Because of that difficulty, many security researchers simply decline to spend much time looking for flaws in the device.
"If you want to attack iPhones, you have to be able to run code to do whatever it is you want to do," Miller says. "Maybe that is grabbing credentials, maybe it is listening into phone calls, maybe it is turning on the microphone. Who knows? But this all requires that you be able to run code."
"Charlie found those particular places where changing permissions is allowed on the factory iPhones," says Sergio Alvarez, a security consultant with Recurity Labs and a fellow iPhone hacker, who is familiar with Miller and Iozzo's research. "[These parts of the phone] make our lives easier and give us more freedom to code generic and reliable second-stage [attacks]."
The challenge for security researchers and malicious attackers is that Apple restricts the data that can be executed in the iPhone's memory and requires that programs for the iPhone be cryptographically signed by Apple. Code signing has security benefits, but it is also a way to control which applications run on the iPhone platform.
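To illustrate the gatekeeping idea behind code signing (and only the idea: Apple's actual mechanism relies on certificate chains and per-page hash checks enforced in the kernel, and every name below is hypothetical), here is a minimal sketch in C using libsodium's Ed25519 signatures. In this model, a loader agrees to run a binary only if its bytes verify against the platform vendor's public key.

```c
/*
 * Conceptual sketch of code signing as a loader gate -- NOT Apple's
 * actual scheme. A binary runs only if its signature verifies against
 * the vendor's public key; changing a single byte breaks the check.
 */
#include <sodium.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    /* Stand-in for the vendor's signing keypair. */
    unsigned char pk[crypto_sign_PUBLICKEYBYTES];
    unsigned char sk[crypto_sign_SECRETKEYBYTES];
    crypto_sign_keypair(pk, sk);

    /* Stand-in for an application binary, signed by the vendor. */
    const unsigned char app[] = "machine code for some app";
    unsigned char sig[crypto_sign_BYTES];
    crypto_sign_detached(sig, NULL, app, sizeof app, sk);

    /* The "loader": execution is allowed only if the signature holds. */
    if (crypto_sign_verify_detached(sig, app, sizeof app, pk) == 0)
        puts("signature valid: loader would map this code executable");

    /* Tampering with even one byte makes verification fail. */
    unsigned char tampered[sizeof app];
    memcpy(tampered, app, sizeof app);
    tampered[0] ^= 0xff;
    if (crypto_sign_verify_detached(sig, tampered, sizeof tampered, pk) != 0)
        puts("tampered binary rejected: loader refuses to run it");

    return 0;
}
```

Built with `cc sign.c -lsodium`, the sketch prints both messages, showing why an attacker who cannot obtain a valid signature must instead find a place where the enforcement itself has a gap.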
"In iPhone 1.0, there was very little security built into it," Miller says. "But when they went to iPhone 2.0--less because they cared about people breaking into phones and more because they wanted to make sure that they wanted to have the App Store and not have people download all sorts of crazy apps--they added a bunch of security."
But Miller found more than one instance in which Apple failed to prevent unauthorized data from executing. This means that a program can be loaded into memory as a nonexecutable block of data, after which the attacker can essentially flip a programmatic switch and make the data executable.
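Miller's iPhone-specific code has not been published, but the pattern he describes can be sketched generically. The program below (a desktop Linux/x86-64 illustration using POSIX mmap and mprotect, not iPhone or ARM code) stages machine code in memory as ordinary read/write data and then flips the page's protection to executable; on a properly locked-down device, the code-signing policy is supposed to make that second step fail.

```c
/*
 * Generic illustration of the "flip a programmatic switch" pattern,
 * not Miller's actual exploit: stage machine code as plain data, then
 * remap the page executable with mprotect().
 */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    /* x86-64 machine code for: mov eax, 42; ret
       (architecture-specific; chosen for a desktop test). */
    unsigned char payload[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };

    size_t pagesize = (size_t)sysconf(_SC_PAGESIZE);

    /* Step 1: map a page read/write only -- the payload arrives as data. */
    void *buf = mmap(NULL, pagesize, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }
    memcpy(buf, payload, sizeof payload);

    /* Step 2: flip the protection bits so the same bytes become code.
       A strict W^X / code-signing policy should make this call fail. */
    if (mprotect(buf, pagesize, PROT_READ | PROT_EXEC) != 0) {
        perror("mprotect");   /* blocked: the policy held */
        return 1;
    }

    /* Step 3: run the formerly inert data. */
    int (*fn)(void) = (int (*)(void))buf;
    printf("payload returned %d\n", fn());

    munmap(buf, pagesize);
    return 0;
}
```

On an ordinary Linux desktop this prints "payload returned 42"; under a policy that forbids remapping writable memory as executable, the mprotect call is the step that fails, which is exactly the check Miller found Apple neglecting in a few places.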
The ability to run arbitrary code is significantly different from "jailbreaking" a phone, the term used when a phone's owner breaks the security locking the device to a particular provider or operating system; jailbreaking requires physical access to the device, Miller says. "Jailbreaking is, you have your own phone, you have it in your hand, and you want to do something to make sure you can put nonsigned code on it," he says. "You own the device, so you can do certain things to it."
In fact, at the CanSecWest conference in March, Miller, Alvarez, and other researchers realized that the attacks they had developed on jailbroken iPhones, which they had assumed would carry over to stock devices, did not work on regular (non-jailbroken) handsets.
"Basically, what happened was that everybody made the same mistake, and we all have learned from it," Recurity's Alvarez says. "We used jailbroken iPhones in order to be able to debug."
While the researchers could not come up with any legitimate uses for running unapproved code on the iPhone, Miller stresses that the research is valuable. Like nearly 40 million other people, he carries an iPhone containing work information, personal details, and family pictures. Knowing the limits of the device's security is important, he argues.
"The thing is, I'm pointing out exactly what bad guys can do against the device," he says. "They are likely doing parallel research, except they don't share their results. It is better for everyone to understand the strengths and weaknesses of the security of devices, and make informed decisions about what devices they should use and how they should use them, rather than having only the bad guys know how they work."
Of course, Apple may have already fixed the issue. Later this month, the company will release version 3.0 of the iPhone operating system, and Miller will have to make sure his attack still works.
"With iPhone 3.0 coming out, that might change a lot of this stuff," Miller says.