
Apple Demonstrates How Not To Support Self-Repair

Sigh. Don’t try this at home.

Apple has long been famous for its hostility to DIY.  From the beginning, long before the iPhone, Apple made it as hard as possible to develop your own software.  The so-called geniuses basically worked hard to prohibit anyone from changing or adding to—or competing with—their perfect product.  Sigh.

After 20 years of experience with laptops and servers, I found the iPhone, from the start, to be one of the most closed systems ever released.  You can’t even install your own software on your own phone without approval from Apple.  And you definitely can’t open up your phone, even to, say, change the battery.

As a software developer, this basically means I can’t really do much on Apple devices, at least not without negotiating with Apple.  And, no, I’m not interested in getting permission from Apple to create my own stuff.

This isn’t just Apple; everybody is doing it, and it’s not funny anymore.  Customers are starting to militate for more control over their own devices.  And some jurisdictions are starting to legally mandate a “right to repair”.  This falls far short of a truly open platform, but it’s a step.

Now, Apple is really, really good at design.  So they could design self-repair capabilities that would blow your socks off.  But instead, as Brian X. Chen reports, Apple seems to have bent its efforts toward a passive-aggressive, downright hostile process [1].

It would be really funny if it wasn’t so annoying.

Apparently, the “self-repair” option involves renting a room full of the specialized machinery that Apple shops use, so that you can attempt to follow the preposterously designed processes used by official Apple repair elves. 

(One thing we learn from this “self-repair” system is that Apple makes it difficult even for their own employees to repair things.)

Chen’s article walks through his own disaster.  All he was trying to do was replace the battery.  On normal hardware, this is considered a routine user task.  But iPhones are “special”.  Very, very special.

Among other things, you have to heat the phone to melt the glue that seals everything.  And, of course, there are special screws that have to be removed in the right order.

Naturally, he broke his phone.  Fortunately, he had expert help to replace the screen he broke.

When the phone eventually rebooted, it would not run because it detected “unknown” parts.  Now, these were all official Apple parts; they just weren’t installed by Apple.  So you have to contact Apple and jump through hoops to get your repaired phone re-authorized by Apple.
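To make that “unknown parts” behavior concrete, here is a hypothetical sketch of how serialized parts pairing can work.  To be clear, the names, the signing scheme, and the check are my illustration of the general idea, not Apple’s actual mechanism:

```python
import hmac
import hashlib

# Hypothetical vendor signing secret. A real scheme would likely use
# asymmetric signatures so the device only holds a public key; a shared
# HMAC secret keeps this sketch short.
VENDOR_KEY = b"vendor-signing-secret"

def sign_configuration(device_id: str, part_serials: list[str]) -> bytes:
    # The factory signs the exact set of part serials fitted to this device.
    message = (device_id + "|" + "|".join(sorted(part_serials))).encode()
    return hmac.new(VENDOR_KEY, message, hashlib.sha256).digest()

def boot_check(device_id: str, part_serials: list[str], factory_sig: bytes) -> str:
    # At boot, recompute the signature over the parts actually present.
    expected = sign_configuration(device_id, part_serials)
    if hmac.compare_digest(expected, factory_sig):
        return "OK"
    return "Unknown part detected"  # a genuine part, but not vendor-installed

sig = sign_configuration("device-123", ["battery-001", "screen-001"])
print(boot_check("device-123", ["battery-001", "screen-001"], sig))  # OK
print(boot_check("device-123", ["battery-002", "screen-001"], sig))  # Unknown part detected
```

The point of such a design is that possessing a genuine part is not enough: only the holder of the signing key (the vendor) can bless a new configuration.  Which is exactly why Chen had to go back to Apple to get his own repair approved.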

And, by the way, all this costs more than paying Apple to do it for you, even if you don’t buy the whole repair factory kit thing for thousands of dollars.

So, fix it yourself, if you dare!  Honestly, I wouldn’t recommend any normal person try this on their own phone, at least not on a phone they hope to keep using after they “fix” it.

It’s very clear that the design geniuses at Apple worked very hard to build systems that cannot be fiddled with.  I understand why they do this.  They want to prevent reverse engineering and hacking, and protect their “special” experience.  But this obsession with protecting Apple’s property rights has seriously bad side effects for users.

The current “self-repair” process is basically, “You rent an Apple shop, and learn to be an Apple repair person.  And then register your repairs with Apple.” 

This is a horrible user experience, and it only barely works at all.

Tsk.  I know you can do so much better if you wanted to, Apple.


  1. Brian X. Chen, “I Tried Apple’s Self-Repair Program With My iPhone. Disaster Ensued.” The New York Times, May 25, 2022. https://www.nytimes.com/2022/05/25/technology/personaltech/apple-repair-program-iphone.html

Apple Helps Evolve the NSA Narrative

Quite an interesting episode of the ongoing soap opera surrounding “privacy” in the age of ubiquitous internet-connected devices.  (It’s been quite a while since I blogged about the NSA’s narrative: “we are watching you”.)

In Apple’s otherwise horrible release of iOS 8, they tout their privacy features, most of which make me say “why wasn’t that done before?”  (Android will soon follow with the same kind of deployment.)  I don’t want to be negative: for their own self-preservation, Apple has done a really good job of paying attention.  All the most obvious stuff is covered.  (For more details, see the Apple white paper.)

This is better than before, but it would be a mistake to believe that the system is secure.  I mean, it’s a little computer in your pocket, connected (you never know exactly how) to the Internet.  And despite Apple’s highly authoritarian attitude about controlling apps and third parties, the fact is you have to be really careful what you do.

The biggest news was their splashy announcement that they “cannot access” your personal data, and therefore “it’s not technically feasible for us to respond to government warrants”.  As far as I can tell (and I’m no expert here), they basically encrypt the data with strong encryption and have no “back door” or master key to let them, or anyone else, break the crypto.  In other words, they have implemented actual encryption, rather than the fake encryption popular in the past.  What an amazing innovation!
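For the curious, here is a minimal sketch of that idea: derive the encryption key from the user’s passcode plus a device-unique salt, so there is simply no vendor-held key to hand over.  This is my illustration, not Apple’s implementation (their white paper describes a hardware-entangled scheme), and it assumes the third-party Python “cryptography” package:

```python
import os
from hashlib import pbkdf2_hmac
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    # Deliberately slow key derivation: every guess costs real compute.
    return pbkdf2_hmac("sha256", passcode.encode(), device_salt, 1_000_000)

def encrypt(passcode: str, device_salt: bytes, plaintext: bytes) -> bytes:
    key = derive_key(passcode, device_salt)
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt(passcode: str, device_salt: bytes, blob: bytes) -> bytes:
    key = derive_key(passcode, device_salt)
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)  # raises if passcode is wrong

# No master key anywhere: without the passcode (and the salt baked into
# the device), a warrant served on the vendor yields nothing usable.
salt = os.urandom(16)
blob = encrypt("correct horse battery", salt, b"my private photos")
assert decrypt("correct horse battery", salt, blob) == b"my private photos"
```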

Of course, this “innovation” is rather “disruptive” of one old-line industry: the police and national security sector.  Law enforcement has been very happy with the fact that people voluntarily carry around these highly capable data collection devices, which the police can use to identify and locate individuals of interest and amass dossiers about recent activities of many kinds: movements, contacts, transactions, and contraband.

The use of stronger encryption means that some of this information will be harder to get, and certainly will take a lot more effort and time, if the police have the resources to do it at all.  From the position of local police, Apple has resigned as an unofficial deputy for the PD.

The national security folks have the resources to attack these problems, but even they will have to work at it.  The NSA can no doubt crack a phone if needed, but life was so easy when the devices were easy to access!  And the rest of the system (the networks, the connection metadata, the cloud storage, etc.) is still accessible, just not your pix on your handheld.

For me, the interesting part has been the theater surrounding these fairly obvious technical matters.

Apple has put this forward with a splashy slap in the face of the US government and police forces.  This is widely recognized as a long-anticipated reaction to the Snowden affair.  (If so, he deserves a medal for instigating computer security improvements.)  In order to sell phones all around the world, Apple has put forward a narrative: “the NSA is watching you, but Apple is on your side”.

The US government helped the narrative along with condemnations from FBI director Comey, pointing out both the policy implications (there may be times you want the police to access data) and the sheer arrogance of Apple’s FU to the US government, when there are lots of bad guys out there.

The FBI was joined by local police chiefs (who surely will be inconvenienced).

All the jawing by the FBI and police has catapulted an otherwise obscure software update into the world media spotlight.  The US government is seen to cry, “Oh woe, Apple is screwing us.  We can’t spy on you any more.  This is terrible.”  Apple is seen to offer a heroic, groundbreaking product that is magically “secure” from the US government.

This is all a very subtle evolution of the NSA narrative: “we are watching you.”  If you follow this line (and the upcoming Google upgrades), you are playing into their plans: use the (American-made) Apple and Google “magic” and you will be safe.  You don’t need to fear the NSA anymore; just use a long passcode and everything will be fine.
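For what it’s worth, the “long passcode” advice does have real arithmetic behind it.  Apple’s white paper says each on-device passcode guess is calibrated to take roughly 80 milliseconds; granting that figure, a quick back-of-the-envelope:

```python
# Worst-case exhaustive search time, assuming ~80 ms per guess
# (Apple's stated on-device key-derivation calibration).
SECONDS_PER_GUESS = 0.080

def worst_case_years(alphabet_size: int, length: int) -> float:
    guesses = alphabet_size ** length
    return guesses * SECONDS_PER_GUESS / (3600 * 24 * 365)

print(worst_case_years(10, 4))   # 4-digit PIN: ~13 minutes
print(worst_case_years(10, 6))   # 6-digit PIN: ~22 hours
print(worst_case_years(62, 10))  # 10-char alphanumeric: ~2 billion years
```

Of course, that arithmetic only covers guessing the passcode on the device; it says nothing about the networks, metadata, and cloud copies noted above.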

(And by the way, NSA and FBI are certainly happy if these changes make life harder for Chinese and Russian hackers.)

What I’m saying is that this is nothing more than a tiny inconvenience to the NSA (though quite effective against teenagers and local police), but they have exploited it to increase public awareness of cyberdefence and also to make sure that bad guys know that they are being watched.  The new wrinkle is the implication that using the next releases of Apple and Google will “protect” you: false confidence can be more dangerous than global paranoia.

One last comment: aside from the rather unfair slap at the US government (what about China, Russia, and all the rest?), Apple’s narrative slapped rival companies and basically said “trust us”.  It was interesting to see Apple slap Google’s ubiquitous user tracking, with a claim that Apple would never do something like that.  On the same page, we see Apple’s financial, home, and health tracking stuff: hugely invasive forays into privacy.

Who will protect us from Apple (or Google or Amazon or Facebook or the rest)?  “Trust us” and anti-government rhetoric aren’t really enough.