The way it works is that Apple embedded a sensor under the phone's display. That might sound simple when summarized in one sentence, but in an interview with Bloomberg, Apple revealed just how hard it really was. The engineering itself was difficult enough, but the truly hard part was getting inside the heads of users.
According to Apple’s SVP of software engineering Craig Federighi, “You’re trying to read minds. And yet you have a user who might be using his thumb, his finger, might be emotional at the moment, might be walking, might be laying on the couch. These things don’t affect intent, but they do affect what a sensor sees.”
Federighi adds that the laws of physics, such as gravity, complicated things further. “We have to do sensor fusion with accelerometers to cancel out gravity—but when you turn [the device] a different way, we have to subtract out gravity. … Your thumb can read differently to the touch sensor than your finger would. That difference is important to understanding how to interpret the force.”
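To make the "subtract out gravity" idea concrete, here is a minimal sketch of one common approach (not Apple's actual implementation, which isn't public): estimate the gravity vector from raw accelerometer samples with a low-pass filter, then subtract it to leave only the acceleration the user caused. The function name and smoothing constant below are illustrative assumptions.

```python
ALPHA = 0.8  # assumed low-pass smoothing factor; higher = slower gravity estimate

def split_gravity(samples):
    """Given raw (x, y, z) accelerometer samples in m/s^2, return
    linear-acceleration vectors with the estimated gravity removed."""
    gravity = [0.0, 0.0, 0.0]
    linear = []
    for sample in samples:
        # Low-pass filter: isolate the slowly changing gravity component,
        # which tracks the device's orientation as the user turns it.
        gravity = [ALPHA * g + (1 - ALPHA) * s for g, s in zip(gravity, sample)]
        # Subtract it to leave the fast-changing, user-caused motion.
        linear.append([s - g for s, g in zip(sample, gravity)])
    return linear

# With the device lying flat and still, raw readings are roughly (0, 0, 9.81);
# once the gravity estimate converges, the residual motion trends toward zero.
readings = [(0.0, 0.0, 9.81)] * 50
residual = split_gravity(readings)[-1]
print(residual)
```

The point of the filter is exactly what Federighi describes: gravity shows up in the sensor data whichever way the device is held, so it has to be estimated and removed before the remaining signal can be read as intent.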
The report is a pretty insightful look at how Apple thinks and how the feature came to be, so if you’d like to learn more, head over to Bloomberg for the full interview.