What does the iPhone X mean for mobile banking?

3rd November 2017

It’s been 10 years since the first iPhone came out — and now there’s a lot of hype about the new iPhone X, released today. One of our iOS developers, Kris, was lucky enough to get his hands on the phone; he tells us about his experience and how he thinks it’ll impact mobile banking.

Hi Kris, tell us a bit about your background and what you do at Starling.

I started as an ad designer in the late 90s for one of Slovakia’s national newspapers. In the early 2000s, I did some PC assembly and network administration and support for clients, both on-site and remotely. The experience of supporting networks across the country, and some of the difficulties I faced while doing that, eventually led me to set up my own software startup. I wanted to change and substantially improve the user experience of how we, at the time, deployed Windows on a larger scale. I think we succeeded; we made the tedious manual process fully automated and hassle-free. As the founder of a small shop, I had to do everything from UI concepts to marketing and legal. The only thing I was not capable of doing was programming, which felt quite frustrating as I really wanted to take part in the creation of the product itself. That’s when I decided it was time to fill this gap, and I started to learn C#.

When the iPhone came out, I realised the desktop paradigm was pretty much dead and there was a mobile revolution in progress. For years I was quite frustrated with Microsoft’s approach to software design and quality. I’ve always had strong opinions on what constitutes a good design and user experience, and switching to Apple’s iOS seemed like the logical and inevitable step. I wanted to write my own apps and deliver really good UX to customers. To this day, this is the most fulfilling part of my daily life as a software engineer.

After moving to London, I had a number of programming jobs where I progressively improved my skills. Eventually I joined Starling where, besides my core competency of programming, I am able (and expected) to influence how the product looks and works.

When it comes to Starling, the only thing customers see is the app and there is no fallback medium or channel to use, so good UX is of critical importance. In some ways, I consider design superior to programming, even though programming is a fascinating world of its own with its never-ending architectural considerations and sometimes strange abstractions.

My ultimate aim is to make the design team happy by delivering exactly what they asked for. Not doing so is often a source of tension between development and design teams, and it diminishes the value they create by meticulously obsessing over every pixel. My experience helps me better interpret their specifications and expectations. I also try to give them detailed feedback about technological options and platform limitations so they better understand what is possible. Last but not least, I interact with my fellow iOS dev colleagues; I’d like to take this opportunity to thank them for tolerating my occasional rants about UX and implicitly unwrapped optionals.

You’ve experienced the iPhone X first-hand — talk us through some of the major changes that stand out to you.

Let’s start with the most obvious change: the screen. One can immediately see the difference, as it is no longer rectangular. Besides the new shape, it now uses an OLED display, so it can finally show a perfect black colour. It is a true 3x Super Retina HD display, which means that, unlike the Plus-designated iPhones, it does not downscale the screen resolution from an internal 3x retina to a smaller Full HD before the device renders the image on screen, so everything is super sharp. It can also display a wider spectrum of colours, reaching higher saturation.

The experience was, in a way, shocking. It felt like I was not even looking at a screen (i.e. images under glass); instead, what I saw was the raw, physical material itself in its native colour.

The home button (and TouchID with it) has been removed, so now all interactions are done with gestures, which will take some getting used to; most probably no more than a week or two, though.

Another important change is the CPU. It now has a separate, dedicated section called the Neural Engine. The job of this chip is to analyse the signal coming from the TrueDepth camera and illumination system, housed in the controversial notch at the top of the phone. Some people might remember Microsoft’s Kinect; Apple bought the original company behind Kinect, and this is basically a much-improved, miniaturised version of it in a phone.

The Neural Engine analyses the signals coming from the TrueDepth camera to create a 3D map of your face, which is then used as an authentication mechanism called FaceID. Running neural network algorithms directly on the device, on very fast dedicated hardware, without having to send anything to the cloud, helps maintain your privacy while doing this at very high speed, independently of network connection speeds, which vary greatly on mobile. If it’s you, iOS will simply unlock itself. But we’ll talk about FaceID a bit later in more detail.

The last thing, kind of an honorary mention, is the glass back. Besides being an obvious aesthetic change, it allows electromagnetic fields to reach the induction coil inside the phone, which is the basis of wireless charging.

What do you think is the biggest impact on mobile banking with this new release?

FaceID might seem like the obvious low-hanging fruit, but the paradox here is that, given its implementation, its impact on mobile banking will be very small.

The Neural Engine with the TrueDepth camera system and, on top of that, the specialised software framework that handled TouchID in the past (and now hides FaceID behind the same interface) will do all the heavy lifting for you.

The challenge for us at Starling was not some kind of rewrite to support new hardware. The only new thing we needed to do was identify that the device is an iPhone X and adjust the copy on some screens. A slight complication is that you can now turn FaceID off globally in the phone’s settings. This is new in iOS 11 and it affects some of our user flows. Otherwise, it supports our app’s security model just fine.
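For the curious, here is a minimal sketch (not Starling’s actual code) of what that looks like in practice: the LocalAuthentication framework’s LAContext covers both TouchID and FaceID with the same call, and the biometryType property introduced in iOS 11 is enough to adjust the on-screen copy. The function name and prompt strings below are purely illustrative.

```swift
import Foundation
import LocalAuthentication

// A hypothetical login helper: the same biometric policy covers TouchID and FaceID;
// only the wording shown to the customer changes depending on the hardware.
func authenticateWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // If biometrics are unavailable or switched off in Settings, fall back to another flow.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    // biometryType is only valid after canEvaluatePolicy has been called (iOS 11+).
    let prompt = context.biometryType == .faceID
        ? "Log in with Face ID"
        : "Log in with Touch ID"

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: prompt) { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```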

There is a new API in iOS 11 called CoreML, which lets apps run machine learning models on-device, for example to identify content in images. Another new API called ARKit uses 3D data from the TrueDepth camera (to put, for example, a pumpkin on your head and move it in perfect sync as you look around, which is the latest craze with the likes of Facebook Messenger, etc.). But the functions of the Neural Engine itself are not exposed directly via public APIs.
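As a rough illustration of what CoreML (together with the Vision framework) opens up, here is a sketch of classifying an image entirely on-device. The MobileNet model is a placeholder for whichever .mlmodel an app might bundle; everything else is the standard Vision API.

```swift
import UIKit
import Vision
import CoreML

// Classify the contents of a UIImage with a bundled Core ML model.
// "MobileNet" is a stand-in: Xcode generates this class for any .mlmodel added to the project.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top observation is the model's best guess at what the image shows.
        if let best = (request.results as? [VNClassificationObservation])?.first {
            print("\(best.identifier) – confidence \(best.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```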

Here’s an example: I still use my Slovakian business bank account. I have a premium business account, which means I can call my personal banker directly from the mobile banking app. In the future, if we had some kind of more open API from the Neural Engine, I can imagine a customer talking to a customer service agent while the bank’s software did the facial recognition in the background and communicated with the bank’s APIs. We would be able to get away from those silly questions, like your mother’s maiden name, or that one secret answer you can never remember. The Neural Engine might, perhaps, identify scanned cheques, which are still in use in the UK, as valid. Who knows where else this could go in the future; there could be a whole identity framework built around this, and that’s very powerful.

Apple tends to keep core hardware technologies inaccessible to third-party software and only exposes them after a few years. This was the case with Siri and NFC as well.

Do you think FaceID will become the norm?

Definitely. It’s an invisible system, very convenient and effortless; it just works. Apple seem to have a core technology hit once in a while. They really take their time, and only when it satisfies their high UX standards do they provide it to customers. The funny part is that these novel approaches are always questioned by pundits, but one year later, everyone is doing it.

A 64-bit CPU was a novel idea in mobile three or four years ago, and one or two years later everyone had it. TouchID, same story. Others had sensors you had to swipe, and those solutions had 60-70% accuracy at best, but Apple were the first to implement it in a way that feels like it could not be any other way.

With FaceID, think back to some sci-fi films set in 2024 (not that far away now) where the character had to put their eye (it was always one eye) into some scanner on a wall, which then scanned back and forth and made funny whoosh-whoosh sounds. But that’s not actually good design.

When I tested FaceID at Apple myself, it just worked. It was very impressive; it looks like they’ve nailed a core technology improvement once again. It even feels more seamless than TouchID, because the mechanism doesn’t need you to do anything at all. Unlike TouchID, where a deliberate action with your finger is necessary, FaceID simply tracks your eye movement, knows whether you are looking at the phone or not, and then acts accordingly.

Are there any other impacts specifically for Starling?

Security is very important to us at Starling, so it was a priority to ensure we tested this new authentication feature. We tested all the user flows with the new hardware and were able to make the necessary changes, ready for the day-one release of the phone. Everything must remain straightforward for our customers. Other than that, I don’t think so.

What are the implications for designing and building for iPhone X — will it be a lot more work?

This actually relates to your previous question about impact. Strictly speaking, we do not have to do a lot of additional work specific to the iPhone X. We just have to ensure the app’s content reaches those special non-rectangular areas only in very specific cases. However, I see the new shape as a second opportunity for designers to rethink how we use the bigger real estate of larger phones like the Plus and the X.
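In practice, keeping content clear of the notch and the home indicator mostly comes down to iOS 11’s safe area layout guides. Here is a minimal, hypothetical sketch; the view controller and label are illustrative, not taken from the Starling app.

```swift
import UIKit

// Pin content to the safe area so it never collides with the notch,
// rounded corners or home indicator on iPhone X (iOS 11+).
final class BalanceViewController: UIViewController {
    private let balanceLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        balanceLabel.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(balanceLabel)

        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            balanceLabel.topAnchor.constraint(equalTo: guide.topAnchor, constant: 16),
            balanceLabel.leadingAnchor.constraint(equalTo: guide.leadingAnchor, constant: 16),
            balanceLabel.trailingAnchor.constraint(equalTo: guide.trailingAnchor, constant: -16)
        ])
    }
}
```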

At the moment, iOS app designs in the industry tend to be optimised for the smaller 4.7-inch phone rather than the 5.5-inch form factor. When you have a menu of five icons, how do you enlarge it for a bigger screen? Do you create more space in between, or perhaps add a sixth one? I’m not saying we should do that, but we do not even try to consider screen-specific behaviours. It did not work out with the Plus 5.5-inch phones; designers tend to keep custom element sizes the same and just add more white space. I question that approach.

Banks are treasure troves of insights about customers. Let’s take this opportunity of having new, larger screen real estate and give customers a more holistic, multidimensional overview of their financial life, using visualisations like those in one of my favourite books, Information is Beautiful by David McCandless.

When it came to testing, how did you do that?

To support development on the iPhone X, we needed a new version of Xcode and a simulator that represented a virtual iPhone X. But we could not simulate FaceID states until Xcode 9.1, which was only recently released. We still did not want to risk relying purely on a simulator, as there can be subtle differences between TouchID and FaceID that we were not sure about at the time. We needed to ensure our user flows stayed seamless, so we visited Apple. The only reliable way to make sure your app works is to try it on an actual device.

We initially wanted to see what the implications were from a programming perspective and then validate the solution from a product perspective, so the product wouldn’t be compromised in any way.

Do you think banks can ever make use of the augmented reality feature?

Augmented reality means you are looking at your screen and whatever is picked up by the back camera is fed to your display in real time; the augmented part is that the device superimposes additional content onto the camera feed, and these objects maintain the proper position, size and other properties, as if they were part of the real world behind the camera.


Why would a bank need this? Wouldn’t it just be a gimmick? At the moment we are doing the opposite: banks are getting rid of branches and physical presence. The core banking proposition probably doesn’t need this, but Starling has a Marketplace, so third parties dealing with more niche offerings might use it to enrich your banking experience. For example, pointing at a product in a store might surface some analytical knowledge (a better price, or a finely tuned loan with a specific merchant) that you could provide only if you are a bank, or if you are integrated with a bank.
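To make the idea concrete, here is a hypothetical sketch of the kind of ARKit integration a Marketplace partner could build on: world tracking with plane detection, plus a tap that anchors a marker (where an offer or price overlay might sit) onto a real surface. All the names are illustrative.

```swift
import UIKit
import ARKit
import SceneKit

final class OfferOverlayViewController: UIViewController {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // World tracking keeps virtual content locked to the real scene (iOS 11+).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeMarker(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc private func placeMarker(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test against detected planes to find a real-world position for the marker.
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

        let marker = SCNNode(geometry: SCNSphere(radius: 0.02))
        marker.position = SCNVector3(result.worldTransform.columns.3.x,
                                     result.worldTransform.columns.3.y,
                                     result.worldTransform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(marker)
    }
}
```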

What’s the most exciting thing about the changes?

This device is amazing and there’s so much new technology packed together. The bar has been raised again, and I hope that will translate into the quality of apps, too. But there’s no guarantee things will evolve into some new design approach within a month. We have to wait and see how the design and developer community embraces it and provides better value for customers.

So in one sentence, what does iPhone X mean for you working in mobile banking?

Besides having to test on yet another device, it’s a solid opportunity to rethink and improve the ways we present rich sets of information, which, I’ve just realised, is literally to allow our customers to “see their money in a new way”.
