This story is part of our full coverage of the latest Apple news.
If Apple made one thing clear on Tuesday, it's that the cameras and processors in these new phones are a big deal. Apple's keynote even included a short film by Oscar-winning director Kathryn Bigelow and cinematographer Greig Fraser that was shot on the new iPhone.
It's clear that Apple is marketing the new iPhones to photographers and videographers. But the advances make me wonder how Apple could be paving the way for something bigger. It may be laying the groundwork for what is expected to be Apple's next big thing: an augmented reality headset.
Updates such as better cameras, more powerful processors, and additional storage options are typical of new phones. But it's the way these additions come together with other iPhone updates from the past two years that suggests Apple is setting up the iPhone to be an augmented reality hub.
The iPhone may one day be the brains of Apple's rumored AR and VR headsets
For years there have been rumors that Apple could be working on smart glasses that provide virtual and augmented reality experiences. But unlike most of Apple's products, which are usually leaked in detail long before their release, the reports have painted a mixed picture so far.
Bloomberg reported last January that Apple is developing an all-in-one AR and VR headset aimed at developers, powered by its most powerful chips. That headset would serve as a precursor to a sleeker pair of AR glasses, according to the report.
But a recent story from The Information, published a few days before the iPhone event, describes something completely different. It suggests that Apple's headset will run on a less powerful chip and will therefore need to be tethered to a host device such as the iPhone.
If The Information's report turns out to be correct, the iPhone 13 Pro certainly looks like a capable host for that kind of wearable. Apple calls the version of its A15 Bionic chip found in the iPhone 13 Pro, which has five graphics processing cores instead of the four in the standard iPhone 13, the fastest chip ever in a smartphone. Apple is pitching this phone to photo and video editors, but better graphics performance would also benefit augmented and virtual reality apps.
Beyond processing power, Apple's new iPhones also get longer battery life and more storage, making the iPhone 13 Pro the first iPhone available with 1TB of storage. Again, these upgrades would probably be necessary if AR apps become more popular, and Apple certainly hopes they will be. It makes me think Apple could be future-proofing these iPhones for a scenario in which we all use AR or VR apps on our phones almost daily. Or for when the rumored Apple headset actually exists.
Little by little, the iPhone is becoming better equipped for AR
These updates alone don't suggest anything significant about Apple's ambitions for future products. But as we've seen in recent years, Apple has clearly been making the iPhone much better at powering AR experiences that are meant to live on the phone.
Apple is positioning the iPhone 13's new cameras – including a new Cinematic mode that automatically shifts focus between subjects – as ideal for media professionals. And Apple is probably right that this is the most meaningful way to use these fancy cameras in the short term. But I can imagine that cameras capable of focusing on subjects more quickly and accurately would also be very useful for AR apps, even though Apple didn't focus on AR during its event.
In addition to updating the iPhone's cameras, Apple has equipped its devices with sensors that give them a better sense of their surroundings. That's key for a technology like AR, which needs to accurately detect real-world objects in order to work.
The biggest clue came last year, when Apple added a lidar scanner to the iPhone 12 Pro – a sensor that detects depth by measuring the time it takes light to bounce back from an object. Apple hasn't been subtle about how lidar can enhance augmented reality on the iPhone; it highlighted AR as a key reason for putting lidar in the iPhone in the first place. I can imagine that the upgraded cameras on the iPhone 13 Pro, combined with lidar, could allow it to run some powerful AR applications.
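The time-of-flight idea behind lidar is simple enough to sketch in a few lines. This is only an illustration of the principle, not Apple's actual implementation; the 5-meter figure reflects the range Apple has quoted for its scanner.

```python
# Rough sketch of the time-of-flight math behind a lidar depth sensor.
# Illustration of the principle only -- not Apple's implementation.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in a vacuum

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way distance
    is half the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after roughly 33 nanoseconds corresponds to an
# object about 5 meters away -- near the limit of Apple's quoted range.
print(round(depth_from_round_trip(33.36e-9), 2))
```

The striking part is the timescale: resolving depth at centimeter precision means timing light pulses to within a fraction of a nanosecond.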
A year earlier, Apple put an ultra wideband chip in an iPhone for the first time. The iPhone 11 introduced Apple's U1 chip, which allows for much more precise location tracking indoors than GPS. Right now, the iPhone's ultra wideband technology is primarily used to improve AirDrop and to find lost items through AirTags.
Still, it's another example of how the iPhone is becoming spatially aware, which could have a lot of potential for future AR applications. AirTags already provide an early indication of how this technology could be used in AR apps.
A feature called Precision Finding, for example, shows directions on the iPhone's screen that lead you to a lost AirTag. It's easy to imagine how that could translate into future AR apps that superimpose directions onto the real world rather than onto your iPhone's screen.
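To get a feel for what an arrow-to-the-item overlay involves, here is a toy sketch of the underlying geometry. It assumes (hypothetically – this is not Apple's API) that ultra wideband ranging gives the app the tag's offset from the phone in meters; from that, the app derives the distance to show and how far the user should turn.

```python
import math

# Toy sketch of the math behind a "point me to the lost item" overlay.
# Hypothetical example, not Apple's Precision Finding implementation:
# we assume UWB ranging yields the tag's east/north offset in meters.

def arrow_to_tag(offset_east_m: float, offset_north_m: float,
                 phone_heading_deg: float) -> tuple[float, float]:
    """Return (distance in meters, relative bearing in degrees).

    Relative bearing is how far the user should turn from the phone's
    current compass heading to face the tag (positive = clockwise).
    """
    distance = math.hypot(offset_east_m, offset_north_m)
    # Compass-style bearing: 0 deg = north, 90 deg = east.
    bearing_deg = math.degrees(math.atan2(offset_east_m, offset_north_m))
    # Normalize the turn into the range (-180, 180].
    relative = (bearing_deg - phone_heading_deg + 180) % 360 - 180
    return distance, relative

# A tag 3 m due east while the phone faces north:
# the overlay should say "3.0 m, turn 90 degrees right".
dist, turn = arrow_to_tag(3.0, 0.0, 0.0)
print(round(dist, 1), round(turn, 1))
```

An AR version of this would render the arrow over the camera feed instead of on a flat screen, but the distance-and-bearing math stays the same.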
And if that weren't enough, Apple's upcoming iOS 15 also comes loaded with similar features, as my colleague Scott Stein points out.
Did Apple give the iPhone 13 a better camera, more processing power, and longer battery life just for virtual and augmented reality applications? No. These updates are useful for all smartphone users, even those who don't shoot movies on their iPhone and primarily use the device to take photos of their pets and read the news. But if you consider these updates in the context of how the iPhone has evolved over the past few years, there certainly seem to be bigger possibilities ahead.