iPhone LIDAR seems like overkill. Yet it's the future of VR


Column For the past six months I've been staring at the backside of my iPhone 13 Pro wondering what possessed Apple to build a Light Detection and Ranging (LIDAR) camera into its flagship smartphone.

It's not as though you need a time-of-flight depth camera, sensitive enough to chart the reflection time of individual photons, to create a great portrait. That's like swatting a fly with a flamethrower – fun, but ridiculous overkill. There are more than enough cameras on the back of my mobile to be able to map the depth of a scene – that's how Google does it on its Pixel phones. So what is Apple's intention here? Why go to all this trouble?
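
To put rough numbers on the difference – an illustrative sketch with made-up figures, not Apple's or Google's actual pipelines – a time-of-flight sensor reads depth straight off a photon's round trip, while a multi-camera phone has to infer it from the disparity between two views:

```swift
// Time of flight: depth falls straight out of the photon's round-trip time.
// d = (c * t) / 2  -- halved because the light travels out and back.
func timeOfFlightDepth(roundTripSeconds: Double) -> Double {
    let c = 299_792_458.0  // speed of light, m/s
    return c * roundTripSeconds / 2
}

// Stereo: depth is inferred from how far a feature shifts between two views.
// Z = (f * B) / disparity, with focal length and disparity in pixels.
func stereoDepth(focalLengthPixels: Double, baselineMeters: Double,
                 disparityPixels: Double) -> Double {
    return focalLengthPixels * baselineMeters / disparityPixels
}

// A subject two meters away: the LIDAR measures ~13 nanoseconds of flight...
print(timeOfFlightDepth(roundTripSeconds: 13.3e-9))            // ≈ 2.0 m
// ...while two lenses ~1cm apart must resolve a ~14-pixel shift.
print(stereoDepth(focalLengthPixels: 2800, baselineMeters: 0.01,
                  disparityPixels: 14))                        // ≈ 2.0 m
```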

The answer lies beyond the iPhone, and points to what comes next.

In the earliest days of virtual reality, thirty years ago, the biggest barrier to entry was the compute capacity necessary to render real-time three-dimensional graphics. Back in 1992, systems capable of real-time 3D looked like supercomputers and cost hundreds of thousands of dollars.

Then two British software engineers – one at Canon Research, another at startup RenderMorphics, both raised on the famed BBC Micro and its outré graphics capabilities – created tight, highly performant libraries to do real-time 3D rendering in software. When the first announcement of Canon's RenderWare made it onto the Usenet newsgroup sci.virtual-worlds, it was greeted with disbelief and disdain. Someone even quipped that a Canon researcher must have accidentally stayed logged in over the weekend, letting a prankster send off an obviously fake post.

But RenderWare was real. Along with RenderMorphics' Reality Lab (which hundreds of millions use today under its other name: Direct3D) it transformed the entire landscape of real-time 3D graphics. No-one needed a half-million-dollar Silicon Graphics workstation for virtual reality anymore – a body blow from which the firm never recovered. Reflecting on SGI's unexpected collapse, one of my colleagues – who'd seen the future coming – delivered a quick eulogy: "Rendering happens," he said, "get used to it."

Yet it took virtual reality twenty years to catch up to the quantum leap in real-time 3D, because virtual reality is more than just drawing pretty pictures at thirty frames a second. It deeply involves the body – head and hand tracking are table stakes for any VR system. Tracking the body thirty years ago required expensive and fiddly sensors moving within a magnetic field. (For that reason, installing VR tracking systems in a building with a lot of metal components – such as a convention center held up by steel beams – was always a nightmare.)

An obvious solution for tracking was to point a camera at a person, then use computer vision techniques to calculate the orientation and position of the various body parts. While that sounds straightforward, computers in the 1990s were about a hundred times too slow to take on that task. Fortunately, by the mid-2010s Moore's Law had given us computers a thousand times faster – more than enough horsepower to track a body, with plenty left over to run a decent VR simulation.
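
A back-of-the-envelope check on that claim – assuming the textbook doubling of compute roughly every two years, purely as illustration – bears the arithmetic out:

```swift
import Foundation

// Back-of-the-envelope Moore's Law arithmetic (assumed figures, not a benchmark):
// if compute doubles roughly every two years, twenty years of doublings turn a
// machine that was ~100x too slow for body tracking into one with room to spare.
let doublingPeriodYears = 2.0   // assumed doubling cadence
let elapsedYears = 20.0         // roughly mid-1990s to mid-2010s
let speedup = pow(2.0, elapsedYears / doublingPeriodYears)

let shortfall = 100.0           // "about a hundred times too slow"
print("Speedup after \(Int(elapsedYears)) years: ~\(Int(speedup))x")      // ~1024x
print("Headroom left for the VR simulation: ~\(Int(speedup / shortfall))x") // ~10x
```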

That's why I found myself in Intel's private demo suite at the 2017 Consumer Electronics Show in Las Vegas, wearing what was effectively a PC strapped to my forehead. This VR system had a pair of outward-facing cameras that digested a continuous stream of video data, using it to track the position and orientation of my head as I moved through a virtual world – and through the demo suite. Although not yet quite perfect, the device proved that a PC had more than enough horsepower to enable sourceless, self-contained tracking. I emerged from that demo convinced that I'd seen the next great leap forward in virtual reality, which I summarized in two words: "Tracking happens."
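
Strip away the cameras and the computer vision, and sourceless tracking reduces to folding a stream of per-frame motion estimates into a running pose. A minimal sketch of just that accumulation step – the Pose type and apply helper are hypothetical names of my own, and the per-frame deltas are assumed to come from a vision front end like the one in Intel's rig:

```swift
import simd

// A 6-DoF pose: where the headset is, and which way it's facing.
struct Pose {
    var position: SIMD3<Float> = .zero
    var orientation = simd_quatf(angle: 0, axis: SIMD3<Float>(0, 1, 0))

    // Fold in the relative motion estimated between two camera frames.
    // The translation arrives in the headset's local frame, so it must be
    // rotated into world coordinates before being accumulated.
    mutating func apply(deltaPosition: SIMD3<Float>, deltaOrientation: simd_quatf) {
        position += orientation.act(deltaPosition)
        orientation = simd_normalize(orientation * deltaOrientation)
    }
}

// Illustrative use: two small steps forward with a slight turn between them.
var head = Pose()
let slightTurn = simd_quatf(angle: .pi / 18, axis: SIMD3<Float>(0, 1, 0)) // 10°
head.apply(deltaPosition: SIMD3<Float>(0, 0, -0.1), deltaOrientation: slightTurn)
head.apply(deltaPosition: SIMD3<Float>(0, 0, -0.1), deltaOrientation: slightTurn)
print(head.position) // drifts forward along the turned heading
```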

Half a decade later, with multiple trillion-dollar companies working hard on augmented reality spectacles, we're ready to breach the next barrier. Yes, we can render any object in real time, and yes, we can track our heads and hands and bodies. But what about the world? It needs to be seen, interpreted and understood in order to be meaningfully incorporated into augmented reality. Otherwise, the augmented and the real will interpenetrate in ways reminiscent of a bad transporter accident from Star Trek.

For the computer to see the world, it must be able to capture the world. This has always been hard and expensive. It requires supercomputer-class capabilities, and sensors that cost tens of thousands of dollars … Wait a minute. This is sounding oddly familiar, isn't it?

Until just two years ago, LIDAR systems cost hundreds to thousands of dollars. Then Apple added a LIDAR camera to the back of its iPad Pro and iPhone 12 Pro. Suddenly a technology that had been rare and expensive became cheap and almost commonplace. The component cost for LIDAR suddenly dropped by two orders of magnitude – from hundreds of dollars per unit to a few dollars apiece.

Apple needed to do this because the company's much-rumored AR spectacles will necessarily sport several LIDAR cameras, feeding their M1-class SoC with a continuous stream of depth data so that the mixed reality environment managed by the device maps neatly and precisely onto the real world. As far as Apple is concerned, the LIDAR on my iPhone doesn't need to do much beyond drive component costs down for its next generation of hardware devices.
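
You can already see the shape of that pipeline on today's hardware. Here's a minimal sketch of pulling the LIDAR depth stream through ARKit – assuming an iOS app target with camera permission, with the surrounding app wiring elided:

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth is only offered on LIDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("No LIDAR depth on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; the LIDAR depth map rides along with it.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of 32-bit floats, in meters.
        let map = depth.depthMap
        print("Depth frame: \(CVPixelBufferGetWidth(map))x\(CVPixelBufferGetHeight(map))")
    }
}
```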

Capturing the real world is essential for augmented reality. We can't augment the real world until we've mapped it. That has always been both difficult and expensive. Today, I can look at the back of my iPhone and hear it whisper words I've long waited to hear: "Capture happens." ®
