
NeverComments t1_iz02vjl wrote

>the apps device will probably be twice the price or more with better specs

If the specs we've heard are accurate, it's looking closer to ten times the price. It's primarily an AR/MR device, so we're looking at an extremely high resolution screen, pancake lenses, eye tracking, and a handful of high-res cameras with LiDAR companions, all powered by an M-series chip. The closest competitor is Meta's Quest Pro, which has a lower resolution panel, one color camera, no LiDAR, and comes in at a $1.5k MSRP. An optimistic price point for Apple's headset would be $2.5k, but I think it'll end up being a $2999 MSRP (intentionally pricing out Average Joe for this first iteration).

4

Heliosvector t1_iz03pu3 wrote

What does it need lidar for??

1

Bobbyanalogpdx t1_iz04hcx wrote

LiDAR would help place objects more accurately in an AR/MR setting

3

NeverComments t1_iz04sea wrote

Depth sensing is used to properly spatialize digital content for AR. You can try and parse depth information using raw camera imagery and ML but it's...not great. With the Quest Pro you need to manually tell the headset where your walls are while Apple's ARKit can use LiDAR to automatically map out your floor plan.
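To make "spatialize" concrete: once you have a per-pixel depth reading, you can back-project a pixel into a 3D point with the standard pinhole camera model and anchor content there. A minimal sketch (the intrinsics `fx, fy, cx, cy` here are made-up example values, not Apple's):

```python
import numpy as np

def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a measured depth into a 3D
    point in camera coordinates using the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical intrinsics for a 640x480 depth map
fx = fy = 525.0
cx, cy = 320.0, 240.0

# A 2.0 m depth reading at the image center lands directly
# in front of the camera:
point = unproject(320, 240, 2.0, fx, fy, cx, cy)
print(point)  # [0. 0. 2.]
```

With camera-only ML depth estimation, `depth_m` is a guess and anchors drift; with LiDAR it's a direct measurement, which is why ARKit can build a floor plan without you tracing the walls by hand.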

2

Heliosvector t1_iz06ptb wrote

Is it similar tech to what they've been using for face unlock on phones?

1

NeverComments t1_iz0c7zm wrote

They're conceptually similar but measuring different things. The dot projector in Face ID acts as a sort of guide: the dots are projected in a grid, and you can use the distortion of the dots on the projected surface to interpret the shape of the user's face. The LiDAR sensor measures time of flight, which lets it determine the specific distance of objects relative to the sensor. The sensor used for Face ID can tell you that it's detected an object, but LiDAR can tell you exactly how far away it is. That property makes LiDAR extremely useful for AR, where you need to know how far away a given surface is in order to render something at the appropriate size with the correct perspective distortion applied.
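The time-of-flight math itself is tiny: a light pulse travels out and back, so distance is half the round trip times the speed of light, and perspective scaling then follows from that distance. A quick sketch (the focal length in pixels is an illustrative value, not a real sensor spec):

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_s):
    """Distance from a time-of-flight measurement: the pulse travels
    out and back, so halve the round-trip time."""
    return C * round_trip_s / 2

def apparent_size_px(real_size_m, distance_m, focal_px):
    """Pinhole-model projected size: objects shrink in proportion
    to distance, which is why AR rendering needs the depth."""
    return real_size_m * focal_px / distance_m

# A ~13.3 ns round trip corresponds to an object ~2 m away
d = tof_distance(13.342e-9)
print(round(d, 3))  # 2.0

# A 1 m object at 2 m with a 500 px focal length spans 250 px
print(apparent_size_px(1.0, d, 500.0))
```

Nanosecond-scale timing is the hard part in hardware; the dot-projector approach sidesteps it by measuring pattern distortion instead, which is why it gives shape but not absolute range.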

1

allinbbbyfortendies t1_iz24vgk wrote

Close-range room mapping. I actually bought an iPhone simply because it was cheaper than the similar units that were available at the time.

Units that had no hardware other than the LiDAR, not even onboard computation to parse the inputs, were thousands of dollars (I haven't looked recently).

Anyway, I first bought just the Apple sensor module, since it's obscenely cheap as a replacement part, but for the life of me I could not reverse engineer any usable data from the device.

I ended up embedding an entire iPhone into the robot I was working on. It was literally thousands cheaper than buying one otherwise

1