
Yes. The machines used to render that demo were, in the author's own words, not the full Magic Leap hardware.

edit: Also, even if that demo was running on the advertised Magic Leap hardware, it still only responded to camera movement, and Miller said the demo had capabilities that he declined to actually demonstrate.





"The level of detail was impressive. I wouldn't mistake her for a real person"

"I noticed that when I moved or looked around, her eyes tracked mine. The cameras inside the Lightwear were feeding her data so she could maintain eye contact." Yes, it is possible that the demo changed behavior based on eye movement alone, but that is not what the author said.

Lightwear is the headset component. It is not functional without a separate computer. The author says he only clearly saw the full, multi-piece ensemble of the advertised prototype Magic Leap hardware later, in a different room from the demos. My information comes from the literal wording and structure of the article. To make the point you are trying to make, one must add words and meaning that are not in the article. Continue insulting my literacy.

edit: You are right about one thing: my assertion that the demo in question relied on camera motion may not be correct. Eye tracking on commodity hardware with a single camera has been a solved problem for years.



