This could be awesome. I think a big problem with web meetings is that you can't look someone in the eye and simultaneously look like you are (because you'd have to look at your camera).
I'll be curious how this works on the fringes. What if she turned her head to a 30° angle? It's effectively a live deepfake, and you see where those start to get jittery or buggy on occasion. It could make everyone look like they're in a bad Matrix knockoff.
It's not perfect in the demo, and there will surely be edge cases, but this is a great problem to solve with ML. Video conferencing tools would definitely integrate this, and I think that's what Nvidia is hoping for.
Definitely agree. I hope this makes it through to being a really effective feature.
I suspect the subtlety of eye contact is underappreciated when talking about the effectiveness of online meetings versus in-person ones. Whatever can be done to make online meetings higher fidelity will only help.
Here's what I do: look at the camera instead of the screen. I bet the other person feels I'm paying them more attention than most people over video.
They sure would, but there are genuine circumstances where one needs to read something from the screen, and this would help there.