What's it going to take to have touchless gesture control? I'm tired of talking to bots.
I'll just leave this here:
Pair that with Maker API and put some of the controllers around the house and you have gesture control.
Good start, but proximity is kinda limited. I'm thinking along the lines of a cam/RPi setup that processes broad gestures. For instance, if I'm in the master bedroom with the cam on and I swipe my finger across my throat, then Sonos mutes.
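To make that concrete, here's a minimal sketch of the classification side, assuming some hand-tracking stage (e.g. a pose/hand model on the RPi) already gives you normalized per-frame hand positions. The gesture test and the Maker API helper are my own hypothetical names; the URL shape follows Hubitat's Maker API convention, but the app ID, device ID, and token are placeholders:

```python
import urllib.request

def is_throat_swipe(track, min_dx=0.4, max_dy=0.08):
    """Classify a hand track (list of (x, y) in [0,1] frame coords) as a
    'finger across the throat' swipe: mostly horizontal travel of at
    least min_dx with vertical drift no more than max_dy."""
    if len(track) < 2:
        return False
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    return (max(xs) - min(xs)) >= min_dx and (max(ys) - min(ys)) <= max_dy

def mute_sonos(hub_ip, app_id, device_id, token):
    """Fire a Maker API device command when the gesture is recognized.
    IDs and token here are placeholders for your own hub's values."""
    url = (f"http://{hub_ip}/apps/api/{app_id}/devices/"
           f"{device_id}/mute?access_token={token}")
    urllib.request.urlopen(url)
```

So the loop on the RPi would just keep a short rolling window of hand positions and call `mute_sonos(...)` whenever `is_throat_swipe(window)` fires.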
I am hopeful that the Soli radar chip in the Pixel 4 brings about radar for the home. A couple of Soli radar devices in the upper corners of each room in a home/office could detect who is in the room, where they are within it, and their likely activity. The system could infer intent, reacting differently to walking into a room versus walking through it, even adjusting lighting before you enter. Walking through a room could bring the lights up just enough to navigate, while stopping and moving toward your easy chair would turn on the reading lamp and dim the lights used for navigation.
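The walk-through-vs-settle distinction could be as simple as a dwell check on the occupant's speed track. A toy sketch, assuming the radar layer already emits timestamped speed estimates (function name and thresholds are made up for illustration):

```python
def classify_occupancy(track, still_speed=0.15, dwell_s=3.0):
    """track: list of (t_seconds, speed_m_per_s) samples for one occupant.
    Return 'settling' if they stay below still_speed for at least dwell_s
    (turn on the reading lamp, dim nav lights); otherwise 'passing'
    (just enough light to navigate)."""
    still_since = None
    for t, speed in track:
        if speed < still_speed:
            if still_since is None:
                still_since = t                # started lingering
            elif t - still_since >= dwell_s:
                return "settling"              # lingered long enough
        else:
            still_since = None                 # moving again, reset
    return "passing"
```

A real system would smooth the speed estimates and fuse the two corner sensors, but the intent logic reduces to roughly this.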
Oh god, I hope that finger across throat gesture doesn't work its way into AI system vocabulary. It could definitely be misinterpreted when the cyborgs take over.