Robots Learn to Follow Your Pointing Finger

Brown University system blends speech and gestures to help machines find objects faster

Robots are getting better at reading the room, literally. A team at Brown University, led by graduate student Ivy He, built a planning system that lets robots interpret both spoken instructions and human pointing gestures when searching for objects.

The method combines a vision-language model with a partially observable Markov decision process (POMDP) planning framework. It models a pointing gesture as a probability cone whose axis follows the direction defined by the person's eye, elbow, and wrist, spreading the robot's belief over where the target object might be. The work was presented at the International Conference on Human-Robot Interaction and brings robots a step closer to working naturally alongside people.
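The cone idea can be sketched in a few lines. The snippet below is a minimal illustration, not the team's actual code: it assumes the cone axis is estimated from elbow and wrist keypoints and that belief falls off as a Gaussian over angular deviation from that axis. The function name, the elbow-to-wrist axis choice, and the spread parameter are all assumptions for illustration.

```python
import math

def pointing_cone_probs(elbow, wrist, candidates, sigma_deg=15.0):
    """Assign each candidate object position a probability based on how
    close it lies to a pointing cone whose axis runs from the elbow
    toward the wrist. Illustrative sketch only; the Gaussian angular
    falloff and sigma_deg value are assumptions, not the published model."""
    # Cone axis: unit vector from elbow toward wrist.
    axis = [w - e for w, e in zip(wrist, elbow)]
    norm = math.sqrt(sum(a * a for a in axis))
    axis = [a / norm for a in axis]

    scores = []
    for cand in candidates:
        # Vector from the wrist out to the candidate object.
        v = [c - w for c, w in zip(cand, wrist)]
        vnorm = math.sqrt(sum(x * x for x in v))
        # Angle between the cone axis and the candidate direction.
        cos = sum(a * x for a, x in zip(axis, v)) / vnorm
        cos = max(-1.0, min(1.0, cos))  # guard against rounding drift
        angle_deg = math.degrees(math.acos(cos))
        # Gaussian falloff: objects near the axis score near 1.
        scores.append(math.exp(-0.5 * (angle_deg / sigma_deg) ** 2))

    total = sum(scores)
    return [s / total for s in scores]

# Example: wrist extended along +x from the elbow. Object A sits on the
# cone axis; object B is off to the side, so A receives more probability.
probs = pointing_cone_probs(
    elbow=(0.0, 0.0, 0.0),
    wrist=(0.3, 0.0, 0.0),
    candidates=[(1.0, 0.0, 0.0), (1.0, 0.6, 0.0)],
)
```

In a POMDP planner, a distribution like this would be fused with the belief derived from the spoken instruction, letting the robot prioritize search regions that both cues agree on.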
