Researchers have developed a way to detect and monitor human body language.

The code for the new technology, called OpenPose, has been released on GitHub.

Its developers say it could enable a new way of communicating with machines, beyond text and speech.

It will also allow robots to better perceive what people are doing, their moods, and how to interact with them.

The system was developed using an inward-looking dome of 500 video cameras and motion sensor systems.

Particular attention was paid to hand movements, which were captured from every angle by 31 high-definition cameras.

OpenPose can estimate the individual body poses of multiple people in a group, and can resolve hand movements down to the position of each of a person's fingers.
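OpenPose can write its detections out as JSON, where each detected person carries a flat `pose_keypoints_2d` array of (x, y, confidence) triples, one per body keypoint. Here is a minimal sketch of reading that output into per-person keypoint lists; the sample data and its two keypoints are purely illustrative:

```python
import json

def parse_people(openpose_json: str):
    """Parse OpenPose-style JSON output into per-person keypoint lists.

    Each person's "pose_keypoints_2d" field is a flat array of
    (x, y, confidence) triples, one triple per body keypoint.
    """
    data = json.loads(openpose_json)
    people = []
    for person in data.get("people", []):
        flat = person.get("pose_keypoints_2d", [])
        # Group the flat array into (x, y, confidence) triples.
        keypoints = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        people.append(keypoints)
    return people

# Illustrative record: one person with two keypoints detected.
sample = ('{"people": [{"pose_keypoints_2d": '
          '[320.5, 120.0, 0.93, 318.2, 180.4, 0.88]}]}')
people = parse_people(sample)
print(len(people))    # number of people detected
print(people[0][0])   # first keypoint of the first person
```

A downstream application would then apply a confidence threshold to each triple before using the coordinates, since OpenPose reports low-confidence keypoints rather than omitting them.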

The researchers say it will allow a new level of gestural interaction with computers, but the potential use in AI-based security monitoring is sure to be concerning to some.

One particularly useful application could be in self-driving cars. OpenPose could enable a car to detect when a pedestrian is about to step into the street and potentially avoid hitting them.

OpenPose could also help in the behavioural diagnosis of autism, dyslexia, and depression, and assist in physical rehabilitation.