Motion capture technology makes digitising dogs much easier

Photo: Christos Georghiou – 123rf

Researchers from the UK have developed motion capture technology that makes it possible to digitise a dog without a motion capture suit, using only a single camera.

The software could be used for a wide range of purposes, from helping vets diagnose lameness and monitoring recovery of their canine patients, to entertainment applications such as making it easier to put digital representations of dogs into movies and video games.

Motion capture technology is widely used in the entertainment industry, where actors wear a suit dotted with white markers which are then precisely tracked in 3D space by multiple cameras taking images from different angles. Movement data can then be transferred onto a digital character for use in films or computer games.

Similar technology is also used by biomechanics experts to track the movement of elite athletes during training, or to monitor patients' rehabilitation from injuries. However, these technologies, particularly when applied to animals, require expensive equipment and dozens of markers to be attached.

Computer scientists from CAMERA, the University of Bath's motion capture research centre, filmed 14 different breeds of dog performing a range of movements while the animals wore special canine motion capture suits fitted with markers.

They then used these data to create a computer model that can accurately predict and replicate the poses of dogs filmed without the motion capture suits. The model allows the 3D digital information for new dogs, their shape and movement, to be captured without markers or expensive equipment, using instead a single RGBD camera.

Whereas normal digital cameras record the red, green and blue (RGB) colour in each pixel in the image, RGBD cameras also record the distance from the camera for each pixel.
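The idea can be sketched in a few lines of code. This is an illustrative example only, not the researchers' actual pipeline: an RGBD frame pairs a colour image with a per-pixel depth map, and with the camera's intrinsics (the focal lengths and principal point, assumed values here) each pixel's depth can be back-projected to a 3D point in camera coordinates.

```python
import numpy as np

# An RGBD frame: per-pixel colour plus per-pixel distance from the camera.
# Shapes and values are hypothetical, for illustration.
H, W = 480, 640
rgb = np.zeros((H, W, 3), dtype=np.uint8)       # red, green, blue per pixel
depth = np.full((H, W), 2.0, dtype=np.float32)  # distance in metres per pixel

# Assumed camera intrinsics (pinhole model).
fx = fy = 525.0          # focal lengths in pixels
cx, cy = W / 2, H / 2    # principal point

def backproject(u, v, z):
    """Map pixel (u, v) with depth z to a 3D point (x, y, z) in metres."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# The pixel at the principal point maps straight ahead of the camera.
point = backproject(320, 240, depth[240, 320])
print(point)  # [0. 0. 2.]
```

This extra depth channel is what lets a single camera recover 3D shape information that would otherwise require multiple viewpoints.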

The team presented their research at CVPR (Computer Vision and Pattern Recognition), one of the world's leading AI conferences, on 17 and 18 June.
