Automated 3D computer vision model offers a new tool to measure and understand dairy cow behavior and welfare


A Journal of Dairy Science study has taken a step toward validating a 3D pose estimation method for monitoring the ease with which cows can get up and down in their cubicles, offering a new assessment tool to improve overall comfort and well-being of dairy cows. Credit: Adrien Kroese

Dairy cows typically rest for 10 or more hours a day, so a dry, clean, and comfortable place—such as a freestall—to lie down and rest is essential for their health, well-being, and production performance. One key factor in whether stalls are comfortable for cows is the ease with which they can get up and down, so it is common on farms for staff to watch for abnormal rising behaviors as part of standard welfare management.

In a new study in the Journal of Dairy Science, a Swedish team, in collaboration with Sony Nordic, introduced a new automated model that accurately detects posture transitions in dairy cows. This innovative approach using 3-dimensional (3D) pose estimation offers valuable, unbiased insights into animal welfare and could offer a less time-consuming and more consistent assessment tool for researchers and farmers alike.

Led by Niclas Högberg, DVM, and Adrien Kroese, Eng, Department of Clinical Sciences, Faculty of Veterinary Medicine and Animal Science, Swedish University of Agricultural Sciences, Uppsala, Sweden, the study aimed to develop a reliable method for monitoring the ease with which cows can get up and down in their cubicles, a crucial indicator of overall comfort and well-being.

Adrien Kroese explained, "Evidence points to a clear link between restricted movement for cows and signs of reduced welfare, so it is common to have some kind of observation practice in place to catch signs of movement struggles."

Traditional methods—which often rely on human observation—can be subjective, sporadic, and time-consuming.

To address the need for more consistent methods, the study team proposed a novel framework for detecting cow movements, specifically testing whether lying-to-standing transitions measured from 3D pose estimation data match those identified by the human eye.

The 2D pose estimation and 3D fusion of 2 cows. The 2D results are displayed at the top, showing the synchronized frames from cameras 0 to 6, onto which predicted bounding boxes and key points are overlaid. The rest of the scene shows the projection of 2 cows from key points in 3D. Cameras 4 and 6 are represented as magenta and gray cuboids, respectively, in the 3D representation, in their spatial position relative to each other and to the cows. A projection of the frames from cameras 4 and 6 (identical to those in the 2D images above) is shown in front of the camera's 3D representation. The 5 other camera representations are not displayed from this angle, and camera 4 occludes the view from camera 0 because of the choice of angle. Only 4 of the key points shown in this figure were used in the study. Credit: Journal of Dairy Science (2024). DOI: 10.3168/jds.2023-24427

The team set up seven cameras to record a herd of Swedish Holstein and Swedish Red cows around the clock. This footage was then processed with 3D pose estimation software, which tracks and records movements via a 2D object detector and pose estimator.

These data points are then fed into convolutional neural networks that detect cow movements relative to specific anatomical landmarks identified in static images from the footage. The result is a 3D map of the cows' movement in their stalls and a determination of which movements indicate the transition to standing.
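The core geometric step in this kind of pipeline is fusing 2D keypoints from several synchronized cameras into a single 3D point. The study's own software is not public in this article, but the standard technique is direct linear transform (DLT) triangulation; the sketch below is a minimal, illustrative version of that idea (the camera matrices and point are toy values, not data from the study).

```python
import numpy as np

def triangulate_point(projections, points_2d):
    """Triangulate one 3D keypoint from its 2D detections in several
    synchronized cameras, via the direct linear transform (DLT).

    projections : list of 3x4 camera projection matrices
    points_2d   : list of (u, v) pixel coordinates, one per camera
    """
    rows = []
    for P, (u, v) in zip(projections, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]@X) = P[0]@X, etc.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # The smallest right singular vector solves A @ X ~ 0.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Toy setup: two cameras observing a point at (1, 2, 5).
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P1 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])   # shifted on x

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([1.0, 2.0, 5.0])
pts = [project(P0, X_true), project(P1, X_true)]
X_hat = triangulate_point([P0, P1], pts)
```

With more than two cameras, the same least-squares system simply gains two rows per extra view, which makes the estimate robust to one camera's keypoint being noisy or occluded.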

Kroese explained, "We then compared the standing data gathered by the software against timestamps in the video annotated by three human observers, which is considered the gold standard for behavioral observations."

How did the 3D data model hold up in comparison to the human eye? Kroese said, "The framework was able to detect when a cow was transitioning from lying to standing with the same accuracy as humans. The sensitivity of the detection was over 88%."
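Sensitivity here means the fraction of human-annotated transitions that the model also detected. A hedged sketch of how such event-level scoring is typically done, matching each detection to an annotated timestamp within a tolerance window (the function name and the 30-second tolerance are illustrative assumptions, not values reported in the study):

```python
def sensitivity(annotated, detected, tolerance_s=30.0):
    """Fraction of human-annotated transition times (seconds) matched by
    a detection within +/- tolerance_s; each detection is used at most once."""
    unused = sorted(detected)
    hits = 0
    for t in sorted(annotated):
        # Find the first still-unused detection close enough to this annotation.
        match = next((d for d in unused if abs(d - t) <= tolerance_s), None)
        if match is not None:
            unused.remove(match)
            hits += 1
    return hits / len(annotated) if annotated else 0.0

# Toy example: five annotated transitions, one missed by the model.
annotated = [120.0, 845.0, 1900.0, 2600.0, 3300.0]
detected = [118.5, 850.0, 1905.0, 3290.0]
print(sensitivity(annotated, detected))  # 4 of 5 matched -> 0.8
```

The one-to-one matching matters: without it, a single detection near two annotated events would be double-counted and inflate the score.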

Notably, the results also indicate that the model introduced no more bias than human observers did.

Although not without limitations, the study's findings demonstrate the potential of 3D pose estimation to provide objective and reliable data on cow behavior. Kroese noted, "This technology represents an exciting advancement in our ability to study and monitor animal behavior and welfare. By automatically and accurately detecting posture transitions, we can gain valuable insights into the comfort and well-being of dairy cows."

The model offers the potential to help researchers scale up the study of dairy cow behavior and motion patterns and opens the door to the development of new assessment tools for farmers to make informed decisions about their herds.

More information: Adrien Kroese et al, 3-Dimensional pose estimation to detect posture transition in freestall-housed dairy cows, Journal of Dairy Science (2024). DOI: 10.3168/jds.2023-24427

Journal information: Journal of Dairy Science

Provided by Elsevier