

Summary

This protocol provides a method for automated tracking of eye squint in rodents over time in a manner compatible with time-locking to neurophysiological measures. It is expected to be useful to researchers studying mechanisms of pain disorders such as migraine.

Abstract

Spontaneous pain has been challenging to track in real time and to quantify in a way that prevents human bias. This is especially true for metrics of head pain, as in disorders such as migraine. Eye squint has emerged as a continuous-variable metric that can be measured over time and is effective for predicting pain states in such assays. This paper provides a protocol for the use of DeepLabCut (DLC) to automate and quantify eye squint (the Euclidean distance between eyelids) in restrained mice with freely rotating head motions. This protocol enables unbiased quantification of eye squint to be paired with and compared directly against mechanistic measures such as neurophysiology. We provide an assessment of the AI training parameters necessary for achieving success, defined as discriminating between squint and non-squint periods. We demonstrate an ability to reliably track and differentiate squint in a CGRP-induced migraine-like phenotype at sub-second resolution.

Introduction

Migraine is one of the most prevalent brain disorders worldwide, affecting more than one billion people1. Preclinical mouse models of migraine have emerged as an informative way to study the mechanisms of migraine, as these studies can be more easily controlled than human studies, thus enabling causal study of migraine-related behavior2. Such models have demonstrated a strong and repeatable phenotypic response to migraine-inducing compounds, such as calcitonin gene-related peptide (CGRP). The need for robust measurements of migraine-relevant behaviors in rodent models persists, especially those that may be coupled with me....

Protocol

NOTE: All animals utilized in these experiments were handled according to protocols approved by the Institutional Animal Care and Use Committee (IACUC) of the University of Iowa.

1. Prepare equipment for data collection

  1. Ensure the availability of all necessary equipment: the recommended hardware for running DLC should have at least 8 GB of memory. See the Table of Materials for information related to hardware and software.
    NOTE: Data can.......
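Before collecting data, it can help to confirm that the machine matches the software requirements listed in the Table of Materials (Python 3.9-3.11, TensorFlow 2.10). The following is a minimal pre-flight sketch, not part of the published protocol; the function name and version bounds are illustrative assumptions, and `find_spec` is used so that heavy packages are detected without being imported.

```python
import importlib.util
import sys

def check_dlc_environment(min_py=(3, 9), max_py=(3, 11)):
    """Report basic environment facts relevant to running DLC.

    Version bounds default to those in the Table of Materials;
    adjust them for your own DLC installation.
    """
    return {
        "python_version": sys.version_info[:2],
        "python_supported": min_py <= sys.version_info[:2] <= max_py,
        # find_spec detects an installed package without importing it
        "tensorflow_installed": importlib.util.find_spec("tensorflow") is not None,
        "deeplabcut_installed": importlib.util.find_spec("deeplabcut") is not None,
    }

if __name__ == "__main__":
    for key, value in check_dlc_environment().items():
        print(f"{key}: {value}")
```

GPU and driver checks (CUDA 11.8, cuDNN 8.6.0, driver ≥ 450.80.02) are best verified with `nvidia-smi` on the machine itself, since they cannot be queried portably from Python's standard library.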

Representative Results

Here, we provide a method for the reliable detection of squint at high temporal resolution using DeepLabCut. We optimized training parameters, and we provide an evaluation of this method's strengths and weaknesses (Figure 1).

After training our models, we verified that they were able to correctly estimate the top and bottom points of the eyelid (Figure 2), which serve as the coordinate points for the Euclidean distance measure. Eu.......
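The squint metric described above reduces to the Euclidean distance between the estimated top and bottom eyelid points in each frame. A minimal sketch of that computation is shown below; the field names and the likelihood cutoff are illustrative assumptions (real DLC output CSVs use a multi-row scorer/bodyparts/coords header, and the cutoff should be tuned to your model).

```python
import math

def eyelid_distance(top, bottom):
    """Euclidean distance between two (x, y) eyelid coordinates."""
    return math.hypot(top[0] - bottom[0], top[1] - bottom[1])

def squint_trace(frames, likelihood_cutoff=0.9):
    """Per-frame eyelid distance from DLC point estimates.

    Frames where either point falls below the DLC likelihood cutoff
    are returned as None so they can be interpolated or excluded later.
    """
    trace = []
    for f in frames:
        if min(f["top_lik"], f["bottom_lik"]) < likelihood_cutoff:
            trace.append(None)
            continue
        trace.append(eyelid_distance((f["top_x"], f["top_y"]),
                                     (f["bottom_x"], f["bottom_y"])))
    return trace
```

Because the distance is computed independently per frame, the temporal resolution of the resulting squint trace is limited only by the camera frame rate, consistent with the sub-second resolution reported here.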

Discussion

This protocol provides an easily accessible, in-depth method for using machine-learning-based tools that can differentiate squint at near-human accuracy while maintaining the same (or better) temporal resolution as prior approaches. Primarily, it makes evaluation of automated squint more readily available to a wider audience. Our new method for evaluating automated squint has several improvements compared to previous models. First, it provides a more robust metric than ASM by utilizing fewer points that actually contribut.......
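One simple way to turn a continuous eyelid-distance trace into squint/no-squint labels is to compare each frame against a per-animal baseline. The sketch below illustrates that idea; the baseline-median approach and the 0.75 fraction are assumed examples, not values taken from this protocol.

```python
import statistics

def label_squint(trace, baseline_distances, fraction=0.75):
    """Flag frames whose eyelid distance drops below a fraction of baseline.

    trace: per-frame eyelid distances (None = low-confidence frame).
    baseline_distances: distances from a reference (e.g., pre-injection) period.
    Low-confidence frames are conservatively labeled False.
    """
    baseline = statistics.median(baseline_distances)
    return [d is not None and d < fraction * baseline for d in trace]
```

Per-animal baselining matters because absolute eyelid distance varies with eye size, camera placement, and zoom, whereas the relative drop during squint is more comparable across recordings.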

Acknowledgements

Thanks to Rajyashree Sen for insightful conversations. Thanks to the McKnight Foundation Neurobiology of Disease Award (RH), NIH 1DP2MH126377-01 (RH), the Roy J. Carver Charitable Trust (RH), NINDS T32NS007124 (MJ), Ramon D. Buckley Graduate Student Award (MJ), and VA-ORD (RR&D) MERIT 1 I01 RX003523-0 (LS).

....

Materials

Name | Company | Catalog Number | Comments
CUDA toolkit 11.8 | | |
cuDNN SDK 8.6.0 | | |
Intel computers with Windows 11, 13th gen | | |
LabFaceX 2D Eyelid Tracker Add-on Module for a Free Roaming Mouse | FaceX LLC | NA | Any camera that can record an animal's eye is sufficient, but this is our eye tracking hardware.
NVIDIA GPU driver, version 450.80.02 or higher | | |
NVIDIA RTX A5500, 24 GB DDR6 | NVIDIA | [490-BHXV] | Any GPU that meets the minimum requirements specified for your version of DLC (currently 8 GB) is sufficient. We used an NVIDIA GeForce RTX 3080 Ti GPU.
Python 3.9-3.11 | | |
TensorFlow version 2.10 | | |

References

Keywords: Medicine, automated behavior, facial grimace, squint, pain, migraine, mouse behavior

Copyright © 2024 MyJoVE Corporation. All rights reserved