The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here we demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods, continuously acquiring two parallel video data streams: (1) near-infrared frustrated total internal reflection (FTIR) imaging, which detects the degree, force, and timing of surface contact, and (2) simultaneous videography of whole-body pose. Using machine vision and machine learning, we automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents across multiple pain models. We show that these voluntary pain-related behaviors are reversed by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, we use this approach to derive a paw luminance ratio, a measure that sensitively captures dynamic mechanical hypersensitivity over time and is scalable for high-throughput preclinical assessment of analgesic efficacy.
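The paw luminance ratio can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes (hypothetically) that each FTIR video frame is a 2-D intensity array, that per-paw footprint masks are available from the pose stream, and that the measure compares the affected (ipsilateral) paw's contact luminance against the contralateral paw. The function name `paw_luminance_ratio` and the mask inputs are illustrative assumptions:

```python
import numpy as np

def paw_luminance_ratio(frame, ipsi_mask, contra_mask):
    """Mean FTIR contact luminance of the affected (ipsilateral) paw
    divided by that of the contralateral paw, for a single frame.

    frame       : 2-D array of FTIR pixel intensities
    ipsi_mask   : boolean array, True over the affected paw's footprint
    contra_mask : boolean array, True over the contralateral paw's footprint
    """
    ipsi = frame[ipsi_mask].mean()
    contra = frame[contra_mask].mean()
    return ipsi / contra

# Toy example: a hypersensitive paw bears less weight, so its FTIR
# contact signal is dimmer and the ratio drops below 1.
frame = np.zeros((4, 4))
ipsi_mask = np.zeros((4, 4), dtype=bool)
contra_mask = np.zeros((4, 4), dtype=bool)
ipsi_mask[0, :2] = True
contra_mask[3, :2] = True
frame[ipsi_mask] = 50.0     # reduced surface contact (guarded paw)
frame[contra_mask] = 200.0  # normal weight bearing
print(paw_luminance_ratio(frame, ipsi_mask, contra_mask))  # 0.25
```

Tracking this ratio frame by frame would yield the kind of time-resolved, observer-independent readout of mechanical hypersensitivity described above.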