Robot house human activity recognition dataset

Mohammad Bamorovat Abadi, Mohamad Reza Shahabian Alashti, Patrick Holthaus, Catherine Menon, Farshid Amirabdollahian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Human activity recognition is one of the most challenging tasks in computer vision. State-of-the-art approaches such as deep learning techniques often rely on large labelled datasets of human activities. However, currently available datasets are suboptimal for learning human activities in companion robotics scenarios at home, for example, missing crucial perspectives. With this as a consideration, we present the University of Hertfordshire Robot House Human Activity Recognition Dataset (RH-HAR-1). It contains RGB videos of a human engaging in daily activities, taken from four different cameras. Importantly, this dataset contains two non-standard perspectives: a ceiling-mounted fisheye camera and a mobile robot's view. In the first instance, RH-HAR-1 covers five daily activities with a total of more than 10,000 videos.
Original language: English
Title of host publication: 4th UKRAS21 Conference: Robotics at home Proceedings
Publisher: EPSRC UK-RAS Network
Pages: 19-20
Number of pages: 2
DOIs
Publication status: Published - 15 Jul 2021
Event: The 4th UK-RAS Conference for PhD Students & Early-Career Researchers on 'Robotics at Home' - Online, Hatfield, United Kingdom
Duration: 2 Jun 2021 - 2 Jun 2021
Conference number: 4
https://www.ukras.org.uk/news-and-events/uk-ras/

Publication series

Name: UKRAS21 Conference: Robotics at home Proceedings
Publisher: EPSRC UK-RAS Network
ISSN (Print): 2516-502X

Conference

Conference: The 4th UK-RAS Conference for PhD Students & Early-Career Researchers on 'Robotics at Home'
Abbreviated title: #UKRAS21
Country/Territory: United Kingdom
City: Hatfield
Period: 2/06/21 - 2/06/21
