Abstract
Human activity recognition is one of the most challenging tasks in computer vision. State-of-the-art approaches such as deep learning techniques often rely on large labelled datasets of human activities. However, currently available datasets are suboptimal for learning human activities in companion robotics scenarios at home, for example, missing crucial perspectives. With this as a consideration, we present the University of Hertfordshire Robot House Human Activity Recognition Dataset (RH-HAR-1). It contains RGB videos of a human engaging in daily activities, taken from four different cameras. Importantly, this dataset contains two non-standard perspectives: a ceiling-mounted fisheye camera and a mobile robot's view. In the first instance, RH-HAR-1 covers five daily activities with a total of more than 10,000 videos.
Original language | English |
---|---|
Title of host publication | 4th UKRAS21 Conference: Robotics at home Proceedings |
Publisher | EPSRC UK-RAS Network |
Pages | 19-20 |
Number of pages | 2 |
DOIs | |
Publication status | Published - 15 Jul 2021 |
Event | The 4th UK-RAS Conference for PhD Students & Early-Career Researchers on 'Robotics at Home' - Online, Hatfield, United Kingdom. Duration: 2 Jun 2021 → 2 Jun 2021. Conference number: 4. https://www.ukras.org.uk/news-and-events/uk-ras/ |
Publication series
Name | UKRAS21 Conference: Robotics at home Proceedings |
---|---|
Publisher | EPSRC UK-RAS Network |
ISSN (Print) | 2516-502X |
Conference
Conference | The 4th UK-RAS Conference for PhD Students & Early-Career Researchers on 'Robotics at Home' |
---|---|
Abbreviated title | #UKRAS21 |
Country/Territory | United Kingdom |
City | Hatfield |
Period | 2/06/21 → 2/06/21 |
Internet address | https://www.ukras.org.uk/news-and-events/uk-ras/ |