Publication:
Multi-modal data fusion for people perception in the social robot Haru

dc.contributor.author: Ragel de la Torre, Ricardo
dc.contributor.author: Rey Arcenegui, Rafael
dc.contributor.author: Páez, Álvaro
dc.contributor.author: Ponce Chulani, Javier
dc.contributor.author: Nakamura, Keisuke
dc.contributor.author: Caballero, Fernando
dc.contributor.author: Merino, Luis
dc.contributor.author: Gómez, Randy
dc.date.accessioned: 2025-04-02T10:16:20Z
dc.date.available: 2025-04-02T10:16:20Z
dc.date.issued: 2023-02-01
dc.description: This work is partially supported by Programa Operativo FEDER Andalucía 2014-2020, Consejería de Economía y Conocimiento (DeepBot, PY20_00817) and the project PLEC2021-007868, funded by MCIN/AEI/10.13039/501100011033 and the European Union NextGenerationEU/PRTR.
dc.description: Research projects: Proyecto DeepBot (PY20_00817), Proyecto NHoA (PLEC2021-007868)
dc.description.abstract: This article presents a people perception software architecture and its implementation, focused on the information of interest from the point of view of a social robot. The key modules used to extract the different people features, such as the location of body parts, face and hand information, and speech, from a set of possible devices and configurations are described. The association and combination of these features through a temporal and geometric fusion system are explained in detail. A high-level interface for human-robot interaction based on the fused people information is proposed. The paper presents experimental results evaluating the relevant aspects of the system.
dc.description.sponsorship: Departamento de Deporte e Informática
dc.format.mimetype: application/pdf
dc.identifier.citation: R. Ragel, R. Rey, Á. Páez, J. Ponce, K. Nakamura, F. Caballero, L. Merino and R. Gómez. "Multi-modal Data Fusion for People Perception in the Social Robot Haru", International Conference on Social Robotics, pp. 174-187, 2022
dc.identifier.doi: 10.1007/978-3-031-24667-8_16
dc.identifier.uri: https://hdl.handle.net/10433/23695
dc.language.iso: en
dc.publisher: Springer Nature
dc.rights: Attribution 4.0 International
dc.rights.accessRights: open access
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Social Robotics
dc.subject: Data Fusion
dc.subject: Human-Robot Interaction
dc.title: Multi-modal data fusion for people perception in the social robot Haru
dc.type: conference output
dc.type.hasVersion: AM
dspace.entity.type: Publication
relation.isAuthorOfPublication: dbd0d93f-ac57-4700-9ad3-4f368ac76c2b
relation.isAuthorOfPublication: ceecaf4e-0987-4d7e-9902-8e915fc6b58c
relation.isAuthorOfPublication: 7c029f9e-5858-4f87-a8c3-a57d36c9dab4
relation.isAuthorOfPublication: 144853bd-af99-4072-840b-71bdd0b94309
relation.isAuthorOfPublication: 021f43bc-c25f-40dd-9ac1-0fc2933e7071
relation.isAuthorOfPublication.latestForDiscovery: dbd0d93f-ac57-4700-9ad3-4f368ac76c2b

Files

Original bundle

Name: Haru_Perception_System (2).pdf
Size: 2.39 MB
Format: Adobe Portable Document Format