Journal article Open Access

Gesture formation: A crucial building block for cognitive-based Human–Robot Partnership

Pietro Morasso


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-itmirror</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by-nc-nd/4.0/</subfield>
    <subfield code="a">Creative Commons Attribution-NonCommercial-NoDerivatives</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="a">Pietro Morasso</subfield>
  </datafield>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Gesture formation: A crucial building block for cognitive-based Human–Robot Partnership</subfield>
  </datafield>
  <controlfield tag="005">20230924083045.0</controlfield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">General Medicine</subfield>
  </datafield>
  <controlfield tag="001">83595</controlfield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">article</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2021-01-01</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">Abstract The next generation of robotic agents, to be employed in both industrial and service robotic applications, will be characterized by a high degree of Human–Robot Partnership that implies, for example, sharing common objectives, a bidirectional flow of information, the capability to learn from each other, and availability for mutual training. Moreover, there is a widespread feeling in the research community that Humans will probably not accept Robots as trustable Partners if they cannot ascribe some form of awareness and true understanding to them. This means that, in addition to incremental improvements of Robotic-Bodyware, there will be the need for a substantial jump in Robotic-Cogniware, namely a new class of Cognitive Architectures for Robots (CARs) that match the requirements and specific constraints of Human–Robot Partnership. The working hypothesis underlying this paper is that such a class of CARs must be bio-inspired, not in the sense of a fine-grain imitation of neurobiology but within the larger framework of embodied cognition. In our opinion, trajectory/gesture formation should be one of the building blocks of bio-inspired CARs because biological motion is a fundamental channel of inter-human partnership, a true body language that allows mutual understanding of intentions. Moreover, one of the main concepts of embodied cognition, related to the importance of motor imagery, is that real (or overt) actions and mental (or covert) actions are generated by the same internal model and support the cognitive capabilities of skilled human subjects. The paper reviews the field of human trajectory formation, revealing in a novel manner the fil rouge that runs through motor neuroscience, and proposes a computational framework for a robotic formulation that addresses the Degrees of Freedom Problem and is expressed in terms of the force-field-based Passive Motion Paradigm.</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">3875760</subfield>
    <subfield code="u">https://www.openaccessrepository.it/record/83595/files/fulltext.pdf</subfield>
    <subfield code="z">md5:3bb5c688eead501ad70c2446179cd53a</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.1016/j.cogr.2021.06.004</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
</record>