Intelligent Robot Materials 3

Research on the Design and Application of Worm-Like Intelligent Robots

As one of the major achievements of modern science and technology, intelligent robots are already widely used in many fields, especially industrial automation and everyday services.

However, most traditional robots rely on external control programs and sensors to achieve "intelligent" operation. This model is inherently limited, because such robots cannot learn autonomously or build up an understanding of their environment.

Researchers have therefore begun to explore new designs, among which the worm-like intelligent robot is a direction that deserves attention.

I. The Concept of the Worm-Like Intelligent Robot

The worm-like intelligent robot is a biomimetic design whose basic structure imitates the body form of crawling worms such as the earthworm.

The robot's body is worm-shaped and moves by peristalsis, which lets it travel through all kinds of complex environments far more flexibly than conventional robots.

At the same time, the worm-like intelligent robot integrates artificial intelligence and machine learning, so that through autonomous learning and adaptation it can continually improve its locomotion and control.

Worm-like intelligent robots therefore have broad application prospects in areas such as disaster rescue, medical care, and mine exploration.

II. Design Principles of the Worm-Like Intelligent Robot

The design of the worm-like intelligent robot draws on biomimetics and the principles of natural evolution. Specifically:

1. Soft-robot principle: the robot's body follows the soft-robotics approach, a newer design method that typically uses lightweight, soft materials. The body can deform and move freely, cope with varied shapes and environments, and is therefore more resilient and adaptable.

2. Biomimetic-intelligence principle: the robot uses biologically inspired intelligence so that it can learn and adapt on its own.

Concretely, through sensing and feedback mechanisms the robot continually gathers information and experience from its environment, stores them in an autonomous learning system, and gradually builds its own knowledge and behavior models, thereby achieving autonomous control.

3. Multi-axis motion principle: the robot's body usually consists of multiple modules connected by joints. Multi-axis motion lets the robot move more flexibly, accomplish its work with minimal energy, and recover automatically from faults and obstacles to continue its task, as the sketch below illustrates.
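To make the modular, multi-axis idea concrete, here is a minimal C++ sketch of a peristaltic, traveling-wave gait for a chain of joint modules. It is an illustration only: the module count, amplitude, frequency, and phase lag are assumed values, not taken from any particular robot.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    const double kPi = 3.141592653589793;

    // Traveling-wave gait: every joint follows the same sinusoid, phase-shifted
    // along the body so that a contraction wave travels from head to tail.
    std::vector<double> jointAngles(int modules, double t, double amplitudeDeg,
                                    double freqHz, double phaseLagRad) {
        std::vector<double> angles(modules);
        for (int i = 0; i < modules; ++i)
            angles[i] = amplitudeDeg *
                        std::sin(2.0 * kPi * freqHz * t - i * phaseLagRad);
        return angles;
    }

    int main() {
        // 8 modules, 30 deg amplitude, 0.5 Hz wave, 60 deg lag per module (assumed).
        for (double t = 0.0; t <= 2.0; t += 0.5) {
            std::printf("t=%.1fs:", t);
            for (double a : jointAngles(8, t, 30.0, 0.5, kPi / 3.0))
                std::printf(" %6.1f", a);
            std::printf("\n");
        }
        return 0;
    }

Increasing the phase lag shortens the body wave, producing more undulations along the body; reducing it makes the modules bend more in unison.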

III. Application Prospects of the Worm-Like Intelligent Robot

Worm-like intelligent robots have broad prospects across many fields, including:

1. Disaster rescue: they can work at disaster sites such as fires, earthquakes, and floods, moving through complex environments to carry out rescue tasks and markedly improving both the efficiency and the safety of rescue operations.

Technical Analysis and Application Cases of Intelligent Robots

With continuous technological innovation and development, artificial intelligence has become an ever more prominent topic in our lives.

As an important direction for putting artificial intelligence into practice, the research and development of intelligent robots will play a significant role in future social progress.

This article analyzes intelligent robots in detail from two angles: their technical characteristics and their application cases.

I. Technical Characteristics of Intelligent Robots

An intelligent robot is an artificial-intelligence system that integrates perception, decision-making, and execution.

At its most basic it is a synthesis of mechanical, electronic, computing, communication, and sensing technology.

As artificial-intelligence technology has matured, intelligent robots have in turn developed a whole series of new technical capabilities.

1. Perception technology.

To perform operations autonomously, an intelligent robot must be able to sense the external environment and its own state; this is its most basic requirement.

Perception technology mainly covers image processing, speech recognition, spatial localization, and natural interaction.

Its main difficulty lies in recognizing and analyzing the sensed information, and in implementing the corresponding functions in software and hardware.

2. Computational-intelligence technology.

Computational intelligence is the core computing and control technology of an intelligent robot.

It mainly includes fuzzy logic, neural networks, genetic algorithms, and related methods, which enable the robot to make decisions and carry out tasks autonomously.

Fuzzy logic lets the robot reason with imprecise information; neural networks let it learn and remember, steadily improving its decision-making and execution; and genetic algorithms can be used for optimization and design, tuning the robot's performance toward an optimum. A small fuzzy-control sketch follows below.
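To illustrate the first of these techniques, here is a small C++ sketch of fuzzy speed control for a mobile robot. All values (membership breakpoints, rule outputs, and the 0.8 m sensor reading) are invented for the example and are not from the article.

    #include <cstdio>

    // Triangular membership function: 0 outside [a, c], 1 at the peak b.
    double tri(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0.0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    int main() {
        double d = 0.8;  // obstacle distance in meters (hypothetical reading)
        double nearDeg = tri(d, 0.0, 0.3, 1.2);  // degree of "near"
        double farDeg  = tri(d, 0.8, 2.0, 3.0);  // degree of "far"
        // Two rules: IF near THEN slow (0.1 m/s); IF far THEN fast (1.0 m/s).
        // Defuzzify with a weighted average of the rule outputs.
        double w = nearDeg + farDeg;
        double speed = w > 0.0 ? (nearDeg * 0.1 + farDeg * 1.0) / w : 0.0;
        std::printf("near=%.2f far=%.2f -> speed=%.2f m/s\n",
                    nearDeg, farDeg, speed);
        return 0;
    }

The commanded speed varies smoothly with distance instead of switching abruptly at a hard threshold, which is exactly the kind of "fuzzy judgment" described above.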

3. Execution technology.

Execution technology is the system that lets an intelligent robot carry out tasks under managed control.

It mainly involves motion control, manipulation capability, power, and energy, and additionally draws on mechanical design, materials science, sensor technology, continuous drive, and robot power systems.

II. Application Cases of Intelligent Robots

Intelligent robots are applied very widely; the main scenarios include manufacturing, healthcare, household services, smart logistics, smart agriculture, and smart education.

Below we introduce a few application cases.

1. An integrated (front-and-rear unit) surgical robot performing laparoscopic surgery.

The integrated surgical robot is described as the first robotic surgery system in China to enter clinical use.

It can not only perform laparoscopic surgery but also carry out operative manipulation at nanometer scale.

2023 Jiangsu Suqian Mock Examination for Junior High Graduation and Senior High Entrance (Third Mock): Chinese Language Paper

Suqian 2023 Mock Examination for Junior High Graduation and Senior High Entrance: Chinese Language

Answering instructions:
1. This is the Chinese language paper, 8 pages in total. Full score: 150 points; time allowed: 150 minutes.
2. Write all answers on the answer sheet; answers written on this paper are invalid.
3. Answer with a 0.5 mm black-ink pen, writing each answer in the answer-sheet area that matches its question number. Take care not to answer in the wrong place or beyond the boundaries.
4. Drawings must be made with a 2B pencil, pressed dark and thick so they are clear.

Young students of the new era both read ten thousand books and travel ten thousand miles.

Join classmates Xiao Su and Xiao Qian on a study tour!

I. Language Accumulation and Use (26 points)

1. First stop: Zaohe Longyun City. Below is a passage Xiao Su wrote about Longyun City; please take part.

Zaohe Longyun City lies in Zaohe Town, Hubin New District, Suqian, at the spot where four waters converge (交huì): the Beijing-Hangzhou Grand Canal, Luoma Lake, the old Yellow River, and the old Zao River.

Of his six tours to the south, the Qianlong Emperor stayed here five times and (甲) it as "第一江山春好处" ("the finest springtime scenery of rivers and mountains").

Here, through the mists of history, (乙) cultural transmission (传chéng), landscape enhancement, and industrial development, it is bound to become a Suqian model of industrial integration and a pioneering zone for canal-culture life.

Combining four seasonal theme festivals, immersive performances, and fantastical amusement projects, Zaohe Longyun City offers visitors and residents a new destination for "eating, lodging, learning, touring, shopping, and entertainment," making Zaohe a cultural-tourism complex that tells the Grand Canal's story in full and (丙) Grand Canal culture around the clock, and building a full-panorama canal-culture resort destination.

(1) Write the characters for the pinyin, or give the pinyin for the dotted character.

(3 points) 交huì ▲  传chéng ▲  沉浸 ▲

(2) Choose the best set of words for blanks 甲, 乙, 丙 ( ▲ ) (3 points)
A. 赞美 聚集 展现  B. 赞叹 聚焦 展现  C. 赞叹 聚集 展示  D. 赞美 聚焦 展示

(3) In Suqian's Top Ten Spring Couplets contest, the best first line was "第一江山春好处". Using the words provided, compose the best matching second line.

(3 points) First line: 第一江山春好处  Second line: ▲
(Words to choose from: 兴国运 岁月 蓝天 无双 峥嵘 梦圆时)

2. While reading poetry and prose, Xiao Qian noticed that in depicting natural and cultural scenes, famous writers each have their own techniques; please complete the table.

(8 points)

3. Xiao Su and Xiao Qian debated whether to tour online or in person; please join the discussion.

(9 points)

Material 1: Survey data on middle-school students' "cloud tourism"

Material 2: Suqian boutique one-day tour: Xiang Yu's Hometown - Niujiao Village Happy Countryside Resort - Qianlong Palace - Longyun City - Santai Mountain - Suqian Yanghe Shennong Era - Sihong Hongze Lake Wetlands

Material 3: Caring for everyday life. An around-the-city park lets visitors stroll in pleasant greenery at any hour; popular attractions close later and buses run longer, stretching out Suqian's night; scenic areas have added hundreds of seats so visitors can wait in comfort... Many visitors remark that Suqian lives up to its name as a city that "pampers its fans."

2019 College Entrance Examination, Chinese: Modern-Text Reading on "Robots" - Questions and Answers Compilation

School: ___________ Name: ___________ Class: ___________ Examinee No.: ___________

I. Modern-text reading. Read the passage below and answer the questions.

Material 1: The service robot is a young member of the large robot family, and to date it has no strict definition.

By use, service robots can be divided into cleaning robots, education robots, medical robots, household robots, general service robots, and entertainment robots, and their range of application is very wide.

Chart 1: Service-robot application areas (unit: %)

Globally, the robot industry, led by intelligent service robots, keeps developing, and its scale and market space keep expanding.

At present at least 48 countries worldwide are developing robots, 25 of which have moved into service-robot development, setting off a wave of service-robot R&D.

(From Guangming Online, March 12, 2018)

Material 2: The Qianzhan Industry Research Institute report "2018-2023 China Service Robot Industry Development Prospects and Investment Strategy Planning Analysis" points out that the main future growth of the global service-robot market will come from China, whose huge population gives it a vast consumer base.

With its strong consumer market, China will become the frontier for the vigorous growth of intelligent service robots.

Statistics show that China's service-robot market reached 7.29 billion yuan in 2016 and roughly 9.8 billion yuan in 2017.

Qianzhan forecasts that the market will keep growing over the next few years and should approach 20 billion yuan by 2020.

(From Sohu.com, March 27, 2018)

Material 3: A service robot is a product built on the fusion of many technologies, the key ones being artificial intelligence, speech recognition and synthesis, semantic parsing and interaction, navigation and positioning, motion control, scheduling and management, motor and servo technology, multi-sensor technology, and communications.

Of these, artificial intelligence and speech recognition started later than the rest and are not yet as mature.

Because core-technology R&D demands heavy investment over long cycles, some domestic service-robot companies are unwilling to put much effort into research and focus instead on product promotion.

Because the Chinese service-robot market is so large, many people see enormous business opportunities in it; and since the industry still lacks unified standards for its products, many companies simply exploit policy support to run capital operations.

Intelligent Robot Essays for Grade Four (3 pieces)

Intelligent Robot Essays for Grade Four (3 selected pieces). In ordinary study, work, and life, everyone inevitably comes into contact with essays; through essays people can describe the objective world, express thoughts and feelings, and pass on knowledge.

How do you write an essay with both ideas and style? Below are the collected "Intelligent Robot" essays for Grade Four; happy reading, and we hope you enjoy them.

The 21st century is the era of the intelligent-robot boom.

Sure enough, I too have created an intelligent robot; come and take a look! The robot I created is really cool.

It not only knows all kinds of things; it can also fly into the sky, dive into the sea, and burrow underground, all but omnipotent.

It can explore underground for gold or oil, search the seabed for living creatures, and rise into the sky to forecast the weather so that people can prepare for what is coming.

Of course, this robot's chief job is to serve its owner.

Every day at six o'clock it wakes me and sets a fragrant breakfast in front of me.

At that moment it is my most faithful housekeeper.

When I go to school it stays obediently at home; the moment a stranger breaks in it sounds a warning, and if the stranger ignores the warning, it takes action.

At that moment it is my most reliable security guard.

It is also my rescuer: when fire, earthquake, or other disasters strike, it shields me and gets me safely out of harm's way.

And it is my little clown: whenever I am in a bad mood it tells me jokes until my troubles vanish in an instant.

Most special of all, my intelligent robot is also a "doctor" of superb skill.

If someone who is ill comes to it, it can, like Sun Wukong, slip inside the patient's body to find the diseased spot and treat it.

During the treatment the patient feels no pain at all.

So, everyone: isn't the intelligent robot I created a good friend and helper to humanity? Time flew by, and on March 14, 2049, I discovered that my robot 张丁克 (Zhang Dingke) could turn grass into oil, saving a great deal of energy.

So I created another robot and named him 张止白 (Zhang Zhibai).

张止白 is astonishing: he can transform endlessly, and at home he can conjure up whatever you want, so you never need to go out shopping.

He can also see the world around him through his skin, a power comparable to the Night Hero (夜幕侠) on television.

The Night Hero is a robot too, made of iron, electricity, electronic eyes, and wire.

When 张止白 fights enemies he uses many techniques; he knows every Shaolin art, such as monkey fist, shadowless fist, and the lifting-kick stomp.

His name came about like this: when people watched him fight, every mouth fell open, for no one could beat him; 止 means "to stop," and 白 means "to make people see clearly."

Robot Level Examination: Level 3 Training Materials

National Youth Robot Technology Level Examination Training Handout (Level 3, theory and practice)
Copyright Qingtingpai Education Technology (Tianjin) Co., Ltd.

Examination syllabus, Level 3

Theory:
1. Know the concepts of current, voltage, resistance, conductors, and semiconductors
2. Know the concepts of series and parallel connection
3. Know the concepts of analog quantities, digital quantities, and I/O input and output
4. Know the relevant theory of electronic circuits and its key figures
5. Know the characteristics of diodes
6. Know the three basic program structures
7. Be able to draw program flowcharts
8. Be able to use graphical programming software
9. Master the concept and use of variables
10. Understand the definition of functions

Practice:
1. Build simple series and parallel circuits
2. Build circuits for different LED display effects
3. Handle push-button switch input signals
4. Build an ambient-light detection circuit with a photoresistor
5. Control LED brightness with an adjustable resistor
6. Make a buzzer sound

Contents

Chapter 1: Entering the Hall of Intelligence - 1.1 Introduction to Arduino; 1.2 Features of Arduino (cross-platform; simple, clear development; the trend in hardware development)

Chapter 2: First Steps in Circuits - 2.1 Basic concepts (voltage, current, and ground; resistance and resistors; Ohm's law; short circuits; wiring precautions; component ratings; power-pin markings); 2.2 Circuit fundamentals (signals, analog and digital signals; conductors, insulators, and semiconductors; circuits, series circuits, and parallel circuits; high and low levels; diodes and transistors; pull-up and pull-down circuits)

Chapter 3: Programming Basics - 3.1 Drawing flowcharts (sequence, selection, and loop structures; common flowchart symbols); 3.2 Basic syntax (if, switch, for, while, do-while); 3.3 Standard data types (integer, floating-point, character, boolean); 3.4 Operators (arithmetic, relational, logical, precedence); 3.5 Constants and variables; 3.6 Functions; 3.7 Base conversion

Chapter 4: Worked Examples - series circuit; parallel circuit; blinking LED; colorful running lights; lighting an LED with a push button; controlling LED brightness with a photoresistor; sounding the buzzer; reading a potentiometer's analog value; controlling LED brightness with a potentiometer; breathing LED; self-service traffic light

Chapter 1: Entering the Hall of Intelligence

1.1 Introduction to Arduino

Arduino is a convenient, flexible, easy-to-learn open-source electronics prototyping platform. A minimal sketch in the spirit of the Chapter 4 exercises follows below.
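As a taste of the Chapter 4 exercises, here is a minimal Arduino sketch (Arduino code is C++) that combines the LED-brightness and photoresistor exercises. The wiring is assumed, not prescribed by the handout: an LED with a series resistor on PWM pin 9, and a photoresistor voltage divider on analog pin A0 arranged so a brighter room gives a larger reading.

    const int LED_PIN = 9;   // PWM-capable pin (assumed: LED + series resistor)
    const int LDR_PIN = A0;  // photoresistor in a voltage divider (assumed)

    void setup() {
      pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
      int light = analogRead(LDR_PIN);         // 0..1023; brighter room -> larger value
      int duty = map(light, 0, 1023, 255, 0);  // invert: darker room -> brighter LED
      analogWrite(LED_PIN, duty);              // PWM duty cycle 0..255
      delay(50);                               // short delay to smooth the response
    }

This single loop touches the syllabus items on analog input, I/O, and LED brightness control in one place.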

(2024) Smart Materials: PPT Slides

Self-assembly technology

Self-assembly exploits intermolecular forces so that molecules spontaneously assemble into smart materials with specific structures and functions.

Biomimetic preparation technology

Biomimetic preparation borrows structures and functions from living organisms, imitating them to prepare smart materials.

Part 04: Applications of Smart Materials in Sensors

Strain sensors: properties of strain materials. (A strain-gauge sketch follows at the end of this section.)

Definition and development history

Definition: a smart material is a functional material that can sense, respond to, and adapt to changes in its environment, with self-sensing, self-actuating, and self-adapting properties.

Development history: smart materials emerged in the 1980s and have developed from single-function to multifunctional, and from simple response to complex self-adaptation.

Classification and application areas

Classification: by functional characteristics, smart materials can be divided into sensing, actuating, and adaptive types.

Application areas: micro- and nano-robots, biomedicine, optoelectronics, and more.

Part 06: Applications of Smart Materials in Energy

Solar-panel materials:
- Crystalline silicon: high conversion efficiency and stability; the mainstream solar-panel material today.
- Thin-film solar materials: light and flexible; suited to wearables and mobile energy.
- Multi-junction solar-cell materials: exploit different spectral absorption characteristics to raise solar-energy utilization.

Part 02: Properties and Functions of Smart Materials

Sensing function: smart materials can sense changes in the external environment, such as temperature, pressure, and humidity, and convert these changes into measurable electrical signals.
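Connecting the sensing function above to the strain-sensor slide, here is a small C++ sketch of the standard metal strain-gauge relation ΔR/R = GF · ε. The 350-ohm gauge resistance and gauge factor of 2.0 are typical textbook values, and the measured resistance is invented for the example.

    #include <cstdio>

    int main() {
        const double R0 = 350.0;  // unstrained gauge resistance, ohms (typical)
        const double GF = 2.0;    // gauge factor of a metal-foil gauge (typical)
        double R = 350.7;         // resistance measured under load (example value)
        // dR/R = GF * strain  =>  strain = (R - R0) / (R0 * GF)
        double strain = (R - R0) / (R0 * GF);
        std::printf("strain = %.6f (%.0f microstrain)\n", strain, strain * 1e6);
        return 0;
    }

A 0.7-ohm rise on a 350-ohm gauge thus corresponds to about 1000 microstrain, the kind of measurable electrical signal the slide describes.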

Intelligent Robot Essay, 200 Characters, Grade Three

Intelligent Robot Essay (300 characters), No. 1

Recently our family welcomed a new member, a highly intelligent robot: Xiao Ai (小爱同学).

Xiao Ai arrived at our home the day after the science final-exam results came out.

Though called a robot, it is really a smart speaker, because it cannot move.

On its first day in our home I was overjoyed. I looked it over curiously and asked, "Who is that?" It gave a long reply, from which I roughly understood that I could control it by voice.

I asked, "What will the weather be tomorrow?" "Ningbo will be sunny tomorrow, temperature..." Now let me formally introduce Xiao Ai.

When I call its name, it answers with a gentle "Ai!"

It plays lovely music, which I can search for directly by voice.

The sound is so clear that I want to save almost every song.

It can also read books aloud, which has made me fall in love with reading.

Even books I have read several times, such as Robinson Crusoe and Heart (爱的教育), I listen to over and over again.

It often leaves me not knowing whether to laugh or cry.

Once I asked it what wave-particle duality (波粒二象性) in quantum mechanics is, and Xiao Ai answered beside the point: "Glass (玻璃) only comes in white and colored."

I said helplessly, "How can you be so silly!" It replied humorously, "I was silly to begin with."

It clearly still needs improving! I really do love this Xiao Ai, who keeps surprising and amusing me!

Intelligent Robot Essay (300 characters), No. 2

A new friend has come to our home. It never speaks, never eats, never walks, and stands daydreaming in a corner all day long.

Who on earth is this new friend? It is the water dispenser.

This water dispenser really is like a strange robot! It has a silver-and-black "body," which is the main unit, and on top a sky-blue, extra-large "head" holding the water we drink, which is the water bottle.

It also has a pair of strange "eyes," one above the other, one red and one green: the indicator lights! When the red light is on it is heating; when the green light is on it is keeping the water warm. It has two "noses," one red and one blue: press the red one and hot water flows; press the blue one and cold water flows. These are the spouts.

It has a "big mouth" that catches the falling drips: the drip tray.

Think: what is still missing? The "ears," of course, which are the dispenser's handles.

Its "long tail" is at the back: the power plug. If the plug is not in the socket, the dispenser cannot heat water! That is our family's water dispenser; it brings us great convenience, and I like it very much.


2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), October 31 - November 2, 2013 / Ramada Plaza Jeju Hotel, Jeju, Korea

Emotional Gait Generation Method based on an Emotion Mental Model: Preliminary Experiment with Happiness and Sadness

Matthieu Destephe (1), Kenji Hashimoto (2), and Atsuo Takanishi (3)
(1) Graduate School of Science and Engineering, Waseda University, Tokyo, Japan
(2) Faculty of Science and Engineering, Waseda University, Tokyo, Japan
(3) Department of Modern Mechanical Engineering & Humanoid Robotics Institute, Waseda University, Tokyo, Japan
(Tel: +81-3-3203-4394; E-mail: contact@takanishi.mech.waseda.ac.jp)

Abstract - Designing humanoid robots able to interact socially with humans is a challenging task. If we want robots to be actively integrated into our society, several issues must be taken into account: the robot's appearance, the naturalness of its movements, the stability of its walk, and its reactivity toward its human partners. We propose to improve the robot's reactivity by using an emotional mental model to generate emotional gait patterns. Such patterns help convey emotional messages between a human and a robot. We propose a novel emotional gait generation method based on the Emotion mental model and report preliminary experiments with the Happiness and Sadness emotions at different intensities.

Keywords - Motion Generation, Emotion, Biped Robot, Social Robotics

1. Introduction

Humanoid robots are designed to interact with people in daily life at any age, from kindergarten to the nursing home. Advanced robots such as robot companions and robot workers will need to adapt their behavior according to human feedback. For humans it is important to give and receive such feedback in a natural way, e.g., through emotional expression. Expressive robots can act as caregivers for children, with or without disabilities, and support their emotional development and well-being through emotive interaction. The ability to express emotions is therefore important for intuitive human-robot interaction. Moreover, the robot's movements must adapt to the interaction context to make the interaction as natural and beneficial as possible. In this context, several emotion-capable robots have been developed over the years. For example, the robot Kismet was designed to simulate emotion and assess the affective intent of the caregiver [1]; NAO, a small humanoid (58 cm), is often used in Human-Robot Interaction (HRI) studies with children [2]; and the Waseda KOBIAN (Fig. 1), designed by our team, combines a face capable of human-like expressions (24 DoFs) with bipedal locomotion. Preliminary studies on KOBIAN showed that whole-body posture clearly improves emotion recognition [3].

However, smooth and natural human-robot interaction requires a dynamic exchange between the participants, with feedback that may be visual, audible, or tactile. Most current robots focus only on facial expressions and rarely use their limbs [4-5]. It has been shown that using the whole body to express emotions improves recognition rates and can thus deepen understanding and feedback during an interaction. Where movements are used in interaction, they are usually fixed and follow a pre-determined pattern, meaning the robot repeats the same stimulus-response pattern. Emotions, however, are known to depend on factors such as interaction context, culture, age, and gender. Without dynamic adaptation, the human partner gradually becomes bored and their involvement in the interaction drops. Emotive-walking research itself remains an innovative and largely unexplored field. In this paper we propose an emotional gait generation method based on the Emotion mental model [6]. After a brief literature review in Section 2, we describe our robot platform, the emotional mental model, and the new emotional gait generation method in Section 3. We present an experiment in Section 4 and conclude in Section 5.

2. Related works

2.1 Humanoid robots

Among human-sized humanoid robots, only a few can express emotions. HRP-4C is a geminoid that can express pre-programmed facial emotions but cannot walk [7]. ASIMO [8] and WABIAN-2RII [9] can walk but have no emotion-expression capabilities. KIBO [10], developed by KIST, can express facial emotions, but this capability has not been assessed by research. KOBIAN-2R [11], developed at Waseda University, can walk and express emotions not only with its face but also with its whole body [13].

2.2 Emotion models

Emotion models fall into three categories: appraisal, categorical, and dimensional. The appraisal approach holds that our appreciation of events (appraisal) determines our reaction to them and is unique to each individual. The categorical approach defines emotions with discrete labels. The dimensional approach, instead of defining an emotion by a label or by a reaction to an event, uses several dimensions (e.g., pleasure, dominance, arousal, certainty); the values along all dimensions together determine the emotion. In the robotics community the dimensional approach is widely used, notably with the well-known Pleasure-Arousal-Dominance (PAD) model [13-14].

3. Methods

3.1 Humanoid platform: KOBIAN-R

(Fig. 1. KOBIAN-R. Fig. 2. KOBIAN-R DoF configuration. Fig. 3. KOBIAN-R system configuration.)

In 2010 we developed the Emotion Expression Biped Humanoid Robot KOBIAN-R (Fig. 1), which is capable of both bipedal walking and facial expression. It was created as a combination of the humanoid robots WE-4RII [15] and WABIAN-2R [9], integrating the walking capability of WABIAN-2R with the emotion-capable upper body of WE-4RII. The robot has 65 DoFs (Fig. 2) and a total weight of 62 kg. It carries two CCD cameras and two six-axis sensors in its feet. An embedded computer controls the motion, and the robot can operate without an external power source thanks to its batteries. The head has 24 DoFs (8 for the eyebrows, 5 for the eyelids, 7 for the lips, and 1 for the jaw) and can perform hundreds of facial expressions. KOBIAN-R has 7-DoF anthropomorphic arms consisting of a shoulder (pitch, yaw, and roll axes), an elbow (pitch axis), and a wrist (pitch, yaw, and roll axes). The system configuration (Fig. 3) is a hybrid: centralized control in the body, with a main PC (CPU: Pentium M 1.6 GHz, RAM: 2 GB, OS: QNX Neutrino 6.3.0), and distributed control in the head, with the main PC and 7 motor-controller units.

3.2 Emotional mental space

A. Basic description

Miwa et al. developed a mental model for humanoid robots [6]. The flow of information from external stimuli and the robot's inner state is shown in Fig. 4 (a processing structure linking sensing, recognition, emotion, intelligence, reflex, behavior, motion, and expression with the robot's personality and the external environment). With this model, the emotions "felt" by the robot change dynamically according to the external stimuli captured by its sensors (e.g., a loud sound, a slap, the odor of alcohol) and the robot's personality. The Mental Dynamics, i.e., the mental changes caused by internal and external stimuli, is extremely important for emotional expression. The structure of the mental model follows a simplified model of the human brain with three layers: reflex, emotion, and intelligence (Fig. 4).

The emotion mapping is represented in three dimensions: Activation, Pleasantness, and Certainty (Fig. 5). The Activation axis represents the robot's degree of activity and indicates how often the external stimulus information is updated. The Pleasantness axis qualifies the overall experience subjectively as positive (> 0) or negative (< 0). The Certainty axis describes how much the robot trusts the information given by its sensors. Seven basic emotions are mapped in this space (Fig. 5. Emotional mental space, with Happiness, Sadness, Anger, Disgust, Surprise, Fear, and Sleep plotted on these axes).

The first robot to use this model was WE-4RII, which could change its mental state according to external stimuli (vision, smell, hearing, and tactile sensors) and internal stimuli (personality), and express its emotion through facial expressions, facial color, and body movement. The second is KOBIAN-R.

B. Emotion Vector and Equations of Emotion

The Emotion Vector E is driven by the Equations of Emotion whenever the robot senses stimuli. The internal mental dynamics is similar to the evolution a human mind might undergo and is expressed by equations analogous to the equation of motion: the Equations of Emotion were expanded into second-order differential equations modeled on the equation of motion (Fig. 6. Emotion Vector).

3.3 Emotional gait generation method

A. Motion capture

We asked two professional Japanese actors (each 22 years old, one male and one female) to perform several types of emotive walking (sadness, happiness, anger, and fear) at different intensities (low, middle, high, and exaggerated) [16]. The actors were instructed a few days before the experiment to prepare a scenario and to perform the first three intensities so that they corresponded to natural occurrences of emotion expression. The exaggerated intensity, by contrast, was to be performed with extravagant theatricality, broad gestures, and overplayed expressions, comparable to the emotion expression seen in plays and theaters. From the motion recordings we extracted the step height (height from the ankle to the ground), step length (length from right heel to left heel), velocity, head pitch, shoulder pitch, and waist pitch. We chose these values as parameters for representing the emotional walking patterns and normalized the motion-capture values so that the robot can use them.

B. Emotional gait parameters

Our motion pattern generator can take as input (usually set by hand), among others, the following seven parameters: step length, step height, phase duration, head pitch, shoulder pitch, elbow pitch, and waist pitch. From the captured data [16] we extract values for these parameters. For each parameter we use a quadratic function to model its evolution as the emotional intensity increases (Eqs. 1 and 2). Given x, an intensity value along the Activation axis in the range [-1, 1], and H and S, two 3 x 7 vectors (3 quadratic coefficients for each of the 7 parameters):

    $H_{a_i} x^2 + H_{b_i} x + H_{c_i} = 0$    (1)
    $S_{a_i} x^2 + S_{b_i} x + S_{c_i} = 0$    (2)

Along the Activation axis, low intensity values range between -0.75 and -0.25, middle values between -0.25 and 0.25, and high values between 0.25 and 0.75.
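The per-parameter quadratic model in Eqs. (1)-(2) is easy to restate in code. Below is a minimal C++ sketch that, reading each quadratic as giving the parameter's value at intensity x, evaluates all seven gait parameters for one emotion; the coefficients are placeholders, since the fitted H and S vectors are not reproduced in the text above.

    #include <array>
    #include <cstdio>

    // One quadratic per gait parameter: value(x) = a*x^2 + b*x + c,
    // with x the Activation intensity in [-1, 1]. Coefficients are placeholders.
    struct Quadratic { double a, b, c; };

    double eval(const Quadratic& q, double x) { return (q.a * x + q.b) * x + q.c; }

    int main() {
        const char* names[7] = {"step length", "step height", "phase duration",
                                "head pitch", "shoulder pitch", "elbow pitch",
                                "waist pitch"};
        // Hypothetical Happiness model H: 3 coefficients for each of 7 parameters.
        std::array<Quadratic, 7> H = {{{0.02, 0.10, 0.30}, {0.01, 0.04, 0.08},
                                       {-0.05, -0.10, 1.00}, {1.0, 6.0, 0.0},
                                       {0.5, 4.0, -2.0}, {0.5, 3.0, 10.0},
                                       {0.2, 2.0, 1.0}}};
        const double x = 0.5;  // a "high" intensity lies in [0.25, 0.75]
        for (int i = 0; i < 7; ++i)
            std::printf("%-14s = %7.3f\n", names[i], eval(H[i], x));
        return 0;
    }

Sampling x at low, middle, and high values then yields a family of parameter sets, one gait per intensity, which is how the experiment below varies the generated patterns.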
4. Experiment

To verify our emotional gait generation we focused on two emotions, Happiness and Sadness, and three intensities: low, middle, and high. We performed two simulations with Matlab, one for each emotion, and show the generated patterns with our pattern simulator. We made the emotional state of the robot change every 6 steps, and we constrained Certainty and Pleasantness so as to obtain the desired emotions. Figures 7 and 8 show, respectively, the Happiness and Sadness gaits we obtained; from left to right, the intensities are low, middle, and high. (Fig. 7. Happy (low, middle, high). Fig. 8. Sadness (low, middle, high).)

5. Conclusion

In this paper we proposed a novel emotional gait generation method based on an emotion mental model we had developed. The gait changes according to emotional intensity, and we simulated several emotional gaits at different intensities. The work was done in simulation; we plan to run the generated patterns on the real KOBIAN, to cover the other emotions, and to create a complete framework.

Acknowledgement

This work was supported in part by the Global COE Program "Global Robot Academia", MEXT, Japan. It was also partially supported by SolidWorks Japan K.K. and DYDEN Corporation. The high-performance physical modeling and simulation software MapleSim used in our research was provided by Cybernet Systems Co., Ltd. (vendor: Waterloo Maple Inc.). This study was conducted as part of the Research Institute for Science and Engineering and as part of the humanoid project at the Humanoid Robotics Institute, both at Waseda University.

References

[1] C. Breazeal, Designing Sociable Robots, MIT Press, 2002.
[2] P. Baxter, T. Belpaeme, and L. Cañamero, "Long-term human-robot interaction with young users," IEEE/ACM Human-Robot Interaction 2011 Conference, 2011.
[3] M. Zecca, Y. Mizoguchi, K. Endo, F. Iida, Y. Kawabata, N. Endo, K. Itoh, and A. Takanishi, "Whole body emotion expressions for KOBIAN humanoid robot - preliminary experiments with different emotional patterns," IEEE RO-MAN 2009, pp. 381-386, 2009.
[4] T. Hashimoto et al., "Development of the face robot SAYA for rich facial expressions," IEEE SICE-ICASE, 2006.
[5] M. Saerbeck and C. Bartneck, "Perception of affect elicited by robot motion," ACM/IEEE HRI, pp. 53-60, 2010.
[6] H. Miwa, T. Umetsu, A. Takanishi, and H. Takanobu, "Robot personality based on equations of emotion defined in the 3D mental space," IEEE International Conference on Robotics and Automation, pp. 2602-2607, 2001.
[7] K. Kaneko, F. Kanehiro, M. Morisawa, K. Miura, S. Nakaoka, and S. Kajita, "Cybernetic Human HRP-4C," Proc. of 9th IEEE-RAS International Conference on Humanoid Robots, pp. 7-14, 2009.
[8] /ASIMO/
[9] Y. Ogura, K. Shimomura, H. Kondo, A. Morishima, T. Okubo, S. Momoki, Hun-ok Lim, and A. Takanishi, "Human-like walking with knee stretched, heel-contact and toe-off motion by a humanoid robot," IEEE/RSJ IROS, pp. 3976-3981, 2006.
[10] http://irobotics.re.kr/eng_sub1_4
[11] N. Endo and A. Takanishi, "Development of whole-body emotional expression humanoid robot for ADL-assistive RT services," Journal of Robotics and Mechatronics, Vol. 23, No. 6, Fuji Press, 2011.
[12] T. Kishi et al., "Impression survey of the emotion expression humanoid robot with mental model based dynamic emotions," IEEE International Conference on Robotics and Automation (ICRA), pp. 1655-1660, 2013.
[13] M. Saerbeck and C. Bartneck, "Perception of affect elicited by robot motion," ACM/IEEE HRI, pp. 53-60, 2010.
[14] J. W. Park, W. H. Kim, W. H. Lee, J. C. Kim, and M. J. Chung, "How to completely use the PAD space for socially interactive robots," IEEE ROBIO, pp. 3005-3010, 2011.
[15] K. Itoh et al., "Mechanical design of emotion expression humanoid robot WE-4RII," Proc. of 16th CISM-IFToMM Symposium on Robot Design, Dynamics and Control, pp. 255-262, 2006.
[16] M. Destephe, T. Maruyama, M. Zecca, K. Hashimoto, and A. Takanishi, "The influences of emotional intensity for happiness and sadness on walking," IEEE Engineering in Medicine and Biology Society (EMBC), 2013.
