219-7677
10
7500817
John Benjamins Publishing Company
Marketing Department / Karin Plijnaar, Pieter Lamers
onix@benjamins.nl
201611101209
ONIX title feed
eng
01
EUR
3007632
03
01
01
JB
John Benjamins Publishing Company
01
JB code
NLP 8 Eb
15
9789027288400
06
10.1075/nlp.8
13
2009048316
DG
002
02
01
NLP
02
1567-8202
Natural Language Processing
8
01
Close Engagements with Artificial Companions
Key social, psychological, ethical and design issues
01
nlp.8
01
https://benjamins.com
02
https://benjamins.com/catalog/nlp.8
1
B01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
University of Oxford
01
eng
340
xxii
315
COM004000
v.2006
UYZ
2
24
JB Subject Scheme
CONS.GEN
Consciousness research
24
JB Subject Scheme
LIN.COMPUT
Computational & corpus linguistics
24
JB Subject Scheme
PHIL.GEN
Philosophy
06
01
What will it be like to admit Artificial Companions into our society? How will they change our relations with each other? How important will they be in the emotional and practical lives of their owners – since we know that people became emotionally dependent even on simple devices like the Tamagotchi? How much social life might they have in contacting each other? The contributors to this book discuss the possibility and desirability of some form of long-term computer Companion, whose arrival in some form seems a certainty in the coming years. It is a good moment to consider, from a set of wide interdisciplinary perspectives, both how we shall construct them technically and what their personal, philosophical and social consequences will be. By Companions we mean conversationalists or confidants – not robots – but rather computer software agents whose function will be to get to know their owners over a long period. Those owners may well be elderly or lonely, and the contributions in the book focus not only on assistance via the internet (contacts, travel, doctors etc.) but also on providing company and Companionship, by offering aspects of real personalization.
04
09
01
https://benjamins.com/covers/475/nlp.8.png
04
03
01
https://benjamins.com/covers/475_jpg/9789027249944.jpg
04
03
01
https://benjamins.com/covers/475_tif/9789027249944.tif
06
09
01
https://benjamins.com/covers/1200_front/nlp.8.hb.png
07
09
01
https://benjamins.com/covers/125/nlp.8.png
25
09
01
https://benjamins.com/covers/1200_back/nlp.8.hb.png
27
09
01
https://benjamins.com/covers/3d_web/nlp.8.hb.png
10
01
JB code
nlp.8.00for
xi
xii
2
Miscellaneous
1
01
Foreword
10
01
JB code
nlp.8.02ack
xii
1
Miscellaneous
2
01
Acknowledgements
10
01
JB code
nlp.8.01contr
xiii
xxii
10
Miscellaneous
3
01
Contributors
10
01
JB code
nlp.8.02s1
Section header
4
01
<atl>Section I. Setting the scene
10
01
JB code
nlp.8.03tur
3
10
8
Article
5
01
In good company?
On the threshold of robotic Companions
1
A01
Sherry Turkle
Turkle, Sherry
Sherry
Turkle
01
Most contributors to this volume believe that only technical matters stand between where we are now and a time when robots will be our Companions and teachers. Robots need to expand their domains of understanding, and if those domains should be emotional, well, that will be a technical matter as well. So while this volume expresses designers’ enthusiasm about robots as technical objects, it challenges us to see robots as something more, as evocative objects. What are we thinking about when we are thinking about robots? We are thinking about aliveness and authenticity, love and spirituality. We are thinking about what it means to build a psychology. We are thinking about what makes people special. Or perhaps that they are not so special after all.
10
01
JB code
nlp.8.04wil
11
20
10
Article
6
01
Introducing artificial Companions
1
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
This introductory chapter, like the whole book itself, concerns a range of closely related topics: the possibility of machines having identifiable personalities, the possible future legal responsibilities of such companionable machines, and the design characteristics of such machines, including their technical implementation and the kinds of computational theory they will need to embody. As will become clear, I wish to explore these topics in terms of software entities, rather than robots, and in particular the sort of software agents now being encountered on the web, ranging at present from technical advisers to mere chatbots. I shall call them Companions. This introduction explores the following related aspects of a Companion in more detail: what kind and level of personality should be in a machine agent so as to be acceptable to a human user, more particularly to one who may fear technology and have no experience of it; and what levels of responsibility and legal attribution for responsibility can we expect from entities like complex web agents in the near future?
10
01
JB code
nlp.8.05s2
Section header
7
01
<atl>Section II. Ethical and philosophical issues
10
01
JB code
nlp.8.06flo
23
28
6
Article
8
01
Artificial Companions and their philosophical challenges
1
A01
Luciano Floridi
Floridi, Luciano
Luciano
Floridi
01
The technology for Artificial Companions is already largely available, and the question is when rather than whether ACs will become commodities (Benyon and Mival, 2007). The difficulties are still formidable, but they are not insurmountable. On the contrary, they seem rather well understood, and the path from theoretical problems to technical solutions looks steep but climbable. In the following pages, I wish to concentrate not on the technological challenges, which are important, but on some philosophical issues that a growing population of ACs will make increasingly pressing.
10
01
JB code
nlp.8.07pul
29
34
6
Article
9
01
Conditions for Companionhood
1
A01
Stephen G. Pulman
Pulman, Stephen G.
Stephen G.
Pulman
01
This chapter is an attempt to outline a set of conditions that are jointly necessary and sufficient for any entity, virtual or real, to be regarded as displaying the properties characteristic of our ordinary everyday understanding of the notion of a Companion.
10
01
JB code
nlp.8.08oha
35
56
22
Article
10
01
Arius in cyberspace
Digital Companions and the limits of the person
1
A01
Kieron O'Hara
O'Hara, Kieron
Kieron
O'Hara
01
The relationship between a Companion and a person will become increasingly problematic as Companion technology improves and as models of users become increasingly sophisticated, and the simple dichotomies that make the Turing Test so plausible as a means of determining intelligence will become harder to maintain. As with any kind of content-storing technology, such as writing, or in more recent years laptops, the amount and quality of cognition that a human can ‘export’ to these outside technologies is significant. The ‘person’ or ‘agent’ can be seen as an extended system including the technologies as well as the human, in which the technologies, among other things, can help in the extension of trust towards the human.
10
01
JB code
nlp.8.09s3
Section header
11
01
Section III. Social and psychological issues
What should a Companion be like?
10
01
JB code
nlp.8.10bod
59
61
3
Article
12
01
Conversationalists and confidants
1
A01
Margaret A. Boden
Boden, Margaret A.
Margaret A.
Boden
01
If there were an abstract for this very short paper, it would be this: “Conversationalists, maybe – but confidants?”
10
01
JB code
nlp.8.11bry
63
74
12
Article
13
01
Robots should be slaves
1
A01
Joanna J. Bryson
Bryson, Joanna J.
Joanna J.
Bryson
01
Robots should not be described as persons, nor given legal or moral responsibility for their actions. Robots are fully owned by us. We determine their goals and behavior, either directly or indirectly through specifying their intelligence or how their intelligence is acquired. In humanising them, we not only further dehumanise real people, but also encourage poor human decision making in the allocation of resources and responsibility. This is true at both the individual and the institutional level. This chapter describes both causes and consequences of these errors, including consequences already present in society. I make specific proposals for best incorporating robots into our society. The potential of robotics should be understood as the potential to extend our own abilities and to address our own goals.
10
01
JB code
nlp.8.12eva
75
88
14
Article
14
01
Wanting the impossible
The dilemma at the heart of intimate human-robot relationships
1
A01
Dylan Evans
Evans, Dylan
Dylan
Evans
01
In a recent book entitled Love and Sex with Robots, the British scholar David Levy has argued that relationships with robot Companions might be more satisfying than relationships with humans, a claim which I call “the greater satisfaction thesis” (GST). The main reason Levy provides in support of GST is that people will be able to specify the features of robot Companions precisely in accordance with their wishes (which I call the total specification argument, or TSA). In this paper, I argue that TSA is wrong. In particular, the argument breaks down when we consider certain behavioral characteristics that we desire in our partners. I illustrate my argument with a thought-experiment involving two kinds of robot – the FREEBOT, which is capable of rejecting its owner permanently, and the RELIABOT, which is not.
10
01
JB code
nlp.8.13lev
89
94
6
Article
15
01
Falling in love with a Companion
1
A01
David Levy
Levy, David
David
Levy
01
In 1984, in her groundbreaking book The Second Self, Sherry Turkle made us aware of the tendency of some people to develop relationships with their computers. Turkle described one such example, an MIT computer hacker whom she called Anthony, who had “tried out” having girlfriends but preferred to relate to computers. I believe that the developments in AI since then have demonstrated a progression in human-computer relationships to the point where we can now say with confidence that, in the foreseeable future, significant numbers of Anthonys and their female counterparts will be falling in love with software Companions. This position paper summarizes my arguments.
10
01
JB code
nlp.8.14low
95
100
6
Article
16
01
Identifying your accompanist
1
A01
Will Lowe
Lowe, Will
Will
Lowe
01
What Companions are or should be will depend on what we want them to do for us. And this depends on what they can do for us. In the end, I shall argue that whether Companions should best be thought of, programmed and regulated as others, or as extensions of ourselves, is a tactical question. One path will simply be more effective than the other. Either way, it’s going to be all about us. To make a start, let us switch to the second person: What can a Companion do for you?
10
01
JB code
nlp.8.15rom
101
106
6
Article
17
01
Look, emotion, language and behavior in a believable virtual Companion
1
A01
Daniela M. Romano
Romano, Daniela M.
Daniela M.
Romano
01
A good friend is one who makes you laugh, shares your deepest emotions and is there to listen and help you when you need it. Can we have a synthetic Companion with these qualities? He/she (we assume it has the status of a person at this stage) should be able to understand and express emotions, to talk and listen, and to know who you are and what you feel and like. The Companion considered in this chapter is a synthetic creature living in a virtual world, able to display believable human-like qualities. The concept of believability within virtual environments is also discussed here, together with some of the problems connected with the creation of such a Companion.
10
01
JB code
nlp.8.16tay
107
120
14
Article
18
01
New Companions
1
A01
Alex Taylor
Taylor, Alex
Alex
Taylor
2
A01
Anab Jain
Jain, Anab
Anab
Jain
3
A01
Laurel Swan
Swan, Laurel
Laurel
Swan
01
This chapter draws on advancements in Artificial Intelligence (AI) and robotics to question the orthodoxy of artificial Companions research. Two areas of work are outlined and used to suggest artificial Companions need not be restricted to simulacrums of humans or animals. First, it is argued the developments in AI have given rise to the prospect of very different kinds of machines, machines that are unlike humans or animals but that we may still want to form relationships with. Second, details are presented of a project exploring energy autonomous robots. Ecobot, an example of such a robot, is shown to exhibit unique characteristics that may afford new, distinctive forms of Companionship. Finally, a design concept of an autonomously powered, household radio is presented to illustrate how these new kinds of relationships might be investigated further.
10
01
JB code
nlp.8.17wil
121
128
8
Article
19
01
On being a Victorian Companion
1
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
I have argued or suggested: – English Common Law already has, in dogs, a legal category of entities that are not human but are in some degree responsible for their actions and have “characters” that can be assessed. – Users may not want Companions prone to immediately expressed emotions, and a restrained personality, like that of a Victorian Lady’s Companion, might provide a better model. – Language behavior is a complex repository of triggers for emotion, both expressed and causal, and this is often under-rated in the world of ECAs and theories of emotion based on them. – Companion-to-Companion communications will be important and helpful to a user, and there is nothing in principle to make one believe that “secrets” cannot be handled sensitively in such an environment. – It is easy to underestimate the role of a user’s preference in selecting the personality appropriate to a Companion: it is not even clear that users want Companions to be polite or agreeable – it may depend on personal choice or their functional role. – For many it may be appropriate for a Companion to become progressively more like its owner in voice, face, personality, memories etc. – exaggerating the way dogs are believed to adapt to owners – and if and when this becomes possible, for the Companion to become a self-avatar of its owner, there may well be other unseen consequences after the owner’s death.
10
01
JB code
nlp.8.18s4
Section header
20
01
Section IV. Design issues
Building a Companion
10
01
JB code
nlp.8.19bee
131
142
12
Article
21
01
The use of affective and attentive cues in an empathic computer-based Companion
The
use of affective and attentive cues in an empathic computer-based Companion
1
A01
Nikolaus Bee
Bee, Nikolaus
Nikolaus
Bee
2
A01
Elisabeth Andre
Andre, Elisabeth
Elisabeth
Andre
3
A01
Thurid Vogt
Vogt, Thurid
Thurid
Vogt
4
A01
Patrick Gebhard
Gebhard, Patrick
Patrick
Gebhard
01
Recently, a number of research projects have been started to create virtual agents that do not just serve as assistants to which tasks may be delegated, but that may even take on the role of a Companion. Such agents require a great deal of social intelligence, such as the ability to detect the user’s affective state and to respond to it in an empathic manner. The objective of our work is to create an empathetic listener that is capable of reacting to affective and attentive input cues from the user. In particular, we discuss various forms of empathy and how they may be realized on the basis of these cues.
10
01
JB code
nlp.8.20bev
143
156
14
Article
22
01
GRETA
Towards an interactive conversational virtual Companion
1
A01
Elisabetta Bevacqua
Bevacqua, Elisabetta
Elisabetta
Bevacqua
2
A01
Ken Prepin
Prepin, Ken
Ken
Prepin
3
A01
Radoslaw Niewiadomski
Niewiadomski, Radoslaw
Radoslaw
Niewiadomski
4
A01
Etienne de Sevin
Sevin, Etienne de
Etienne de
Sevin
5
A01
Catherine Pelachaud
Pelachaud, Catherine
Catherine
Pelachaud
01
In this chapter we present our work toward building a conversational Companion. Conversing with partner(s) means being able to express one’s mental and emotional state, and to be a speaker or a listener. One also needs to adapt to one’s partner’s reactions to what one is saying. We have developed an interactive ECA platform, Greta (Pelachaud, 2005). It is a 3D virtual agent capable of communicating expressive verbal and nonverbal behaviors as well as listening. It can use its gaze, facial expressions and gestures to convey a meaning, an attitude or an emotion. Multimodal behaviors are tightly tied to each other. A synchronization scheme has been elaborated allowing the agent to display a raised eyebrow or a beat gesture on a given word. According to its emotional or mental state, the agent may vary the quality of its behaviors: it may use a more or less extended gesture, and the arms can move at different speeds and with different accelerations (Mancini & Pelachaud, 2008). The agent can also display listener behavior (Bevacqua et al., 2008). It interacts actively with users and/or other agents, providing appropriately timed backchannels. Interaction also means that the interactants ought to adapt to each other’s behaviors, and the dynamic coupling between them needs to be considered (Prepin & Revel, 2007).
10
01
JB code
nlp.8.21cat
157
168
12
Article
23
01
A world-hybrid approach to a conversational Companion for reminiscing about images
A
world-hybrid approach to a conversational Companion for reminiscing about images
1
A01
Roberta Catizone
Catizone, Roberta
Roberta
Catizone
2
A01
Simon F. Worgan
Worgan, Simon F.
Simon F.
Worgan
3
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
4
A01
Alexiei Dingli
Dingli, Alexiei
Alexiei
Dingli
5
A01
Weiwei Cheng
Cheng, Weiwei
Weiwei
Cheng
10
01
JB code
nlp.8.22cow
169
172
4
Article
24
01
Companionship is an emotional business
1
A01
Roddy Cowie
Cowie, Roddy
Roddy
Cowie
01
This is written from the perspective of someone who was trained as a psychologist, and has been working for a decade on emotion-oriented/affective computing. That background highlights two kinds of issue: how emotion enters into the Companion scenario, and how computing can relate to emotion. In both areas, there is a difference between the intuitions of people who are not deeply involved, and the realities as they appear to people working in the area. The goal of this paper is to consider how the realities of emotion and emotion-oriented technology impact on the prospects for artificial Companions. The concern behind it is that otherwise, we may misjudge both the prospects and the risks. In particular, the ability to address the emotional side of Companionship may play a key part in acceptance; and the necessary resources, conceptual as well as technical, cannot be taken for granted. We should be concerned about inserting Companions into emotionally sensitive roles without engineering them to take that into account.
10
01
JB code
nlp.8.23new
173
178
6
Article
25
01
Artificial Companions in society
Consulting the users
1
A01
Alan Newell
Newell, Alan
Alan
Newell
01
That Artificial Companions could provide Companionship and help people with their daily tasks is an intriguing and exciting concept. In addition to the important technical challenges of providing such systems, it is also clearly important to consider: 1. What facilities they should and should not provide; 2. The characteristics, needs and wants of potential users; and 3. The personal and social consequences of such technologies. These are complex, multi-faceted issues, and it is necessary to consider the most effective and beneficial ways of addressing them.
10
01
JB code
nlp.8.24slo
179
200
22
Article
26
01
Requirements for Artificial Companions
It’s harder than you think
1
A01
Aaron Sloman
Sloman, Aaron
Aaron
Sloman
01
Producing a system that meets plausible requirements for Artificial Companions (ACs), without arbitrary restrictions, will involve solving a great many problems that are currently beyond the state of the art in Artificial Intelligence (AI), including problems that would arise in the design of robotic Companions helping an owner by performing practical tasks in the physical environment. In other words, even if the AC is not itself a robot and interacts with the user only via input devices such as camera, microphone, keyboard, mouse, touch-pad, and touch-screen, and output devices such as screen and audio output devices, nevertheless it will, in some circumstances, need the visual competences, the ontology, the representational resources, the reasoning competences, the planning competences, and the problem-solving competences that a helpful domestic robot would need. This is because some of the intended beneficiaries of ACs will need to be given advice about what physical actions to perform, what physical devices to acquire, and how to use such devices. I shall give examples illustrating the need for such competences.
10
01
JB code
nlp.8.25win
201
208
8
Article
27
01
You really need to know what your bot(s) are thinking about you
1
A01
Alan FT Winfield
Winfield, Alan FT
Alan FT
Winfield
01
The projected ubiquity of personal Companion robots raises a range of interesting but also challenging questions. There can be little doubt that an effective artificial Companion, whether embodied or not, will need to be both sensitive to the emotional state of its human partner and able to respond sensitively. It will, in other words, need artificial theory of mind – such an artificial Companion would need to behave as if it has feelings and as if it understands how its human partner is feeling. This chapter explores the implementation and implications of artificial theory of mind, and raises concerns over the asymmetry between an artificial Companion’s theory of mind for its human partner and the human’s theory of mind for his or her artificial Companion. The essay argues that social learning (imitation) is an additional requirement of artificial Companion robots, then goes on to develop the idea that an artificial Companion robot will not be one robot but several. A surprising consequence of these ideas is that a family of artificial Companion robots could acquire an artificial culture of its own, and the essay concludes by speculating on what this might mean for human(s) interacting with their artificial Companion robots.
10
01
JB code
nlp.8.26s5
Section header
28
01
Section V. Special purpose Companions
10
01
JB code
nlp.8.27eyn
211
220
10
Article
29
01
A Companion for learning in everyday life
A
Companion for learning in everyday life
1
A01
Rebecca Eynon
Eynon, Rebecca
Rebecca
Eynon
2
A01
Chris Davies
Davies, Chris
Chris
Davies
01
The use of the Companion could have broad learning benefits. For example, enabling learners to have control over their projects of learning could potentially lead to improved self-efficacy beliefs (Kim and Baylor, 2006), improved IT skills or a general confidence to try new things. We believe that this broader view of the benefits of such a Companion makes the positive implications outweigh the negative.
10
01
JB code
nlp.8.28nir
221
244
24
Article
30
01
The Maryland virtual patient as a task-oriented conversational Companion
The
Maryland virtual patient as a task-oriented conversational Companion
1
A01
Sergei Nirenburg
Nirenburg, Sergei
Sergei
Nirenburg
01
This chapter describes a conversational agent environment, the Maryland Virtual Patient (MVP). MVP models the process of disease progression, diagnosis and treatment in virtual patients endowed with a “body,” a simulation of their physiological and pathological processes, and a “mind,” a set of capabilities of perception, reasoning and action that allows the virtual patient to exhibit independent behavior, participate in a natural language dialog, remember events, hold beliefs about other agents and about specific object and event instances, make decisions and learn.
10
01
JB code
nlp.8.29sha
245
256
12
Article
31
01
Living with robots
Ethical tradeoffs in eldercare
1
A01
Noel Sharkey
Sharkey, Noel
Noel
Sharkey
2
A01
Amanda Sharkey
Sharkey, Amanda
Amanda
Sharkey
01
We discuss some of the research and ideas for developing robot carers and Companions for people with aging brains. We speculate a little about a near-future when it may be possible to keep people at home for longer in the almost exclusive care of robots and smart homes. We examine the benefits of robot care and Companionship and at the same time raise key ethical questions and concerns. We point to a series of trade-offs between the unethical and the beneficial that must be considered before robot care/Companionship becomes commonplace.
10
01
JB code
nlp.8.30s6
Section header
32
01
Section VI. Afterword
10
01
JB code
nlp.8.31pel
259
286
28
Article
33
01
Summary and discussion of the issues
1
A01
Malcom Peltu
Peltu, Malcom
Malcom
Peltu
2
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
The COMPANIONS project, which inspired this book, is studying conversational software-based artificial agents that will get to know their owners over a substantial period. These could be developed to advise, comfort and carry out a wide range of functions to support diverse personal and social needs, such as to be ‘artificial Companions’ for the elderly, helping their owners to learn, or assisting to sustain their owners’ fitness and health. This chapter summarizes the main issues raised in the workshop that gave rise to this book. Most direct quotes from participants in this chapter come from their own chapters. Appendix 1 contains examples of current artificial Companions and related research projects mentioned at the workshop.
10
01
JB code
nlp.8.32ref
287
308
22
Miscellaneous
34
01
References
10
01
JB code
nlp.8.33index
309
316
8
Miscellaneous
35
01
Index
02
JBENJAMINS
John Benjamins Publishing Company
01
John Benjamins Publishing Company
Amsterdam/Philadelphia
NL
04
20100324
2010
John Benjamins
02
WORLD
13
15
9789027249944
01
JB
3
John Benjamins e-Platform
03
jbe-platform.com
09
WORLD
21
01
00
99.00
EUR
R
01
00
83.00
GBP
Z
01
gen
00
149.00
USD
S
526007631
03
01
01
JB
John Benjamins Publishing Company
01
JB code
NLP 8 Hb
15
9789027249944
13
2009048316
BB
01
NLP
02
1567-8202
Natural Language Processing
8
01
Close Engagements with Artificial Companions
Key social, psychological, ethical and design issues
01
nlp.8
01
https://benjamins.com
02
https://benjamins.com/catalog/nlp.8
1
B01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
University of Oxford
01
eng
340
xxii
315
COM004000
v.2006
UYZ
2
24
JB Subject Scheme
CONS.GEN
Consciousness research
24
JB Subject Scheme
LIN.COMPUT
Computational & corpus linguistics
24
JB Subject Scheme
PHIL.GEN
Philosophy
06
01
What will it be like to admit Artificial Companions into our society? How will they change our relations with each other? How important will they be in the emotional and practical lives of their owners – since we know that people became emotionally dependent even on simple devices like the Tamagotchi? How much social life might they have in contacting each other? The contributors to this book discuss the possibility and desirability of some form of long-term computer Companion, whose arrival in some form seems a certainty in the coming years. It is a good moment to consider, from a set of wide interdisciplinary perspectives, both how we shall construct them technically and what their personal, philosophical and social consequences will be. By Companions we mean conversationalists or confidants – not robots – but rather computer software agents whose function will be to get to know their owners over a long period. Those owners may well be elderly or lonely, and the contributions in the book focus not only on assistance via the internet (contacts, travel, doctors etc.) but also on providing company and Companionship, by offering aspects of real personalization.
04
09
01
https://benjamins.com/covers/475/nlp.8.png
04
03
01
https://benjamins.com/covers/475_jpg/9789027249944.jpg
04
03
01
https://benjamins.com/covers/475_tif/9789027249944.tif
06
09
01
https://benjamins.com/covers/1200_front/nlp.8.hb.png
07
09
01
https://benjamins.com/covers/125/nlp.8.png
25
09
01
https://benjamins.com/covers/1200_back/nlp.8.hb.png
27
09
01
https://benjamins.com/covers/3d_web/nlp.8.hb.png
10
01
JB code
nlp.8.00for
xi
xii
2
Miscellaneous
1
01
Foreword
10
01
JB code
nlp.8.02ack
xii
1
Miscellaneous
2
01
Acknowledgements
10
01
JB code
nlp.8.01contr
xiii
xxii
10
Miscellaneous
3
01
Contributors
10
01
JB code
nlp.8.02s1
Section header
4
01
<atl>Section I. Setting the scene
10
01
JB code
nlp.8.03tur
3
10
8
Article
5
01
In good company?
On the threshold of robotic Companions
1
A01
Sherry Turkle
Turkle, Sherry
Sherry
Turkle
01
Most contributors to this volume believe that only technical matters stand between where we are now and a time when robots will be our Companions and teachers. Robots need to expand their domains of understanding, and if those domains should be emotional, well, that will be a technical matter as well. So while this volume expresses designers’ enthusiasm about robots as technical objects, it challenges us to see robots as something more, as evocative objects. What are we thinking about when we are thinking about robots? We are thinking about aliveness and authenticity, love and spirituality. We are thinking about what it means to build a psychology. We are thinking about what makes people special. Or perhaps that they are not so special after all.
10
01
JB code
nlp.8.04wil
11
20
10
Article
6
01
Introducing artificial Companions
1
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
This introductory chapter, like the whole book itself, concerns a range of closely related topics: the possibility of machines having identifiable personalities, the possible future legal responsibilities of such companionable machines, and the design characteristics of such machines, including their technical implementation and the kinds of computational theory they will need to embody. As will become clear, I wish to explore these topics in terms of software entities, rather than robots, and in particular the sort of software agents now being encountered on the web, ranging at present from technical advisers to mere chatbots. I shall call them Companions. This introduction explores the following related aspects of a Companion in more detail: what kind and level of personality should be in a machine agent so as to be acceptable to a human user, more particularly to one who may fear technology and have no experience of it; and what levels of responsibility and legal attribution for responsibility can we expect from entities like complex web agents in the near future?
10
01
JB code
nlp.8.05s2
Section header
7
01
<atl>Section II. Ethical and philosophical issues
10
01
JB code
nlp.8.06flo
23
28
6
Article
8
01
Artificial Companions and their philosophical challenges
1
A01
Luciano Floridi
Floridi, Luciano
Luciano
Floridi
01
The technology for Artificial Companions is already largely available, and the question is when rather than whether ACs will become commodities (Benyon and Mival, 2007). The difficulties are still formidable, but they are not insurmountable. On the contrary, they seem rather well understood, and the path from theoretical problems to technical solutions looks steep but climbable. In the following pages, I wish to concentrate not on the technological challenges, which are important, but on some philosophical issues that a growing population of ACs will make increasingly pressing.
10
01
JB code
nlp.8.07pul
29
34
6
Article
9
01
Conditions for Companionhood
1
A01
Stephen G. Pulman
Pulman, Stephen G.
Stephen G.
Pulman
01
This chapter is an attempt to outline a set of conditions that are jointly necessary and sufficient for any entity, virtual or real, to be regarded as displaying the properties characteristic of our ordinary everyday understanding of the notion of a Companion.
10
01
JB code
nlp.8.08oha
35
56
22
Article
10
01
Arius in cyberspace
Digital Companions and the limits of the person
1
A01
Kieron O'Hara
O'Hara, Kieron
Kieron
O'Hara
01
The relationship between a Companion and a person will become increasingly problematic as Companion technology improves and as models of users become increasingly sophisticated, and the simple dichotomies that make the Turing Test so plausible as a means of determining intelligence will become harder to maintain. As with any kind of content-storing technology, such as writing, or in more recent years laptops, the amount and quality of cognition that a human can ‘export’ to these outside technologies is significant. The ‘person’ or ‘agent’ can be seen as an extended system including the technologies as well as the human, in which the technologies, among other things, can help in the extension of trust towards the human.
10
01
JB code
nlp.8.09s3
Section header
11
01
Section III. Social and psychological issues
What should a Companion be like?
10
01
JB code
nlp.8.10bod
59
61
3
Article
12
01
Conversationalists and confidants
1
A01
Margaret A. Boden
Boden, Margaret A.
Margaret A.
Boden
01
If there were an abstract for this very short paper, it would be this: “Conversationalists, maybe – but confidants?”
10
01
JB code
nlp.8.11bry
63
74
12
Article
13
01
Robots should be slaves
1
A01
Joanna J. Bryson
Bryson, Joanna J.
Joanna J.
Bryson
01
Robots should not be described as persons, nor given legal or moral responsibility for their actions. Robots are fully owned by us. We determine their goals and behavior, either directly or indirectly through specifying their intelligence or how their intelligence is acquired. In humanising them, we not only further dehumanise real people, but also encourage poor human decision making in the allocation of resources and responsibility. This is true at both the individual and the institutional level. This chapter describes both causes and consequences of these errors, including consequences already present in society. I make specific proposals for best incorporating robots into our society. The potential of robotics should be understood as the potential to extend our own abilities and to address our own goals.
10
01
JB code
nlp.8.12eva
75
88
14
Article
14
01
Wanting the impossible
The dilemma at the heart of intimate human-robot relationships
1
A01
Dylan Evans
Evans, Dylan
Dylan
Evans
01
In a recent book entitled Love and Sex with Robots, the British scholar David Levy has argued that relationships with robot Companions might be more satisfying than relationships with humans, a claim which I call “the greater satisfaction thesis” (GST). The main reason Levy provides in support of GST is that people will be able to specify the features of robot Companions precisely in accordance with their wishes (which I call the total specification argument or TSA). In this paper, I argue that TSA is wrong. In particular, the argument breaks down when we consider certain behavioral characteristics that we desire in our partners. I illustrate my argument with a thought-experiment involving two kinds of robot – the FREEBOT, which is capable of rejecting its owner permanently, and the RELIABOT, which is not.
10
01
JB code
nlp.8.13lev
89
94
6
Article
15
01
Falling in love with a Companion
1
A01
David Levy
Levy, David
David
Levy
01
In 1984, in her groundbreaking book The Second Self, Sherry Turkle made us aware of the tendency of some people to develop relationships with their computers. Turkle described one such example, an MIT computer hacker whom she called Anthony, who had “tried out” having girlfriends but preferred to relate to computers. I believe that the developments in AI since then have demonstrated a progression in human-computer relationships to the point where we can now say with confidence that, in the foreseeable future, significant numbers of Anthonys and their female counterparts will be falling in love with software Companions. This position paper summarizes my arguments.
10
01
JB code
nlp.8.14low
95
100
6
Article
16
01
Identifying your accompanist
1
A01
Will Lowe
Lowe, Will
Will
Lowe
01
What Companions are or should be will depend on what we want them to do for us. And this depends on what they can do for us. In the end, I shall argue that whether Companions should best be thought of, programmed and regulated as others, or as extensions of ourselves, is a tactical question. One path will simply be more effective than the other. Either way, it’s going to be all about us. To make a start let us switch to the second person: What can a Companion do for you?
10
01
JB code
nlp.8.15rom
101
106
6
Article
17
01
Look, emotion, language and behavior in a believable virtual Companion
1
A01
Daniela M. Romano
Romano, Daniela M.
Daniela M.
Romano
01
A good friend is one who makes you laugh, shares your deepest emotions and is there to listen and help you when you need it. Can we have a synthetic Companion with these qualities? He/she (we assume it has the status of a person at this stage) should be able to understand and express emotions, to talk and listen, and to know who you are and what you feel and like. The Companion considered in this chapter is a synthetic creature living in a virtual world, able to display believable human-like qualities. The concept of believability within virtual environments is also discussed here, together with some of the problems connected with the creation of such a Companion.
10
01
JB code
nlp.8.16tay
107
120
14
Article
18
01
New Companions
1
A01
Alex Taylor
Taylor, Alex
Alex
Taylor
2
A01
Anab Jain
Jain, Anab
Anab
Jain
3
A01
Laurel Swan
Swan, Laurel
Laurel
Swan
01
This chapter draws on advancements in Artificial Intelligence (AI) and robotics to question the orthodoxy of artificial Companions research. Two areas of work are outlined and used to suggest that artificial Companions need not be restricted to simulacra of humans or animals. First, it is argued that developments in AI have given rise to the prospect of very different kinds of machines, machines that are unlike humans or animals but that we may still want to form relationships with. Second, details are presented of a project exploring energy autonomous robots. Ecobot, an example of such a robot, is shown to exhibit unique characteristics that may afford new, distinctive forms of Companionship. Finally, a design concept of an autonomously powered household radio is presented to illustrate how these new kinds of relationships might be investigated further.
10
01
JB code
nlp.8.17wil
121
128
8
Article
19
01
On being a Victorian Companion
1
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
I have argued or suggested: – English Common Law already, in dogs, has a legal category of entities that are not human but are in some degree responsible for their actions and have “characters” that can be assessed. – Users may not want Companions prone to immediately expressed emotions, and a restrained personality, like that of a Victorian Lady’s Companion, might provide a better model. – Language behavior is a complex repository of triggers for emotion, both expressed and causal, and this is often under-rated in the world of ECAs and theories of emotion based on them. – Companion-to-Companion communications will be important and helpful to a user, and there is nothing in principle to make one believe that “secrets” cannot be handled sensitively in such an environment. – It is easy to underestimate the role of a user’s preference in selecting the personality appropriate to a Companion: it is not even clear that users want Companions to be polite or agreeable – it may depend on personal choice or their functional role. – For many it may be appropriate for a Companion to become progressively more like its owner in voice, face, personality, memories etc. – exaggerating the way dogs are believed to adapt to owners – and if and when this becomes possible, for the Companion to become a self-avatar of its owner, there may well be other unseen consequences after the owner’s death.
10
01
JB code
nlp.8.18s4
Section header
20
01
Section IV. Design issues
Building a Companion
10
01
JB code
nlp.8.19bee
131
142
12
Article
21
01
The use of affective and attentive cues in an empathic computer-based Companion
The
use of affective and attentive cues in an empathic computer-based Companion
1
A01
Nikolaus Bee
Bee, Nikolaus
Nikolaus
Bee
2
A01
Elisabeth Andre
Andre, Elisabeth
Elisabeth
Andre
3
A01
Thurid Vogt
Vogt, Thurid
Thurid
Vogt
4
A01
Patrick Gebhard
Gebhard, Patrick
Patrick
Gebhard
01
Recently, a number of research projects have been started to create virtual agents that do not just serve as assistants to which tasks may be delegated, but that may even take on the role of a Companion. Such agents require a great deal of social intelligence, such as the ability to detect the user’s affective state and to respond to it in an empathic manner. The objective of our work is to create an empathetic listener that is capable of reacting to affective and attentive input cues from the user. In particular, we discuss various forms of empathy and how they may be realized based on these cues.
10
01
JB code
nlp.8.20bev
143
156
14
Article
22
01
GRETA
Towards an interactive conversational virtual Companion
1
A01
Elisabetta Bevacqua
Bevacqua, Elisabetta
Elisabetta
Bevacqua
2
A01
Ken Prepin
Prepin, Ken
Ken
Prepin
3
A01
Radoslaw Niewiadomski
Niewiadomski, Radoslaw
Radoslaw
Niewiadomski
4
A01
Etienne de Sevin
Sevin, Etienne de
Etienne de
Sevin
5
A01
Catherine Pelachaud
Pelachaud, Catherine
Catherine
Pelachaud
01
In this chapter we present our work toward building a conversational Companion. Conversing with partner(s) means being able to express one’s mental and emotional state, and to be a speaker or a listener. One also needs to adapt to one’s partner’s reactions to what one is saying. We have developed an interactive ECA platform, Greta (Pelachaud, 2005). It is a 3D virtual agent capable of communicating expressive verbal and nonverbal behaviors as well as listening. It can use its gaze, facial expressions and gestures to convey a meaning, an attitude or an emotion. Multimodal behaviors are tightly tied with each other. A synchronization scheme has been elaborated allowing the agent to display a raised eyebrow or a beat gesture on a given word. According to its emotional or mental state, the agent may vary the quality of its behaviors: it may use a more or less extended gesture, and the arms can move at different speeds and with different accelerations (Mancini & Pelachaud, 2008). The agent can also display listener behavior (Bevacqua et al., 2008). It interacts actively with users and/or other agents, providing appropriately timed backchannels. Interaction also means that the interactants ought to adapt to each other’s behaviors, and the dynamic coupling between them needs to be considered (Prepin & Revel, 2007).
10
01
JB code
nlp.8.21cat
157
168
12
Article
23
01
A world-hybrid approach to a conversational Companion for reminiscing about images
A
world-hybrid approach to a conversational Companion for reminiscing about images
1
A01
Roberta Catizone
Catizone, Roberta
Roberta
Catizone
2
A01
Simon F. Worgan
Worgan, Simon F.
Simon F.
Worgan
3
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
4
A01
Alexiei Dingli
Dingli, Alexiei
Alexiei
Dingli
5
A01
Weiwei Cheng
Cheng, Weiwei
Weiwei
Cheng
10
01
JB code
nlp.8.22cow
169
172
4
Article
24
01
Companionship is an emotional business
1
A01
Roddy Cowie
Cowie, Roddy
Roddy
Cowie
01
This is written from the perspective of someone who was trained as a psychologist, and has been working for a decade on emotion-oriented/affective computing. That background highlights two kinds of issue: how emotion enters into the Companion scenario, and how computing can relate to emotion. In both areas, there is a difference between the intuitions of people who are not deeply involved, and the realities as they appear to people working in the area. The goal of this paper is to consider how the realities of emotion and emotion-oriented technology impact on the prospects for artificial Companions. The concern behind it is that otherwise, we may misjudge both the prospects and the risks. In particular, the ability to address the emotional side of Companionship may play a key part in acceptance; and the necessary resources, conceptual as well as technical, cannot be taken for granted. We should be concerned about inserting Companions into emotionally sensitive roles without engineering them to take that into account.
10
01
JB code
nlp.8.23new
173
178
6
Article
25
01
Artificial Companions in society
Consulting the users
1
A01
Alan Newell
Newell, Alan
Alan
Newell
01
The idea that Artificial Companions could provide Companionship and help people with their daily tasks is an intriguing and exciting concept. In addition to the important technical challenges of providing such systems, it is also clearly important to consider: 1. What facilities they should and should not provide. 2. The characteristics, needs and wants of potential users. 3. The personal and social consequences of such technologies. These are complex, multi-faceted issues, and it is necessary to consider what are the most effective and beneficial ways of addressing them.
10
01
JB code
nlp.8.24slo
179
200
22
Article
26
01
Requirements for Artificial Companions
It’s harder than you think
1
A01
Aaron Sloman
Sloman, Aaron
Aaron
Sloman
01
Producing a system that meets plausible requirements for Artificial Companions (ACs), without arbitrary restrictions, will involve solving a great many problems that are currently beyond the state of the art in Artificial Intelligence (AI), including problems that would arise in the design of robotic Companions helping an owner by performing practical tasks in the physical environment. In other words, even if the AC is not itself a robot and interacts with the user only via input devices such as camera, microphone, keyboard, mouse, touch-pad, and touch-screen, and output devices such as screen and audio output devices, nevertheless it will, in some circumstances, need the visual competences, the ontology, the representational resources, the reasoning competences, the planning competences, and the problem-solving competences that a helpful domestic robot would need. This is because some of the intended beneficiaries of ACs will need to be given advice about what physical actions to perform, what physical devices to acquire, and how to use such devices. I shall give examples illustrating the need for such competences.
10
01
JB code
nlp.8.25win
201
208
8
Article
27
01
You really need to know what your bot(s) are thinking about you
1
A01
Alan FT Winfield
Winfield, Alan FT
Alan FT
Winfield
01
The projected ubiquity of personal Companion robots raises a range of interesting but also challenging questions. There can be little doubt that an effective artificial Companion, whether embodied or not, will need to be both sensitive to the emotional state of its human partner and able to respond sensitively. It will, in other words, need artificial theory of mind – such an artificial Companion would need to behave as if it has feelings and as if it understands how its human partner is feeling. This chapter explores the implementation and implications of artificial theory of mind, and raises concerns over the asymmetry between an artificial Companion’s theory of mind for its human partner and the human’s theory of mind for his or her artificial Companion. The essay argues that social learning (imitation) is an additional requirement of artificial Companion robots, then goes on to develop the idea that an artificial Companion robot will not be one robot but several. A surprising consequence of these ideas is that a family of artificial Companion robots could acquire an artificial culture of its own, and the essay concludes by speculating on what this might mean for human(s) interacting with their artificial Companion robots.
10
01
JB code
nlp.8.26s5
Section header
28
01
Section V. Special purpose Companions
10
01
JB code
nlp.8.27eyn
211
220
10
Article
29
01
A Companion for learning in everyday life
A
Companion for learning in everyday life
1
A01
Rebecca Eynon
Eynon, Rebecca
Rebecca
Eynon
2
A01
Chris Davies
Davies, Chris
Chris
Davies
01
The use of the Companion could have broad learning benefits. For example, enabling learners to have control over their projects of learning could potentially lead to improved self-efficacy beliefs (Kim and Baylor, 2006), improved IT skills or a general confidence to try new things. We believe that this broader view of the benefits of such a Companion makes the positive implications outweigh the negative.
10
01
JB code
nlp.8.28nir
221
244
24
Article
30
01
The Maryland virtual patient as a task-oriented conversational Companion
The
Maryland virtual patient as a task-oriented conversational Companion
1
A01
Sergei Nirenburg
Nirenburg, Sergei
Sergei
Nirenburg
01
This chapter describes a conversational agent environment, the Maryland Virtual Patient (MVP). MVP models the process of disease progression, diagnosis and treatment in virtual patients endowed with a “body,” a simulation of their physiological and pathological processes, and a “mind,” a set of capabilities of perception, reasoning and action that allows the virtual patient to exhibit independent behavior, participate in a natural language dialog, remember events, hold beliefs about other agents and about specific object and event instances, make decisions and learn.
10
01
JB code
nlp.8.29sha
245
256
12
Article
31
01
Living with robots
Ethical tradeoffs in eldercare
1
A01
Noel Sharkey
Sharkey, Noel
Noel
Sharkey
2
A01
Amanda Sharkey
Sharkey, Amanda
Amanda
Sharkey
01
We discuss some of the research and ideas for developing robot carers and Companions for people with aging brains. We speculate a little about a near-future when it may be possible to keep people at home for longer in the almost exclusive care of robots and smart homes. We examine the benefits of robot care and Companionship and at the same time raise key ethical questions and concerns. We point to a series of trade-offs between the unethical and the beneficial that must be considered before robot care/Companionship becomes commonplace.
10
01
JB code
nlp.8.30s6
Section header
32
01
Section VI. Afterword
10
01
JB code
nlp.8.31pel
259
286
28
Article
33
01
Summary and discussion of the issues
1
A01
Malcolm Peltu
Peltu, Malcolm
Malcolm
Peltu
2
A01
Yorick Wilks
Wilks, Yorick
Yorick
Wilks
01
The COMPANIONS project, which inspired this book, is studying conversational software-based artificial agents that will get to know their owners over a substantial period. These could be developed to advise, comfort and carry out a wide range of functions to support diverse personal and social needs, such as to be ‘artificial Companions’ for the elderly, helping their owners to learn, or assisting to sustain their owners’ fitness and health. This chapter summarizes the main issues raised in the workshop that gave rise to this book. Most direct quotes from participants in this chapter come from their own chapters. Appendix 1 contains examples of current artificial Companions and related research projects mentioned at the workshop.
10
01
JB code
nlp.8.32ref
287
308
22
Miscellaneous
34
01
References
10
01
JB code
nlp.8.33index
309
316
8
Miscellaneous
35
01
Index
02
JBENJAMINS
John Benjamins Publishing Company
01
John Benjamins Publishing Company
Amsterdam/Philadelphia
NL
04
20100324
2010
John Benjamins
02
WORLD
01
245
mm
02
164
mm
08
760
gr
01
JB
1
John Benjamins Publishing Company
+31 20 6304747
+31 20 6739773
bookorder@benjamins.nl
01
https://benjamins.com
01
WORLD
US CA MX
21
6
18
01
02
JB
1
00
99.00
EUR
R
02
02
JB
1
00
104.94
EUR
R
01
JB
10
bebc
+44 1202 712 934
+44 1202 712 913
sales@bebc.co.uk
03
GB
21
18
02
02
JB
1
00
83.00
GBP
Z
01
JB
2
John Benjamins North America
+1 800 562-5666
+1 703 661-1501
benjamins@presswarehouse.com
01
https://benjamins.com
01
US CA MX
21
1
18
01
gen
02
JB
1
00
149.00
USD