Picard explains the need to monitor emotional cues, and how human teachers already do this, when she states:
"Whatever his strategy, the good teacher detects important affective cues from the student and responds differently because of them. For example, the teacher might leave subtle hints or clues for the student to discover, thereby preserving the learner's sense of self-propelled discovery. Whether the subject matter involves deliberate emotional expression as is the case with music, or is a "non-emotional" topic such as science, the teacher that attends to a student's interest, pleasure, and distress is perceived as more effective than the teacher that proceeds callously. The best teachers know that frustration usually precedes quitting, and know how to redirect or motivate the pupil at such times. They get to know their student, including how much distress that student can withstand before learning breaks down."
But such emotional cues are not part of robotic intelligence. To illustrate how recognizing them would alter interactions with robots, Picard offers an example scenario:
"Imagine your robot entering the kitchen as you prepare breakfast for guests. The robot looks happy to see you and greets you with a cheery "Good morning." You mumble something it does not understand. It notices your face, vocal tone, smoke above the stove, and your slamming of a pot into the sink, and infers that you do not appear to be having a good morning. Immediately, it adjusts its internal state to "subdued," which has the effect of lowering its vocal pitch and amplitude settings, eliminating cheery behavioral displays, and suppressing unnecessary conversation. Suppose you exclaim, "Ow!!" yanking your hand from the hot stove, rushing to run your fingers under cold water, adding "I can't believe I ruined the sauce." While the robot's speech recognition may not have high confidence that it accurately recognized all of your words, its assessment of your affect and actions indicates a high probability that you are upset and maybe hurt".
In such situations, robots must understand the emotional states of humans in order to better serve their intended purpose.
Her work has influenced many fields beyond computer science, ranging from video games to law. One critic, Aaron Sloman, described the book as having a "bold vision" that will inspire some and irritate others. Other critics emphasize the work's importance, arguing that it establishes a foundational framework for the field as a whole. Picard responded to Sloman's review by saying, "I don't think the review captures the flavor of the book. However, he does raise interesting points, as well as potential misunderstandings, both of which I am grateful for the opportunity to comment on."