Department of Education and Science, Russian Federation
Perm State University

FUNDAMENTALS OF MATHEMATICAL THEORY OF EMOTIONAL ROBOTS

MONOGRAPH

Oleg G. Pensky, Kirill V. Chernikov

Perm, Russia, 2010

Abstract

In this book we introduce a mathematically formalized concept of emotion, of a robot's education, and of other psychological parameters of intelligent robots. We also introduce unitless coefficients characterizing a robot's emotional memory. Besides, the effect of a robot's memory upon its emotional behavior is studied, and theorems defining fellowship and conflicts in groups of robots are proved. Unitless parameters describing the emotional states of those groups are also introduced, and a rule for making alternative (binary) decisions based on emotional selection is given. We introduce the concept of equivalent educational processes for robots and the concept of the efficiency coefficient of an educational process, and suggest an algorithm of emotional contacts within a group of robots. Finally, we present and describe a model of a virtual reality with emotional robots. The book is meant for mathematical modeling specialists and emotional robot software developers.

Translated from Russian by Julia Yu. Plotnikova

© Pensky O.G., Chernikov K.V., 2010

CONTENTS

Introduction
1. Robot's emotion: definition
2. Education of a robot
3. Parameters of a group of emotional robots
4. Friendship between robots: fellowship (concordance)
5. Equivalent educational processes
5.1. Mathematical model of equivalent education processes
5.2. Alternative to an objective function under coincidence of time steps of real and equivalent education processes
5.3. Generalization in case of noncoincidence of time steps of real and equivalent education processes
6. Method of approximate definition of the memory coefficient function
7. Mathematical model of forming of tantamount robot sub-groups
8. Algorithm for forming tantamount sub-groups of robots
9.
Applying vector algebra rules to investigation of robot sub-group emotional state
10. Mathematical assessment of goal achievement extent
10.1. Rule of solving for the extent of goal achievement
10.2. Algorithm for forming tantamount sub-groups of robots according to their goal achievement extent
11. Mathematical model of robot's emotional abilities
12. Work and willpower of emotional robots
13. Robot's temperament model
14. Investigation of psychological process dynamics in a group of robots
15. Rules and forecast of emotional selection of robots
16. Generalization of robot's emotional behavior rules in case the number of players interacting with the robot is arbitrary (not specified)
16.1. First rule of alternate selection
16.2. Second rule of alternate selection
16.3. Orthogonality of education vectors and equivalence of alternate selection rules
17. Emotional selection and conflicts between robots
18. Diagnostics of emotional robots' "mental diseases"
19. Models of robot's ambivalent emotions
20. Absolute memory of robots
21. Algorithm of emotional contacts in a group of robots
22. On information aspects of E-creatures
23. Software realization of simple emotional robot's behavior
23.1. Input parameters of software
23.2. Algorithm for modeling robot's mimic emotional reaction
23.3. SoundBot software architecture
23.4. Main features of SoundBot
23.5. SoundBot operation principles
23.6. SoundBot visual interface
Conclusion
References

INTRODUCTION

Emotions represent an essential part of human and animal psychological activity. Attempts to formalize mathematically the psychological behavior of higher living beings were made in the book «Гипотезы и алгоритмы математической теории исчисления эмоций» ("Hypotheses and Algorithms of the Mathematical Theory of Emotion Calculus"), edited by Professor Oleg G. Pensky and published by Perm State University (Russia) in 2009.
Although the authors wanted that treatise to be considered an example of a scientific quest, it met with strong misunderstanding among psychologists in the city of Perm. The book suggested mathematical models introducing and applying such terms and concepts as 'emotional education/upbringing', 'reeducation', 'temperament', 'conflict', etc.; the authors also reviewed approaches to modeling the emotional behavior of subjects and to estimating the psychological state of groups; a new approach to the description of some new economic phenomena based on psychological theories was suggested as well. The authors of the present book completely agree that computer modeling of emotions is hindered by the ambiguity of living beings' emotional behavior. Considering the misunderstanding of psychologists, Professor Pensky decided to adapt the results of his studies performed in 2009 to the mathematical modeling of emotional robots and to develop those ideas further. The treatise of Professor Oleg G. Pensky titled "Mathematical Models of Emotional Robots" was issued by the Perm State University printing office in 2010. In the present book, as in the one issued in 2010, the authors have made an attempt to create and mathematically describe a virtual reality of emotional robots. It is based on such key terms as emotions and education, and includes fellowship (concordance) and conflicts between its inhabitants, robots which feature various abilities, temperaments, memory, willpower, emotional work under goal achievement, 'diseases', education process prospects, and the corresponding concepts and terms. Currently American scientists [1] work on the creation of an electronic copy of a human being, which would be called an E-creature. By happy chance the present book touches upon the very topics which are currently studied by our American colleagues. We consider robots with a non-absolute memory, and this kind of memory is a feature of a human being.
Of course, the mathematical theory of emotional robots which we call your attention to in this book is far from perfection. But its authors never meant this theory to claim to be global, and once again ask critics above all to consider this book as an example of a scientific quest.

Acknowledgements

The authors are eternally indebted to Alexander Bolonkin, PhD, Professor of NJIT, for having the book discussed, for the description of E-creature information modeling problems, and for his guidance in advancing and presenting our theory of emotional robots to the scientific community. The authors highly appreciate the useful notes concerning the content of this book made by Tatiana S. Belozerova, PhD (Russia).

1. ROBOT'S EMOTION: DEFINITION

A theory of human psychology defines emotions as an organism's response to some stimulus [2]. Concerning robots, let us designate this stimulus as a 'subject' and define it as follows.

Let t be time.

Definition 1.1. The function S(t) is referred to as a 'subject' if it has the following properties:
1. The domain of S(t) is $[0, t^*]$, $t^* > 0$, $t^* \le \infty$;
2. $S(t) > 0$ for any $t \in (0, t^*)$;
3. S(t) is a one-to-one function;
4. S(t) is a bounded function.

The paper [3] contains a theorem proving that it is possible for computer software to model human and animal emotions. But the psychological features of living beings' emotions are so intricate and ambiguous that we decided to introduce a special mathematical definition of a robot's emotion. In this definition we abstract from real human emotions and, at the same time, accumulate the general features of human and animal emotions; we also abstract from the content of emotions.

Definition 1.2. The function f(t) satisfying the equation

$f(t) = a(S(t),t)\,S(t)$

(with $a(S(t),t)$ an arbitrary function) is the function of the robot's inner emotional experience. Let us state that the subject S(t) initiates the robot's inner emotional experience.
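As an illustration of Definitions 1.1 and 1.2, the following sketch checks the subject properties numerically for one hypothetical choice of S(t) and a(S(t), t); both functions and the value of $t^*$ are our own illustrative assumptions, not taken from the text:

```python
import math

T_STAR = 10.0   # hypothetical subject effect time t*

def S(t):
    """Hypothetical subject: positive, strictly increasing (one-to-one), bounded."""
    return 1.0 + t / T_STAR              # values stay within (1, 2) on (0, T_STAR)

def a(s, t):
    """An arbitrary specific-emotion function a(S(t), t) (illustrative choice)."""
    return math.sin(math.pi * t / T_STAR)

def f(t):
    """Inner emotional experience f(t) = a(S(t), t) * S(t), per Definition 1.2."""
    return a(S(t), t) * S(t)

# Spot-check the subject properties of Definition 1.1 on a grid.
ts = [k * T_STAR / 100 for k in range(1, 100)]
assert all(S(t) > 0 for t in ts)                      # property 2: S(t) > 0
assert all(S(u) < S(v) for u, v in zip(ts, ts[1:]))   # monotone, hence one-to-one
assert all(S(t) < 2.0 for t in ts)                    # property 4: bounded
print(f(T_STAR / 2))                                  # 1.0 * S(5.0) = 1.5
```

Any positive, one-to-one, bounded function would serve equally well as the subject here; the sine factor merely gives the inner experience a sign structure to work with later.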
Definition 1.3. The robot's inner emotional experience function M(t) is called an 'emotion' if it satisfies the following conditions:
1. The domain of M(t) is $[0, t_0]$, $t_0 > 0$;
2. $t_0 \le t^*$ (note that this condition allows the emotion to terminate whether the subject effect is already over or not);
3. M(t) is a single-valued function;
4. $M(0) = 0$;
5. $M(t_0) = 0$;
6. M(t) is a constant-sign function;
7. The derivative $\frac{dM(t)}{dt}$ exists within the function domain;
8. There is only one point z within the function domain such that $0 < z < t_0$ and $\left.\frac{dM(t)}{dt}\right|_{t=z} = 0$;
9. $\frac{dM(t)}{dt} > 0$ for $t < z$;
10. $\frac{dM(t)}{dt} < 0$ for $t > z$.

Let us assume there is a J > 0 such that for any emotion of a robot the condition $|M(t)| \le J$ is valid. Now we can easily see that the function

$M(t) = P\sin\left(\frac{\pi t}{t_0}\right)$ for $t \in [0, t_0]$, $P = \text{const}$,

is an emotion.

Definition 1.4. The function $\vec{M}(t)$ is called an ambivalent emotion if it can be presented as a vector whose elements are emotions initiated simultaneously by one and the same subject.

We will not focus on the content of emotions and, according to [4], below we take into account only the following points important to us:
1. Emotions have a sign (plus or minus).
2. An object has a finite number of emotions.

Based on point 2 we conclude that the robot's emotional state can be described by the emotion vector $\vec{M}(t)$ with a finite number of elements (cardinality) equal to n: $\vec{M}(t) = (M_1(t), \ldots, M_n(t))$. Hereinafter, when we speak about a single-type emotion, we omit the corresponding index and vector marks and denote it by M(t). Let us take the emotion-free state of a robot as the zero emotion level. It is obvious that stimuli can be totally external, partially external (or 'partially memorized'), and internal.
All of them may become a subject:
- totally external stimuli, which are not contained in the robot's memory (see Fig. 1.1), may serve as a subject;
- 'partially memorized' stimuli, when some part of the information about them is entered into the robot's memory and some part of it comes from the outside as external experience (Fig. 1.2), may also serve as a subject;
- internal stimuli, when full information about the stimuli is kept in the robot's memory (Fig. 1.3), may serve as a subject as well. This is the case when, e.g., some recollection (past event memories) of a robot may generate emotions.

[Fig. 1.1. Totally external stimuli as a subject]

[Fig. 1.2. Partially external stimuli as a subject]

[Fig. 1.3. Internal stimuli as a subject]

Fig. 1.2 and Fig. 1.3 partially correspond with the psychological theory of S. Schechter [4]. According to Schechter, the emotional state of a person is affected by his/her previous experience and his/her assessment of the current situation, as well as by perceived stimuli and stimulus-initiated physical alterations.

Let us note that when describing a subject and its belonging to the robot's memory we used the term 'information', which is measured in bits [5]. So let us advance the following hypothesis: a subject can be measured in bits of information as well.

It is obvious that different subjects can initiate one and the same emotion of a robot, i.e. there is no one-to-one dependence between a subject and an emotion (Fig. 1.4).

[Fig. 1.4. Relation between subjects and an emotion]

And also, one and the same subject can initiate different emotions of a robot [4] (Fig. 1.5).

Let us introduce the concept of the unit (or specific) emotion, similarly to matter density in physics [6].

Definition 1.5. The specific emotion a(S(t),t) of a robot is the emotion per single subject unit.
Obviously, the specific emotion satisfies the following relation:

$a(S(t),t) = \frac{M(S(t),t)}{S(t)}.$

We can easily see that the sign of the robot's emotion $M(S(t),t)$ is determined by the sign of the specific emotion $a(S(t),t)$.

[Fig. 1.5. Relation between a robot's emotions and a subject]

The mathematical theory of emotional robots described in this book considers the cases shown in Figs. 1.4 and 1.5.

2. EDUCATION OF A ROBOT

Let us introduce the definition of the emotional upbringing (emotional education) of a robot, abstracting from the psychological concept of education/upbringing.

Definition 2.1. The upbringing, or education, of a robot is a relatively stable attitude of this robot towards a subject.

From Definition 1.3 it follows that the robot's emotion M(t) is a continuous function on the segment $[0, t]$, and consequently M(t) is integrable on this segment. Considering that, we can work out the following definition.

Definition 2.2. The elementary education r(t) of a robot based on the subject S(t) is the following function:

$r(t) = \int_0^{t} a(S(\tau),\tau)\,S(\tau)\,d\tau.$ (2.1)

The obvious mathematical features of the elementary education are as follows:
1) if the specific emotion sign coincides with the subject sign, then the education is positive;
2) by virtue of Definition 2.2, the function r(t) is differentiable with respect to the parameter t, so the relation $M(S(t),t) = \frac{dr(t)}{dt}$ is valid.

Let us consider that in the course of time a robot can forget emotions experienced some time ago. Its current education is less and less affected by those past (bygone) emotions. Consequently, past elementary educations initiated by those emotions become forgotten as well. Hence, the following definition becomes obvious.

Definition 2.3. The education R(t) of a robot based on the subjects S(t) is the following function:

$R(t) = r(\tau) + \gamma_i(t)\,R_i(t_i),$ (2.2)

where t is the current time, $t \in [t_i, t_{i+1}]$.
The current time satisfies the relation $t = \tau + t_i$, where τ is the current time of the current emotion effect counted from the beginning of its initiation, $t_i$ is the total time of all the formerly experienced emotion effects, and $R_i(t_i)$ is the education obtained by the robot within the time $t_i$.

A verbal definition of education is as follows: it is a value determining the motivation stability of the robot's behavior on a certain class of subjects. Obviously, an education can be measured in bits of information, similarly to a subject, and, consequently, emotions are to be measured in bits per second (bit/s).

Definition 2.4. The coefficients $\gamma_i(t)$ are the memory coefficients of events experienced in the past, i.e. the coefficients of the robot's memory.

According to (2.2) we can write down a relation specifying the education at the beginning of the (i+1)-st emotion effect upon the robot:

$R_{i+1}(0) = r(0) + \gamma_i(0)\,R_i(t_i).$

It is easy to see that the equations $R_{i+1}(0) = R_i(t_i)$ and $r(0) = 0$ hold true. Consequently $\gamma_i(0) = 1$ is valid.

Definition 2.5. A time step is the effect time of one emotion.

According to results obtained by psychological research, an emotion cannot last more than 10 seconds. Therefore, let us assume that the time step value of any robot emotion is less than or equal to 10 s.

Here and below, psychological characteristics of robots corresponding to the current moment of a time step are bracketed after the variable, and psychological characteristics corresponding to the ends of time steps are denoted without brackets. For instance, $R_i(t)$ defines the function of education change over the current time t of the valid time step i, and $R_i$ defines the value of the education at the end of the time step i.

It is easy to see that a robot featuring a past event memory coefficient identically equal to 1 remembers in detail all its past emotional educations. This robot can be regarded as autistic.
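The end-of-step form of Eq. (2.2), $R_{i+1} = r_{i+1} + \gamma_i R_i$, can be sketched numerically. The elementary education q = 1 and the memory coefficients below are hypothetical; the second run uses a memory coefficient of 1, the 'autistic' robot just mentioned:

```python
def educate(elementary, gammas):
    """End-of-step educations from Eq. (2.2): R_{i+1} = r_{i+1} + gamma_i * R_i."""
    R = 0.0
    history = []
    for r, g in zip(elementary, [1.0] + list(gammas)):
        R = r + g * R    # the previous education decays by the memory coefficient
        history.append(R)
    return history

steps = 10
q = 1.0                                                # tantamount elementary educations
forgetful = educate([q] * steps, [0.5] * (steps - 1))  # memory coefficient 0.5
autistic  = educate([q] * steps, [1.0] * (steps - 1))  # memory coefficient 1

print(forgetful[-1])   # 1.998046875: growth stays bounded
print(autistic[-1])    # 10.0: every past education is fully remembered
```

The contrast between the two runs previews the role the memory coefficient plays throughout this chapter: forgetting caps the accumulated education, while absolute memory lets it grow without bound.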
But let us suppose that the robot's memories of past events are deleted, i.e. the two-sided inequality $0 \le \gamma_i < 1$ is valid for a forgetful robot at the end of each time step. We are now in a position to state a theorem for this kind of robot.

Theorem 2.1. Educating the forgetful robot by means of positive emotions only leads to satiety.

Proof: It is easy to see that Relation (2.2) is equivalent to

$R_i(t) = r_i(t) + \gamma_{i-1}(t)\left(r_{i-1} + \gamma_{i-2}R_{i-2}\right).$ (2.3)

Equation (2.3) can take the form

$R_i(t) = r_i(t) + \gamma_{i-1}(t)\,r_{i-1} + \gamma_{i-1}(t)\gamma_{i-2}\,r_{i-2} + \ldots + \gamma_{i-1}(t)\gamma_{i-2}\cdots\gamma_1\,r_1 + \gamma_{i-1}(t)\gamma_{i-2}\cdots\gamma_1\gamma_0\,r_0.$ (2.4)

Since all the emotions are positive, the elementary educations are positive too; since all the emotions are value-limited and the time of an emotion effect is also limited, the elementary educations are bounded as well. This makes us conclude that there exist γ and q of a forgetful robot for which the following inequalities hold true:

$r_j \le q,\qquad \gamma_k \le \gamma < 1,\qquad r(\tau) \le q,$ (2.5)

where $j \in [1, i]$, $k \in [0, i-1]$. Due to (2.4) and (2.5) we can obtain an upper bound of the function R(t). It has the form

$R_i(t) \le q + q\gamma + q\gamma^2 + \ldots + q\gamma^{\,i} = q\sum_{j=0}^{i}\gamma^{\,j}.$ (2.6)

The right side of (2.6) is the sum of geometric progression terms, which yields the inequality

$R_i(t) \le q\,\frac{1-\gamma^{\,i+1}}{1-\gamma}.$ (2.7)

Passing to the limit as $t \to \infty$ (or $i \to \infty$) in the right side of (2.7), we get the upper bound of the education value:

$R(t) \le \frac{q}{1-\gamma}.$ (2.8)

Inequality (2.8) makes us conclude that the robot's education based on positive emotions has an upper bound, i.e. it is satiated. The proof is now complete.

Psychological research entirely confirms Theorem 2.1. According to its results, it is not possible to bring up and train a person ad infinitum: at some certain moment he/she gets satiated [4] and passes to the next stage of his/her emotional activity.

Definition 2.6.
The limiting education U is the value corresponding to the end point of the emotion effect time and satisfying the relation

$U = \frac{q}{1-\gamma}.$

Definition 2.7. Emotions initiating equal elementary educations are tantamount (equivalent).

Definition 2.8. A uniformly forgetful robot is a forgetful robot whose memory coefficients corresponding to the end points of emotion effect times are constant and equal to each other.

Theorem 2.2. The education $R_i$ of the uniformly forgetful robot based on tantamount emotions is defined by the relation

$R_i = q\,\frac{1-\gamma^{\,i}}{1-\gamma},$

where q is the elementary education value and i is the order number of the initiating tantamount emotion among the emotions on whose basis this education has been performed by the current time point. Its proof is evident from Theorem 2.1.

Also let us note the following. When producing a robot's emotion by means of software, it is impossible to predict the subject effect time. Therefore it is expedient to model the emotions after the subject effect is over.

Example. Let us take the emotion function in the form

$M(t) = P\sin\left(\pi\,\frac{t-t^*}{t_0-t^*}\right),$ (2.9)

with $P = \text{const}$, $t \in [t^*, t_0]$, $t_0$ a fixed value, and $t_0 \in (t^*, 2t^*]$. In (2.9) we replaced conditions 1, 2, 4, 5, 8 in the definition of an emotion by the following:
1. The domain of M(t) is $[t^*, t_0]$;
2. $t_0 \le 2t^*$;
4. $M(t^*) = 0$;
5. $M(t_0) = 0$;
8. The function domain contains only one point z such that $t^* < z < t_0$ and $\left.\frac{dM(t)}{dt}\right|_{t=z} = 0$.

Also let us note that, according to (2.9), these replacements of several membership conditions of the robot's inner emotional experience function M(t) in the class of emotions do not require the theory considered here to be revised.

Obviously, the time step Δ for Emotion (2.9) satisfies $\Delta = t_0 - t^*$, and the elementary education r is computed by

$r = \int_{t^*}^{t_0} P\sin\left(\pi\,\frac{t-t^*}{t_0-t^*}\right)dt = \frac{2P\,(t_0-t^*)}{\pi}.$
(2.10)

We can easily see that during the education process Eq. (2.10) provides tantamount emotions under $t_0 - t^* = \text{const}$. Let us consider all the time steps to be equal to each other.

Below we give a theorem which mathematically characterizes the deletion of past (bygone) education memory data if those educations are not maintained by emotions in the course of time. In this case the index i is defined by the relation $i = \left[\frac{t}{\Delta}\right]$, with t the current time and Δ the effect time of the first and only emotion causing the elementary education $r_0$.

Theorem 2.3. The uniformly forgetful robot forgets its first and only elementary education exponentially.

Proof. According to (2.4), if there is no constant emotional effect during some period of time, then the robot's education by the time t satisfies the relation

$R_i = \gamma_{i-1}\gamma_{i-2}\cdots\gamma_0\,r_0.$ (2.11)

As the robot is uniformly forgetful, $\gamma_j = \gamma = \text{const}$ ($j \in [0, i-1]$) is valid. Consequently,

$R_i = \gamma^{\,i} r_0$

holds true. The proof is now complete.

The next theorem allows assessing the upper bound of the forgetful robot's current education in the case when this robot obtained only one elementary education in the past.

Theorem 2.4. The current education of the forgetful robot obtained due to an only positive elementary education satisfies the inequality

$R_i(t) \le \gamma^{\,i-1} r_0,$

with $\gamma_j \le \gamma$, $j \in [1, i]$. Its proof is evident from (2.11).

Above we noted the validity of

$M(t) = \frac{dr(t)}{dt}.$ (2.12)

Assuming that the memory coefficients are differentiable functions and taking into consideration (2.12), we get the formula for the sum (i.e. resulting) emotion V(t):

$V(t) = \frac{dR_i(t)}{dt} = \frac{dr(t)}{dt} + \frac{d\gamma_{i-1}(t)}{dt}\,R_{i-1}.$ (2.13)

(2.13) allows us to assert that the sum emotions of the robot depend on past educations, memory coefficients, and their rate of change.
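Theorem 2.3 is easy to check step by step: with no new emotions, each time step multiplies the single remaining elementary education by the memory coefficient. The values of $r_0$ and γ below are hypothetical:

```python
r0 = 1.0        # the only elementary education (hypothetical value)
gamma = 0.8     # memory coefficient of the uniformly forgetful robot

R = r0
trace = [R]
for _ in range(5):       # five time steps with no further emotional effects
    R = gamma * R        # each step multiplies the education by gamma
    trace.append(R)

# Closed form of Theorem 2.3: R_i = gamma**i * r0
assert all(abs(R_i - gamma**i * r0) < 1e-12 for i, R_i in enumerate(trace))
print(trace[5])          # about 0.32768 = 0.8**5
```

The geometric decay is exactly the exponential forgetting the theorem names: halving γ halves the education retained per step.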
It is quite easy to see that for the robot with the absolute emotional memory ($\gamma_j = 1$, $j \in [1, \infty)$) the current sum emotions do not depend on past educations.

Let the robot's elementary educations satisfy the following inequality:

$|r_j| \le q.$ (2.14)

Under i tending to infinity and the inverse numeration of elementary educations, (2.4) takes the form

$R = \sum_{i=1}^{\infty} r_i \prod_{j=1}^{i-1}\gamma_j.$ (2.15)

Definition 2.9. The robot's education corresponding to (2.15) is an infinite education.

Let us note that the convergence of the infinite education determines education prospects.

Theorem 2.5. For the forgetful robot, the infinite education corresponding to the ends of time steps converges.

Proof. Let us show that Series (2.15) is absolutely convergent. As $0 \le \gamma_i < 1$ holds true, there is a γ less than unity such that $\gamma_i \le \gamma < 1$ (with $i \in [1, \infty)$) is valid. By virtue of Inequality (2.14), Formula (2.15), and the formula for the sum of terms of a geometric progression [7] we develop the estimate

$\left|\sum_{i=1}^{\infty} r_i \prod_{j=1}^{i-1}\gamma_j\right| \le q\sum_{i=0}^{\infty}\gamma^{\,i} = \frac{q}{1-\gamma}.$

So Series (2.15) is absolutely convergent, and consequently it converges. The proof is now complete.

By virtue of the theorem given above, the relation

$z = \lim_{i\to\infty} R_i = \lim_{i\to\infty} r_i + \gamma\lim_{i\to\infty} R_{i-1}$

is valid for the end of each time step of the continuous education process, and this relation is equivalent to

$z = \lim_{i\to\infty} r_i + \gamma z.$ (2.16)

(2.16) allows us to enunciate the following theorem.

Theorem 2.6. The uniformly forgetful robot's elementary education corresponding to the ends of time steps in the course of a continuous education process tends to a constant.

Proof. As $\gamma_i = \gamma = \text{const}$ ($i \in [1, \infty)$) holds true for the uniformly forgetful robot, by virtue of (2.16) the elementary education sequence corresponding to the ends of education time steps has a limit. Thus the theorem is proved.

Corollary 2.1.
For the uniformly forgetful robot

$\lim_{i\to\infty} r_i = z\,(1-\gamma)$

is valid. The proof follows from (2.16).

Let us assess the error of the infinite education value when k terms of the series are used for assessing the sum of Series (2.15). It is easy to see that under the inverse numeration of elementary educations the error

$b_k = \left|\sum_{i=k+1}^{\infty} r_i \prod_{j=1}^{i-1}\gamma_j\right|$

of the finite summation of k terms of the series satisfies

$b_k \le \frac{q\,\gamma^{\,k}}{1-\gamma}.$

Obviously, an education cannot be performed continuously: after a series of emotional effects there comes a slack period in this education. Let us introduce a supplementary definition.

Definition 2.10. A complete education cycle is a quantity of time steps equal to the sum of the time steps under the effect of education emotions and the number of time steps corresponding to the slack period (absence of elementary education effects upon the robot) till the next emotional education effect.

Let us consider the education process of the uniformly forgetful robot with tantamount emotions. It is easy to see that, according to Theorems 2.2 and 2.3, the education $F_{j_1,k_1}$ for the first complete education cycle of the uniformly forgetful robot based on tantamount emotions with equal periods satisfies the following relation:

$F_{j_1,k_1} = q\,\frac{1-\gamma^{\,j_1}}{1-\gamma}\,\gamma^{\,k_1},$ (2.17)

where $j_1$ is the quantity of time steps in the presence of education effects upon the robot, and $k_1$ is the quantity of time steps in their absence. Obviously, the education $F_{j_{n+1},k_{n+1}}$ obtained by the robot as a result of n+1 complete education cycles is determined by the equality

$F_{j_{n+1},k_{n+1}} = \left(F_{j_n,k_n}\,\gamma^{\,j_{n+1}} + q\,\frac{1-\gamma^{\,j_{n+1}}}{1-\gamma}\right)\gamma^{\,k_{n+1}}.$ (2.18)

From the forms of Relations (2.17) and (2.18) it follows that $\mu_{j_n,k_n}$, set by the equality

$\mu_{j_n,k_n} = \frac{F_{j_n,k_n}}{q},$

does not depend on q.
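A minimal sketch of the complete-cycle recursion (2.17)-(2.18), as we read it, also checks that the memory function μ = F/q of Definition 2.11 does not depend on q; the values of γ and the (j, k) cycle lengths are hypothetical:

```python
def cycle_education(q, gamma, cycles):
    """Education F after complete education cycles, per (2.17)-(2.18):
    each cycle has j steps with tantamount emotions of elementary
    education q, then k slack steps of pure forgetting."""
    F = 0.0
    for j, k in cycles:
        F = (F * gamma**j + q * (1 - gamma**j) / (1 - gamma)) * gamma**k
    return F

gamma = 0.7
cycles = [(3, 2), (5, 1)]        # hypothetical (j, k) pairs

# Definition 2.11: the memory function mu = F/q does not depend on q.
mu_a = cycle_education(1.0, gamma, cycles) / 1.0
mu_b = cycle_education(4.0, gamma, cycles) / 4.0
assert abs(mu_a - mu_b) < 1e-9
```

The independence from q is immediate from the linearity of the recursion in q, which is exactly what makes μ a unitless measure of memory rather than of emotion strength.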
Since q = const is valid, $\mu_{j_n,k_n}$ is a unitless measure for assessing the education obtained by the robot in n complete education cycles.

Definition 2.11. The function $\mu_{j_n,k_n}$ is a memory function.

It is evident that the memory function shows to what extent tantamount educational emotions are memorized by the robot in the course of the educational process.

Let U define the value equal to the maximal (satiated) education. Assuming that the emotions are tantamount and the memory coefficients are equal to one and the same constant, we pass to the limit in both parts of Equality (2.2) under the quantity of time steps tending to infinity. As a result we get

$\lim_{i\to\infty} r_i(t) = (1-\gamma)\,U = q.$

So the robot's education R obtained in the first complete education cycle is determined by the formula

$R = \gamma^{\,k_1}\left(1-\gamma^{\,j_1}\right)U.$

It is easy to see that the function $G(k_1,j_1)$, satisfying the relation

$G(k_1,j_1) = \frac{R}{U} = \gamma^{\,k_1}\left(1-\gamma^{\,j_1}\right),$ (2.19)

determines the deviation of the education from its satiety: the closer $G(k_1,j_1)$ (with the given values $k_1$ and $j_1$) is to 1, the closer the robot's education is to its satiety, and vice versa.

Definition 2.12. The function $G(k_1,j_1)$ is a satiety indicator.

It is easy to see that the satiety indicator for fixed $k_1$ and $j_1$ has a maximal value when the condition

$\gamma = \left(\frac{k_1}{k_1+j_1}\right)^{\frac{1}{j_1}}$ (2.20)

holds true. Inserting (2.20) into Relation (2.19), we get the formula specifying the maximal value $G_{\max}$ of the satiety indicator at the end of the first complete upbringing cycle:

$G_{\max} = \left(\frac{k_1}{k_1+j_1}\right)^{\frac{k_1}{j_1}}\frac{j_1}{k_1+j_1}.$

Definition 2.13. The function

$B_{j_n,k_n} = \frac{F_{j_n,k_n}}{U}$

is a complete satiety indicator.

In conclusion of this chapter we give several statements concerning the non-uniformly forgetful robot with non-tantamount emotions.
It is easy to see that for this kind of robot, at the end of n complete education cycles, the general education function $V^{[n]}_{l_n,i_n}$, defining the education obtained during those cycles, satisfies the relations

$V^{[p]}_{l_p,i_p} = \left(V^{[p-1]}_{l_{p-1},i_{p-1}}\prod_{i=1}^{i_p}\gamma^{[p]}_i + \sum_{m=1}^{i_p} r^{[p]}_m\prod_{i=m+1}^{i_p}\gamma^{[p]}_i\right)\prod_{k=1}^{l_p}\bar\gamma^{[p]}_k,\qquad p \in [2, n],$

$V^{[1]}_{l_1,i_1} = \left(\sum_{m=1}^{i_1} r^{[1]}_m\prod_{i=m+1}^{i_1}\gamma^{[1]}_i\right)\prod_{k=1}^{l_1}\bar\gamma^{[1]}_k,$

where the superscript [i] denotes variables corresponding to the i-th education cycle, $i \in [1, n]$; $\bar\gamma^{[p]}_k$ corresponds to the memory coefficients of the p-th cycle for time steps without emotional educations; k is the number of a time step without emotional educations; $l_p$ is the quantity of time steps in the p-th cycle without emotional effects; and $i_p$ is the quantity of time steps in the p-th education cycle with continuous emotional education effects.

Obviously, for the forgetful robot the following inequalities are valid:

$\left|V^{[p]}_{l_p,i_p}\right| \le F_{l_p,i_p},\qquad F_{l_{p+1},i_{p+1}} = \left(F_{l_p,i_p}\,\gamma^{\,i_{p+1}} + q\,\frac{1-\gamma^{\,i_{p+1}}}{1-\gamma}\right)\gamma^{\,l_{p+1}},\qquad p \in [2, n],$

$\left|V^{[1]}_{l_1,i_1}\right| \le F_{l_1,i_1},\qquad F_{l_1,i_1} = q\,\frac{1-\gamma^{\,i_1}}{1-\gamma}\,\gamma^{\,l_1},$

where $\gamma = \max\left(\gamma^{[p]}_i, \bar\gamma^{[p]}_j\right)$, $i \in [1, i_p]$, $j \in [1, l_p]$, $p \in [1, n]$.

Let us introduce the following definitions.

Definition 2.14. The generalized memory function $W^{[n]}_{l_n,i_n}$ is a value satisfying the relation

$W^{[n]}_{l_n,i_n} = \frac{V^{[n]}_{l_n,i_n}}{q}.$

Definition 2.15. The generalized education satiety indicator is the function

$\left|W^{[n]}_{l_n,i_n}\right|(1-\gamma).$

Based on the definitions given above we conclude that the generalized memory function and the generalized education satiety indicator are unitless functions. It is obvious that the generalized education satiety indicator satisfies the inequality

$0 \le \left|W^{[n]}_{l_n,i_n}\right|(1-\gamma) \le 1.$
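Closing the chapter, the satiety indicator (2.19) and the optimal memory coefficient (2.20) can be verified numerically; the values of $k_1$ and $j_1$ below are hypothetical:

```python
def satiety_indicator(gamma, k1, j1):
    """G(k1, j1) = gamma**k1 * (1 - gamma**j1), Eq. (2.19)."""
    return gamma**k1 * (1 - gamma**j1)

k1, j1 = 2, 4                                    # hypothetical cycle structure

# Eq. (2.20): the memory coefficient maximizing G for fixed k1 and j1.
g_opt = (k1 / (k1 + j1)) ** (1 / j1)

# The closed formula for G_max given in the text.
g_max = (k1 / (k1 + j1)) ** (k1 / j1) * j1 / (k1 + j1)
assert abs(satiety_indicator(g_opt, k1, j1) - g_max) < 1e-12

# A coarse grid search over gamma agrees with the analytic optimum.
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=lambda g: satiety_indicator(g, k1, j1))
assert abs(best - g_opt) < 1e-2
```

The grid search is redundant given the closed form, but it is a convenient sanity check when experimenting with non-constant memory coefficients, where no closed form is available.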
3. PARAMETERS OF A GROUP OF EMOTIONAL ROBOTS

Let us consider a problem connected with studying the emotional state of a group of robots. The theory given below represents one attempt to formalize the solution of this problem mathematically.

Definition 3.1. The sum (resulting) education of a group of n robots forming the set $\Omega_n$, based on the subject S(t), is computed as follows:

$W_{\Omega_n} = \sum_{i\in\Omega_n}\int_0^{t} a_i(S(\tau),\tau)\,S(\tau)\,d\tau.$ (3.1)

Suppose we have two groups including p and k robots and forming the two sets $\Omega_p$ and $\Omega_k$ correspondingly, where $\Omega_p \cup \Omega_k = \Omega_n$, $p + k = n$, $\Omega_p \cap \Omega_k = \emptyset$. Let us find out when the utmost psychological conflict between those groups can occur on one and the same class of subjects. It is obvious that, for instance, hatred (odium) is determined by opposite-signed sum educations of rival groups; it is also obvious that for the utmost confrontation between the robot groups to become possible, the equality $\frac{W_{\Omega_k}}{W_{\Omega_p}} = -1$ (where $W_{\Omega_p} \ne 0$) must hold true.

The converse proposition is valid: if the sum education of two groups is equal to zero and the education of at least one robot is nonzero, then the utmost confrontation between the two groups of robots is quite possible. Below we give the proof of this statement. Suppose $W_{\Omega_n} = 0$; then k and p can be selected so that $k + p = n$, and $\Omega_k$ and $\Omega_p$ can be selected so that $W_{\Omega_n} = W_{\Omega_k} + W_{\Omega_p} = 0$ is valid, i.e. $\frac{W_{\Omega_k}}{W_{\Omega_p}} = -1$ under $W_{\Omega_p} \ne 0$, which was required to be proved.

Based on this we get

Theorem 3.1. The necessary and sufficient condition for the utmost confrontation between robot groups including at least one robot with a nonzero education is that the sum education of those groups equals zero.

Obviously, the farther $W_{\Omega_k}$ is from zero, the worse the confrontation is. The given theorem helps us define the most rival pairs of robots or robot groups.
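Theorem 3.1 suggests a simple enumerative check. The sketch below uses hypothetical individual educations: it computes the sum education of every pair of disjoint sub-groups and flags the pairs whose sums nearly cancel:

```python
from itertools import combinations

# Hypothetical individual educations of four robots.
educations = {"A": 1.0, "B": 0.6, "C": -1.0, "D": -0.55}

def rival_pairs(educations, tol=0.1):
    """Pairs of disjoint sub-groups whose sum educations nearly cancel,
    i.e. candidates for the utmost confrontation of Theorem 3.1."""
    names = sorted(educations)
    found = []
    for s1 in range(1, len(names)):
        for g1 in combinations(names, s1):
            rest = [n for n in names if n not in g1]
            for s2 in range(1, len(rest) + 1):
                for g2 in combinations(rest, s2):
                    if g1 >= g2:          # count each unordered pair once
                        continue
                    w1 = sum(educations[n] for n in g1)
                    w2 = sum(educations[n] for n in g2)
                    if w1 != 0 and abs(w1 + w2) <= tol:
                        found.append((g1, g2))
    return found

pairs = rival_pairs(educations)
print(pairs[0])        # (('A',), ('C',)): educations 1.0 and -1.0 cancel exactly
```

The tolerance parameter reflects the remark below that sub-groups with sum educations merely close to zero already form rival risk groups; exact cancellation is the utmost case.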
To find out the pairs of rival groups it is enough to calculate each robot's education and then obtain the set of all possible sum educations (e.g., by enumerative technique, manually or by computer). Sets of robots with sum educations close to zero make up rival risk groups. It is easy to see that the more the sum education of a group differs from zero, the more united (or, better to say, more serried) this group is.

Suppose the sum education $W^{[1]}$ of the members of the first group, obtained in the course of several complete education cycles, satisfies the relation

$W^{[1]} = \sum_{j=1}^{n} V^{[1],j}_{l_p,i_p},$

and the corresponding sum education of the second group is computed by the formula

$W^{[2]} = \sum_{j=1}^{m} V^{[2],j}_{l_p,i_p},$

where the index [1] or [2] denotes belonging to Group 1 or Group 2, n is the quantity of robots in Group 1, and m is the quantity of robots in Group 2. Then the condition of rivalry between those groups is defined by the relation $W^{[1]} + W^{[2]} = 0$, which is equivalent to

$\sum_{j=1}^{n} V^{[1],j}_{l_p,i_p} + \sum_{j=1}^{m} V^{[2],j}_{l_p,i_p} = 0.$

Definition 3.2. Re-education (re-bringing-up) is a change of the education sign to the opposite one.

Obviously, Group 1 including k robots can re-educate Group 2 including p robots in its favour if the equality $\frac{W_{\Omega_k}}{W_{\Omega_p}} = Q$, where $Q < -1$ (i.e. $|W_{\Omega_k}| > |W_{\Omega_p}|$ and $W_{\Omega_p} W_{\Omega_k} < 0$), holds true by the beginning of the re-education process. The more Q differs from -1, the more effective this re-education is.

Definition 3.3. There is an emotional conflict in the group at the time $t_0$ if the sum of the emotions of all members of the group is equal to zero, i.e.

$\sum_{i=1}^{n} M_i(t_0) = 0.$

Obviously, if at the time $t_0$ the sum emotions and sum educations of the members of the group are equal to zero, then there is an open conflict threat at its utmost stage. Let us consider conditions of the conflict between uniformly forgetful robots with tantamount emotions.
According to the definitions given above, the limiting education $U_1$ of the first uniformly forgetful robot educated by tantamount emotions satisfies the relation

$$U_1 = \frac{q_1}{1 - \theta_1},$$

and the limiting education $U_2$ of the second uniformly forgetful robot, also educated by tantamount emotions, is defined by the relation

$$U_2 = \frac{q_2}{1 - \theta_2},$$

where $\theta_1$ and $\theta_2$ are memory coefficients and $q_1$, $q_2$ are the values of the corresponding elementary educations.

Suppose that in the course of an infinite education process the robots come to an education conflict. This implies that the formula $U_1 = -U_2$ is valid, and so is the relation

$$\frac{q_1}{1 - \theta_1} = -\frac{q_2}{1 - \theta_2}. \qquad (3.2)$$

Equality (3.2) allows us to compute the interdependence of the memory coefficients of two uniformly forgetful robots conflicting on tantamount emotions:

$$\theta_2 = 1 + \frac{q_2 (1 - \theta_1)}{q_1}. \qquad (3.3)$$

It is obvious that if the coefficients $\theta_1$ and $\theta_2$ are not connected by Relation (3.3), then Robot 1 and Robot 2 will never come to an education conflict in the limit.

Above, in Chapter 2, we showed that in the course of $j$ continuous education effects on Robot 1 and $i$ continuous education effects on Robot 2 the corresponding educations can be described as

$$R_j^{[1]} = q_1 \frac{1 - \theta_1^{\,j}}{1 - \theta_1}, \qquad R_i^{[2]} = q_2 \frac{1 - \theta_2^{\,i}}{1 - \theta_2}.$$

Then the condition of the onset of the conflict in the education process is given by the equality

$$q_1 \frac{1 - \theta_1^{\,j}}{1 - \theta_1} = -\,q_2 \frac{1 - \theta_2^{\,i}}{1 - \theta_2}. \qquad (3.4)$$

But we can state that if the memory coefficients $\theta_1$ and $\theta_2$ are not connected by Relation (3.3), then the conflict between the robots ceases with time by itself, i.e. without any extra emotional effects different from the already existing ones.

4. FRIENDSHIP BETWEEN ROBOTS: FELLOWSHIP (CONCORDANCE)

This chapter represents an attempt to introduce the term and concept of "friendship between robots", which we prefer to characterize as fellowship or concordance of robots. Here we introduce a couple of definitions.

Definition 4.1.
The group of robots is a united fellowship if the individual educations of each member are positive.

Definition 4.2. If the individual educations of a fellowship are not less than $P_0 > 0$, then $P_0$ is the fellowship value of this group.

Theorem 4.1. There exists $\xi$ such that the fellowship value of a fellowship is $\xi$.

Proof: As this group of robots is a fellowship, the individual educations $R_i$ ($i = 1, \dots, n$) of each member satisfy the condition $R_i > 0$. Therefore there exists a value $\xi > 0$ (for instance, $\xi = \min_i R_i$) such that the inequalities $R_i \geq \xi$, $i = 1, \dots, n$, hold true. This completes the proof of the theorem.

Definition 4.3. Suppose the individual educations of a group of $n$ robots are positive. The sum (total) fellowship value of the $n$ robots is the sum of all individual education values of the robots in this group.

Assume that a set of $n$ robots is divided into two sub-groups. Suppose the first sub-group, including $m$ robots, is the more united and affinitive of the two fellowships, and its fellowship value is $P_0$. So the sum (total) fellowship value $P$ of the first sub-group is computed by the equality $P = m P_0$. Assume the second sub-group includes $n - m$ robots and has the fellowship value $R_0$. Then the sum (total) fellowship value $A$ of the second sub-group is defined by the equality $A = (n - m) R_0$. Obviously, the sum (total) fellowship value $R$ of the two sub-groups is defined by the formula

$$R = P + A = m P_0 + (n - m) R_0. \qquad (4.1)$$

Assume the inequality $P_0 > R_0$ holds true. Suppose the members of the second sub-group are robots with equal tantamount emotions $q$, uniformly forgetful with equal memory coefficients $\theta$. We state the following problem: let us define the education condition for the robots of the second sub-group under which the fellowship coefficient of the second sub-group can become equal to or greater than the fellowship coefficient of the first sub-group as a result of education of the robots in the second sub-group.
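By the Chapter 2 recurrence $R_j = q + \theta R_{j-1}$, a robot starting from education $R_0$ holds the education $q(1-\theta^{\,j})/(1-\theta) + \theta^{\,j} R_0$ after $j$ steps of tantamount emotions. The least-steps question just posed can then be sketched by direct search (the function names and numbers are illustrative):

```python
def education_after(j, q, theta, r0):
    """Education after j steps of tantamount emotions q with memory
    coefficient theta, starting from education r0 (R_j = q + theta*R_{j-1})."""
    return q * (1 - theta**j) / (1 - theta) + theta**j * r0

def min_steps(q, theta, r0, p0, max_steps=10_000):
    """Least number of education steps after which the education reaches p0,
    or None if it never does within max_steps (cf. Theorem 4.2 below)."""
    for j in range(1, max_steps + 1):
        if education_after(j, q, theta, r0) >= p0:
            return j
    return None

print(min_steps(q=1.0, theta=0.5, r0=0.5, p0=1.8))  # → 3
print(min_steps(q=1.0, theta=0.5, r0=1.0, p0=3.5))  # → None (sup is 2 < 3.5)
```

The second call illustrates the no-solution case: the education never exceeds $q/(1-\theta) + R_0 = 3$, so the target $3.5$ is unreachable no matter how long the search runs.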
Based on (4.1) we conclude that this condition is determined by the inequality

$$m P_0 + (n - m) R^* \geq n P_0, \qquad (4.2)$$

where $R^*$ is the education value of each robot in the second sub-group after the education process has started. It is easy to see that Relation (4.2) is equivalent to the formula

$$R^* \geq P_0. \qquad (4.3)$$

Let us act simultaneously on each robot of the second sub-group by tantamount emotions until Condition (4.3) begins to hold true. Obviously, by the end of the education process the relation

$$q \frac{1 - \theta^{\,j}}{1 - \theta} + \theta^{\,j} R_0 \geq P_0$$

must hold true, where $j$ is the number of education time steps for the robots of the second sub-group. So, to find the least number of necessary education time steps with the given memory coefficients of the robots of the second sub-group we are to solve the following problem: find

$$\min_{j} \left\{\, j :\; q \frac{1 - \theta^{\,j}}{1 - \theta} + \theta^{\,j} R_0 - P_0 \geq 0 \right\}. \qquad (4.4)$$

Let us prove the theorem.

Theorem 4.2. If the relation

$$\frac{q}{1 - \theta} + R_0 < P_0$$

is valid, then Problem (4.4) has no solution.

Proof. Since the robots in the second sub-group are uniformly forgetful, the two-sided inequality $0 < \theta < 1$ holds true. So the statement of Theorem 4.2 yields a formula valid for any time step value $j$ (since $0 < \theta < 1$ and $R_0 > 0$):

$$q \frac{1 - \theta^{\,j}}{1 - \theta} + \theta^{\,j} R_0 < \frac{q}{1 - \theta} + R_0 < P_0.$$

This formula implies that the constraint in Problem (4.4) can never hold true. Therefore the problem has no solution under this theorem's statement. This completes the proof.

In other words, the theorem implies the following: education effects do not necessarily let the robots achieve equal fellowship (i.e. concordance) between the members of a group with a given fellowship value.

5. EQUIVALENT EDUCATION PROCESSES

Definition 5.1.
The equivalent education process is a continuous education process corresponding to an education with tantamount emotions and equal memory coefficients, featuring the minimal deviation, at all the education assessment node points, from the values of a real continuous education process of a robot.

5.1. MATHEMATICAL MODEL OF EQUIVALENT EDUCATION PROCESSES

Suppose the education values of a real continuous process are established at the end of each period by the values $R_j$, $j = 1, \dots, n$, where $n$ is the total number of education time steps. Also suppose the conditions

$$R_{j+1} > R_j > 0, \qquad j = 1, \dots, n-1, \qquad (5.1)$$

are valid. Now we approximate the real education process by an equivalent education process. To do this we need to find $q$, $\theta$ under which the objective function

$$J(q, \theta) = \sum_{j=2}^{n} \left[ R_j - \theta^{\,j-1} R_1 - q \frac{1 - \theta^{\,j-1}}{1 - \theta} \right]^2 \qquad (5.2)$$

reaches its minimum. So, in order to develop the equivalent education process we need to solve the equation set

$$\frac{\partial J(q, \theta)}{\partial \theta} = 0, \qquad \frac{\partial J(q, \theta)}{\partial q} = 0. \qquad (5.3)$$

Considering Relation (5.2), Equation set (5.3) in its expanded version takes the form

$$\sum_{j=2}^{n} \left[ R_j - \theta^{\,j-1} R_1 - q \frac{1 - \theta^{\,j-1}}{1 - \theta} \right] \frac{1 - \theta^{\,j-1}}{1 - \theta} = 0, \qquad (5.4)$$

$$\sum_{j=2}^{n} \left[ R_j - \theta^{\,j-1} R_1 - q \frac{1 - \theta^{\,j-1}}{1 - \theta} \right] \left[ (j-1)\theta^{\,j-2} R_1 + q \,\frac{(1 - \theta^{\,j-1}) - (j-1)\theta^{\,j-2}(1 - \theta)}{(1 - \theta)^2} \right] = 0. \qquad (5.5)$$

Since for adequately selected time steps the solutions of Equation set (5.4)–(5.5) have to satisfy the conditions

$$0 < \theta < 1, \qquad q > 0, \qquad (5.6)$$

checking the validity of (5.6) lets us estimate the adequacy of the equivalent process to the real education process. The coefficients $q$, $\theta$ solved out of Eqs. (5.4)–(5.5) allow us to find approximately the limiting value $Z$ of the education of the continuous process. Obviously, $Z$ satisfies the relation

$$Z = \lim_{j \to \infty} \left[ \theta^{\,j-1} R_1 + q \frac{1 - \theta^{\,j-1}}{1 - \theta} \right] = \frac{q}{1 - \theta}.$$
Let us assess the error in calculating the limiting education of the real continuous process through the equivalent education process. According to the formula of continuous education in the real process, the relation

$$R_j = r_j + \theta_j R_{j-1} \qquad (5.7)$$

holds true. In (5.7) we pass to the limit as the time tends to infinity:

$$\lim_{j \to \infty} R_j = \lim_{j \to \infty} r_j + \lim_{j \to \infty} \theta_j \cdot \lim_{j \to \infty} R_{j-1}. \qquad (5.8)$$

According to the theorem of education convergence, there exists the limit $\lim_{j \to \infty} R_j = D \geq 0$. Hence, Relation (5.8) is tantamount to the equality

$$D = \lim_{j \to \infty} r_j + D \lim_{j \to \infty} \theta_j,$$

so the value $D$ satisfies the relation

$$D = \frac{\lim_{j \to \infty} r_j}{1 - \lim_{j \to \infty} \theta_j}. \qquad (5.9)$$

Suppose the inequality

$$\frac{\lim_{j \to \infty} r_j}{1 - \lim_{j \to \infty} \theta_j} \geq \frac{q}{1 - \theta} \qquad (5.10)$$

holds true. Considering the last inequality and Relation (5.9) we get the following formula:

$$D - Z = \frac{\lim_{j \to \infty} r_j}{1 - \lim_{j \to \infty} \theta_j} - \frac{q}{1 - \theta} \leq \frac{M}{1 - \bar{\theta}} - \frac{q}{1 - \theta} = \frac{M(1 - \theta) - q(1 - \bar{\theta})}{(1 - \bar{\theta})(1 - \theta)}, \qquad (5.11)$$

where $M = \max_j r_j$, $\bar{\theta} = \max_j \theta_j$.

Let us consider the case corresponding to the inequality

$$\frac{\lim_{j \to \infty} r_j}{1 - \lim_{j \to \infty} \theta_j} \leq \frac{q}{1 - \theta}.$$

Obviously, in this case the limiting education error estimate satisfies the relations

$$Z - D = \frac{q}{1 - \theta} - \frac{\lim_{j \to \infty} r_j}{1 - \lim_{j \to \infty} \theta_j} \leq \frac{q}{1 - \theta} - \frac{\underline{M}}{1 - \underline{\theta}} = \frac{q(1 - \underline{\theta}) - \underline{M}(1 - \theta)}{(1 - \underline{\theta})(1 - \theta)}, \qquad (5.12)$$

where $\underline{M} = \min_j r_j$, $\underline{\theta} = \min_j \theta_j$.

Relations (5.11) and (5.12) allow us to get the error estimate $X$ of the limiting education under approximation of the real process by the equivalent education process. Obviously, in the general case it can be found by the formula

$$X = \max\left\{ \frac{M(1 - \theta) - q(1 - \bar{\theta})}{(1 - \bar{\theta})(1 - \theta)},\; \frac{q(1 - \underline{\theta}) - \underline{M}(1 - \theta)}{(1 - \underline{\theta})(1 - \theta)} \right\}.$$
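Relation (5.9) can be checked by simulating the recurrence $R_j = r_j + \theta_j R_{j-1}$ with sequences whose limits are known; the sequences below are illustrative, chosen so that $D = 1/(1 - 1/2) = 2$:

```python
# Numerical check of (5.9): D = lim r_j / (1 - lim theta_j).
# Illustrative sequences: r_j -> 1 and theta_j -> 1/2, both converging
# geometrically, so R_j should tend to D = 1 / (1 - 0.5) = 2.

def simulate(steps):
    R = 0.0
    for j in range(1, steps + 1):
        r_j = 1.0 + 0.5 ** j            # r_j -> 1
        theta_j = 0.5 - 0.3 * 0.5 ** j  # theta_j -> 1/2
        R = r_j + theta_j * R
    return R

D = 1.0 / (1.0 - 0.5)
print(abs(simulate(100) - D) < 1e-9)  # → True
```

Since $\theta_j < 1$ the recurrence is a contraction, so the early perturbations die out geometrically and the computed $R_{100}$ agrees with $D$ to machine precision.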
Analyzing Formulas (5.11) and (5.12) we can state that the worse the robot's emotional memory, the smaller the error estimate of the limiting education. Also, (5.11) and (5.12) allow us to state that the formula

$$\lim_{j \to \infty} R_j \approx \frac{q}{1 - \theta} \qquad (5.13)$$

holds true if the matter concerns a forgetful robot. By virtue of (5.1), Relation (5.13) allows us to find approximately the limiting education of a robot in the real education process on the basis of the equivalent education process. It is easy to see that (5.9) implies the relation

$$R_j \leq \frac{M}{1 - \bar{\theta}}, \qquad j = 1, 2, \dots,$$

which is an upper bound of the education value in the forgetful robot's real education process.

5.2. ALTERNATIVE TO AN OBJECTIVE FUNCTION UNDER COINCIDENCE OF TIME STEPS OF REAL AND EQUIVALENT EDUCATION PROCESSES

Let us introduce a simpler objective function whose minimization gives us the coefficients $\theta$ and $q$ defining the equivalent education process:

$$J(q, \theta) = \sum_{i=2}^{n} \left( R_i - q - \theta R_{i-1} \right)^2.$$

The validity of this objective function for designing an equivalent education process follows from the formula of education of a robot with tantamount emotions and equal memory coefficients:

$$R_i = q + \theta R_{i-1}.$$

In order to minimize this function let us solve the following equation set:

$$\frac{\partial J(q, \theta)}{\partial \theta} = 0, \qquad \frac{\partial J(q, \theta)}{\partial q} = 0.$$

Now we find the corresponding derivatives:

$$\frac{\partial J(q, \theta)}{\partial \theta} = -2 \sum_{i=2}^{n} \left( R_i - q - \theta R_{i-1} \right) R_{i-1}, \qquad \frac{\partial J(q, \theta)}{\partial q} = -2 \sum_{i=2}^{n} \left( R_i - q - \theta R_{i-1} \right).$$

Then the system takes the form

$$\sum_{i=2}^{n} \left( R_i - q - \theta R_{i-1} \right) R_{i-1} = 0, \qquad \sum_{i=2}^{n} \left( R_i - q - \theta R_{i-1} \right) = 0.$$

Simplifying, we get

$$\theta \sum_{i=2}^{n} R_{i-1}^2 + q \sum_{i=2}^{n} R_{i-1} = \sum_{i=2}^{n} R_i R_{i-1}, \qquad \theta \sum_{i=2}^{n} R_{i-1} + (n-1)\, q = \sum_{i=2}^{n} R_i.$$

The system is linear relative to $\theta$ and $q$, so let us express $\theta$ and $q$ through the $R_i$.
Out of the second equation we get

$$q = \frac{\displaystyle\sum_{i=2}^{n} R_i - \theta \sum_{i=2}^{n} R_{i-1}}{n - 1}.$$

Substitution of $q$ into the first equation gives

$$\theta = \frac{(n-1) \displaystyle\sum_{i=2}^{n} R_i R_{i-1} - \sum_{i=2}^{n} R_i \sum_{i=2}^{n} R_{i-1}}{(n-1) \displaystyle\sum_{i=2}^{n} R_{i-1}^2 - \left( \sum_{i=2}^{n} R_{i-1} \right)^2}.$$

Consequently,

$$q = \frac{\displaystyle\sum_{i=2}^{n} R_i \sum_{i=2}^{n} R_{i-1}^2 - \sum_{i=2}^{n} R_{i-1} \sum_{i=2}^{n} R_i R_{i-1}}{(n-1) \displaystyle\sum_{i=2}^{n} R_{i-1}^2 - \left( \sum_{i=2}^{n} R_{i-1} \right)^2}.$$

So, under known education values $R_i$, $i = 1, \dots, n$, of the real education process of a robot we get unique values of $\theta$ and $q$, for which the conditions $0 < \theta < 1$, $q > 0$ are to be valid. If the obtained values satisfy all the limitations mentioned above, then the coefficients $\theta$ and $q$ define the equivalent education process. If the obtained values do not satisfy those limitations, then it is not possible to develop any equivalent education process with the same time steps as in the real education process and with the corresponding educations $R_i$, $i = 1, \dots, n$, of the real education process.

The obtained coefficients $\theta$ and $q$ allow us to find approximately the limiting value of the real education process. Let $Z$ be the limiting value; then

$$Z = \lim_{i \to \infty} \left( q + \theta R_{i-1} \right) = q + \theta Z.$$

Out of this we get

$$Z = \frac{q}{1 - \theta}.$$

According to the formula of the continuous education process, the relation

$$R_i = r_i + \theta_i R_{i-1}$$

is valid.
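The closed-form solution above is ordinary least squares (a linear regression of $R_i$ on $R_{i-1}$). A sketch reproducing the worked example of this section, $R = (1, 3, 4)$, which yields $\theta = 0.5$, $q = 2.5$ and the limiting education $Z = q/(1-\theta) = 5$:

```python
# Least-squares fit of the equivalent education process R_i = q + theta*R_{i-1}
# using the closed-form Section 5.2 formulas.

def fit_equivalent_process(R):
    xs, ys = R[:-1], R[1:]                # pairs (R_{i-1}, R_i)
    m = len(xs)                           # number of pairs, i.e. n - 1
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    theta = (m * sxy - sy * sx) / (m * sxx - sx * sx)
    q = (sy - theta * sx) / m
    return q, theta

q, theta = fit_equivalent_process([1.0, 3.0, 4.0])
print(q, theta, q / (1 - theta))  # → 2.5 0.5 5.0
```

One would still check $0 < \theta < 1$ and $q > 0$ afterwards, as the text requires, before accepting the fitted pair as an equivalent education process.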
Having passed to the limit in this relation we get

$$\lim_{i \to \infty} R_i = \lim_{i \to \infty} r_i + \lim_{i \to \infty} \theta_i \cdot \lim_{i \to \infty} R_{i-1}.$$

According to Theorem 2.1 of the forgetful robot's education convergence at positive emotions, there exists the limit $\lim_{i \to \infty} R_i = D \geq 0$. Hence we get

$$D = \lim_{i \to \infty} r_i + D \lim_{i \to \infty} \theta_i, \qquad D = \frac{\lim_{i \to \infty} r_i}{1 - \lim_{i \to \infty} \theta_i}.$$

Let

$$\frac{\lim_{i \to \infty} r_i}{1 - \lim_{i \to \infty} \theta_i} \geq \frac{q}{1 - \theta};$$

then we get the following formula:

$$D - Z = \frac{\lim_{i \to \infty} r_i}{1 - \lim_{i \to \infty} \theta_i} - \frac{q}{1 - \theta} \leq \frac{M_1}{1 - \bar{\theta}_1} - \frac{q}{1 - \theta} = \frac{M_1 (1 - \theta) - q (1 - \bar{\theta}_1)}{(1 - \bar{\theta}_1)(1 - \theta)},$$

where $M_1 = \max_i r_i$, $\bar{\theta}_1 = \max_i \theta_i$.

Let us consider the case when

$$\frac{\lim_{i \to \infty} r_i}{1 - \lim_{i \to \infty} \theta_i} \leq \frac{q}{1 - \theta};$$

out of it we get the following formula:

$$Z - D = \frac{q}{1 - \theta} - \frac{\lim_{i \to \infty} r_i}{1 - \lim_{i \to \infty} \theta_i} \leq \frac{q}{1 - \theta} - \frac{M_2}{1 - \bar{\theta}_2} = \frac{q(1 - \bar{\theta}_2) - M_2 (1 - \theta)}{(1 - \bar{\theta}_2)(1 - \theta)},$$

where $M_2 = \min_i r_i$, $\bar{\theta}_2 = \min_i \theta_i$.

The obtained relations are necessary for computing the error of the limiting education under approximation of the real education process by the equivalent education process. The error $X$ is found by

$$X = \max\left\{ \frac{M_1(1 - \theta) - q(1 - \bar{\theta}_1)}{(1 - \bar{\theta}_1)(1 - \theta)},\; \frac{q(1 - \bar{\theta}_2) - M_2(1 - \theta)}{(1 - \bar{\theta}_2)(1 - \theta)} \right\}.$$

Analyzing the inequalities described above we conclude that the worse the robot's emotional memory, the smaller the error of the limiting education computation.

Example. Let us consider an example of equivalent education process development. Suppose the real education process includes three education time steps with the educations $R_1 = 1$, $R_2 = 3$, $R_3 = 4$. By the formulas given above we find

$$\theta = \frac{2 \cdot 15 - 7 \cdot 4}{2 \cdot 10 - 16} = \frac{2}{4} = 0.5, \qquad q = \frac{7 - 0.5 \cdot 4}{2} = 2.5.$$

At that, $0 < \theta < 1$ and $q > 0$ are valid. So we have approximated the real education process including three time steps with the real educations $R_1 = 1$, $R_2 = 3$, $R_3 = 4$ by the equivalent education process with tantamount emotions under $q = 2.5$ and equal memory coefficients $\theta = 0.5$. Based on the obtained values we can find the approximate value of the limiting education $Z$. Simple calculations lead to the following relation:

$$Z = \frac{q}{1 - \theta} = 5.$$

5.3. GENERALIZATION IN CASE OF NONCOINCIDENCE OF TIME STEPS OF REAL AND EQUIVALENT EDUCATION PROCESSES

Speaking about generalization, assume that the number of education time steps in the equivalent education process may differ from their number in the real education process. For instance, the end of the second time step of the real education process may coincide with the end of the second or later time step of the equivalent education process. Noncoincidence of time steps for education processes can occur due to randomness in the timing of educations of the real education process. Education values of the real process can be approximately restored for each time step in the course of development of the equivalent education process.

Assuming that the equivalent education process is continuous, we can suppose that during each time step our robot is affected by a tantamount emotion with the elementary education $q$. It is easy to see that the objective function can be presented as follows:

$$J(q, \theta, j_1, \dots, j_n) = \sum_{i=1}^{n} \left[ R_i - q \frac{1 - \theta^{\,j_i}}{1 - \theta} \right]^2, \qquad (5.14)$$

where $R_i$ is the education value of the real education process after its time step $i$, and $q \dfrac{1 - \theta^{\,j_i}}{1 - \theta}$ characterizes the education obtained as a result of the equivalent education process after its time step $j_i$. So, in order to develop the equivalent education process it is necessary to minimize Objective function (5.14). For that we need to solve the following equation set:

$$\frac{\partial J(q, \theta, j_1, \dots, j_n)}{\partial \theta} = 0, \qquad \frac{\partial J(q, \theta, j_1, \dots, j_n)}{\partial q} = 0.$$
Then the equation set for finding the unknowns takes the form

$$\sum_{i=1}^{n} \left[ R_i - q \frac{1 - \theta^{\,j_i}}{1 - \theta} \right] \frac{1 - \theta^{\,j_i}}{1 - \theta} = 0,$$

$$\sum_{i=1}^{n} \left[ R_i - q \frac{1 - \theta^{\,j_i}}{1 - \theta} \right] \frac{(1 - \theta^{\,j_i}) - j_i\, \theta^{\,j_i - 1}(1 - \theta)}{(1 - \theta)^2} = 0,$$

$$0 < \theta < 1, \qquad q > 0.$$

Example. Assume that $R_1 = 3$, $R_2 = 6$, $R_3 = 10$ hold true. Applying the cyclic data search method for minimization of Objective function (5.14) with an enumeration step equal to 0.1 for $q$ and $\theta$, an enumeration step equal to 1 for the $j_i$, and with variation intervals of $q$ between 0.1 and 2.9, of $\theta$ between 0.09 and 0.99, and of the $j_i$ between 1 and 100, we get the following values: $q = 0.2$, $\theta = 0.99$, $j_1 = 16$, $j_2 = 35$, $j_3 = 69$. Obviously, the limiting education equals $q/(1-\theta) = 20$. The computation results show that under the found parameters of the equivalent education process the value of (5.14) equals 0.0056, i.e. the developed equivalent education process approximates the real one quite closely.

6. METHOD OF APPROXIMATE DEFINITION OF THE MEMORY COEFFICIENT FUNCTION

In Chapter 2 we proved the following equality for the beginning of each time step:

$$\theta_i(0) = 1, \qquad i = 1, 2, \dots \qquad (6.1)$$

Now let us express the memory coefficients $\theta_i(t)$ in the following form:

$$\theta_i(t) = a_i t + b_i,$$

where $a_i, b_i$ are constants that do not depend on the current time $t$ of the emotion effect. According to (6.1) and the relations for finding the coefficients $a_i, b_i$ we can work out the following equation system:

$$a_i \cdot 0 + b_i = 1, \qquad (6.2)$$

$$a_i \left( t_i - t_{i-1} \right) + b_i = \theta, \qquad (6.3)$$

where $t_{i-1}$, $t_i$ are the times of the beginning and the end of the $i$-th time step, and $\theta$ is the memory coefficient of the equivalent process. We obtain relations allowing us to find the unknown values in Equation system (6.2)–(6.3), provided that the parameters of the equivalent process are found on the basis of the objective function given in Section 5.2.
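Sections 5.2 and 6 combine into a short computation: fit $\theta$ by the least-squares formulas of Section 5.2, then recover each linear memory-coefficient function $\theta_i(t) = a_i t + b_i$ from (6.2)–(6.3). A sketch (helper names and numbers are ours):

```python
# Chapter 6 sketch: on each time step, theta_i(t) = a_i * t + b_i with
# theta_i(0) = 1 (so b_i = 1) and theta_i reaching the equivalent-process
# coefficient theta at the end of the step of duration t_i - t_{i-1}.

def fit_theta(R):
    """Least-squares memory coefficient of Section 5.2 (regression of R_i on R_{i-1})."""
    xs, ys = R[:-1], R[1:]
    m = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    return (m * sxy - sy * sx) / (m * sxx - sx * sx)

def memory_coefficient_lines(R, times):
    """(a_i, b_i) per step, from (6.2)-(6.3): b_i = 1, a_i = (theta - 1)/(t_i - t_{i-1})."""
    theta = fit_theta(R)
    return [((theta - 1.0) / (t1 - t0), 1.0) for t0, t1 in zip(times, times[1:])]

# Worked example R = (1, 3, 4) with unit-length steps: theta = 0.5, so a_i = -0.5.
print(memory_coefficient_lines([1.0, 3.0, 4.0], times=[0.0, 1.0, 2.0]))
# → [(-0.5, 1.0), (-0.5, 1.0)]
```

With unit-length steps every slope is the same; unequal step durations would give different $a_i$ per step while each line still starts at $\theta_i(0) = 1$.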
It is easy to see that the sought-for values are found by the explicit formulas

$$b_i = 1, \qquad a_i = \frac{\theta - 1}{t_i - t_{i-1}}, \qquad \theta = \frac{(n-1)\displaystyle\sum_{i=2}^{n} R_i R_{i-1} - \sum_{i=2}^{n} R_i \sum_{i=2}^{n} R_{i-1}}{(n-1)\displaystyle\sum_{i=2}^{n} R_{i-1}^2 - \left(\sum_{i=2}^{n} R_{i-1}\right)^2},$$

where $n$ is the number of time steps for which successive values $R_i$ of the robot's education are known, and the time steps are defined by the values $t_{i-1}$, $t_i$, $i = 1, \dots, n$.

7. MATHEMATICAL MODEL OF FORMING TANTAMOUNT ROBOT SUB-GROUPS

This chapter describes one of the ways to make up groups of robots with equal sum educations. Let us consider a group of $k$ robots, where each robot has its order number $i$, $i = 1, \dots, k$. Suppose the robot $i$ has the education $R_i$. Then the sum education $A$ of the group of robots satisfies the relation

$$A = \sum_{i=1}^{k} R_i.$$

Problem: out of the set $\Omega$ including all the robots, make up sub-groups which are nonoverlapping subsets $\Omega_p$, $p = 1, \dots, n$ ($n < k$), $\bigcup_{p=1}^{n} \Omega_p = \Omega$, so that the sum education values of the obtained sub-groups differ from each other as little as possible.

Let us give the following definition and prove an auxiliary theorem.

Definition 7.1. The average education $F_p$ of the group $\Omega_p$ is a value satisfying the relation

$$F_p = \frac{\sum_{j \in \Omega_p} R_j}{N_p},$$

where $N_p$ is the number of robot units in the set $\Omega_p$.

Theorem 7.1. The sum education $A$ satisfies the equality

$$A = \sum_{i=1}^{n} N_i F_i.$$

Proof. It is easy to see the validity of the equality chain

$$N_i F_i = N_i \frac{\sum_{j \in \Omega_i} R_j}{N_i} = \sum_{j \in \Omega_i} R_j. \qquad (7.1)$$

Summing (7.1) with respect to all the values $i$ we get

$$\sum_{i=1}^{n} N_i F_i = \sum_{i=1}^{n} \sum_{j \in \Omega_i} R_j = \sum_{s=1}^{k} R_s = A,$$

i.e. $\sum_{i=1}^{n} N_i F_i = A$. The proof is complete.

Let us introduce the objective function in the form

$$J = \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \left( N_i F_i - N_j F_j \right)^2.$$
Now the problem put above can be mathematically described as follows: find

$$\min_{\{N_i\},\, \{F_i\}} J\bigl(\{N_i\}, \{F_i\}\bigr) \qquad (7.2)$$

under the limits

$$\sum_{i=1}^{n} N_i = k, \qquad \sum_{i=1}^{n} N_i F_i = A, \qquad N_i > 0, \qquad i = 1, \dots, n.$$

Problem (7.2) deals with the determination of a conditional extremum of a function of several variables, so it can be solved by the well-known Lagrange method. As a result of applying the Lagrange method to the solution of this problem we get the roots of the following equation system:

$$2 N_i \sum_{j=1,\, j \neq i}^{n} \left( N_i F_i - N_j F_j \right) + \lambda_2 N_i = 0, \qquad i = 1, \dots, n,$$

$$2 F_i \sum_{j=1,\, j \neq i}^{n} \left( N_i F_i - N_j F_j \right) + \lambda_1 + \lambda_2 F_i = 0, \qquad i = 1, \dots, n, \qquad (7.3)$$

$$\sum_{i=1}^{n} N_i - k = 0, \qquad \sum_{i=1}^{n} N_i F_i - A = 0,$$

where $\lambda_1, \lambda_2$ are the Lagrange method auxiliary variables. In the general case, the question of the existence and uniqueness of a solution of the nonlinear algebraic equation set (7.3), and of mathematical ways of solving it, is still open.

Now let us consider a task which is slightly different, though similar to Problem (7.2) in its statement. In the new problem statement we suppose that the quantities of robots $N_p$ in the groups $\Omega_p$ are already predetermined. It is easy to see that in this case the mathematical statement of the problem takes the following form: find

$$\min_{\{F_i\}} J\bigl(\{F_i\}\bigr) \qquad (7.4)$$

under

$$\sum_{i=1}^{n} N_i F_i = A.$$

According to the Lagrange method, the solution of Problem (7.4) is reduced to just finding the roots of the linear equation set

$$2 N_i \sum_{j=1,\, j \neq i}^{n} \left( N_i F_i - N_j F_j \right) + \lambda N_i = 0, \qquad i = 1, \dots, n, \qquad (7.5)$$

$$\sum_{i=1}^{n} N_i F_i - A = 0,$$

where $\lambda$ is the Lagrange method auxiliary variable. It is easy to show that the major determinant of this equation set is nonzero (e.g., the case $n = 2$ means that the group is split into two sub-groups), i.e.
with such $n$ this set always has a unique solution.

Definition 7.2. Sub-groups with the values $F_i$, $i = 1, \dots, n$, obtained in the solution of Problem (7.4) are tantamount ones.

Definition 7.3. Sub-groups with the values $F_i$, $i = 1, \dots, n$, which solve Problem (7.4) and make the objective function $J$ reach a minimum equal to zero are absolutely tantamount sub-groups.

Let us define simple conditions under which the sub-groups being formed are absolutely tantamount. The minimum of the function $J(\{F_i\})$ obviously equals zero when the relations

$$N_i F_i = N_j F_j, \qquad i, j = 1, \dots, n, \quad i \neq j, \qquad \sum_{i=1}^{n} N_i F_i = A,$$

hold true. It is easy to see that under $n = 2$ the sub-groups become absolutely tantamount when the relations

$$F_1 = \frac{A}{2 N_1}, \qquad F_2 = \frac{A}{2 N_2}$$

hold true.

The solution of Problem (7.4) gives numerical values of abstract average educations which may not coincide with the real average educations of the sub-groups being formed. This is connected with the fact that the average educations of all real sub-groups are known values and, consequently, absolutely tantamount sub-groups might not be obtainable from the educations of the individual robot units. This is also the reason why it is not always possible to split a set of robots into tantamount sub-groups.

8. ALGORITHM FOR FORMING TANTAMOUNT SUB-GROUPS OF ROBOTS

Below we give an algorithm for making up real robot sub-groups closest to tantamount ones:

1. Set up the values $N_1, \dots, N_n$ determining the number of robots in each sub-group being formed, with $\sum_{i=1}^{n} N_i = k$.

2. Make up the array $Z$ of different pools of sets $\Omega_{N_1, y}, \dots, \Omega_{N_n, y}$, $y = 1, \dots, q$ (where $q$ is the number of set pools in the array $Z$), such that $|\Omega_{N_i, y}| = N_i$ and $\Omega_{N_i, y} \cap \Omega_{N_j, y} = \varnothing$ for $i \neq j$, $i, j = 1, \dots, n$.

3. Based on Step 2, find the value of the function $J(\{F_i\})$ for each pool of sets $\Omega_{N_1, y}, \dots, \Omega_{N_n, y}$.

4.
Define the values of $y$ for which the corresponding sets make the objective function $J(\{F_i\})$ reach its minimum.

5. Arrange a visual output of the sets $\Omega_{N_1, y}, \dots, \Omega_{N_n, y}$ corresponding to the minimum values of $J(\{F_i\})$.

Note that, performing Step 2 on a computer, one may use the well-known computer algorithms of combinatorial analysis given in [8]. Having selected the sets including robot sub-groups with the closest sum educations, we can assess their equivalence, i.e. to what extent those sub-groups are tantamount to each other, by comparing the average educations of those sub-groups with the values $F_i$ which result from the solution of Problem (7.4). For assessing the closeness $V$ of the formed sub-groups to the tantamount ones we suggest applying the following formula:

$$V = \max_{i = 1, \dots, n} \left| F_i - D_i \right|,$$

where $D_i$, $i = 1, \dots, n$, are the real average educations of each formed sub-group. Obviously, the nearer $V$ is to zero, the closer the formed sub-groups are to the tantamount ones.

To detect sub-groups of robots grouped according to their education levels out of a general set we suggest applying the well-known algorithms of cluster analysis [9]. These algorithms may, for instance, help to detect whether robots belong to leading or lagging sub-groups.

9. APPLYING VECTOR ALGEBRA RULES TO INVESTIGATION OF ROBOT SUB-GROUP EMOTIONAL STATE

Here and below we use Cartesian rectangular coordinates.

Definition 9.1. A robot's education based on $n$ emotion types is the vector

$$\bar{R} = \left( R_1, R_2, \dots, R_j, \dots, R_n \right),$$

where each element of the vector — an education based on single-type emotions — is defined according to Relation (2.2). Introducing vectors of educations and emotions allows us to use the rules of vector algebra in mathematical operations with educations and emotions.
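Steps 1–5 of the algorithm can be sketched directly with subset enumeration; the partition generator and the example educations below are ours, not the book's:

```python
# Chapter 8 sketch: enumerate all splits of the robots into sub-groups of
# the prescribed sizes N_1, ..., N_n and keep the split minimizing the
# spread of the sub-group sum educations N_i * F_i (objective J of Ch. 7).
from itertools import combinations

def partitions(robots, sizes):
    """All ways to split `robots` into disjoint ordered groups of the given sizes."""
    if not sizes:
        yield ()
        return
    robots = tuple(robots)
    for group in combinations(robots, sizes[0]):
        remaining = tuple(r for r in robots if r not in group)
        for rest in partitions(remaining, sizes[1:]):
            yield (group,) + rest

def best_split(educations, sizes):
    def J(split):
        sums = [sum(educations[i] for i in g) for g in split]
        return sum((a - b) ** 2 for x, a in enumerate(sums) for b in sums[x + 1:])
    return min(partitions(range(len(educations)), sizes), key=J)

# Four robots, two sub-groups of two: the best split pairs 3.0 with 1.0
# and 2.0 with 2.0, making both sum educations equal to 4.0.
print(best_split([3.0, 2.0, 2.0, 1.0], sizes=(2, 2)))  # → ((0, 3), (1, 2))
```

The enumeration grows combinatorially with the number of robots, which is why the text points to the combinatorial-analysis algorithms of [8] for the computer realization of Step 2.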
Thus, the group education R including m robots can be found by the formula     m k k R R 1 , (9.1) and the group emotion M can be found by 40     m k k M M 1 , (9.2) where k is an order number of a robot in its group. Note that with m 1, q > 1) and given 1  and 2  Eq. (15.4) is not valid. Let us introduce one more definition. Definition 15.4 . Anti - stupor coefficients are the memory coefficients 1  and 2  for which under any integral values j and q ( j > 1, q > 1) Eq. (15.4) does not become valid. Theorem 15.5. Anti - stupor memory coefficients exist. Proof . Let us show that there exist the memory coefficien ts 1  and 2  which are not the roots of Eq. (15.4) under any integral values j and q ( j > 1, q > 1). 63 Obviously, Eq. (15.4) is equivalent to       0 1 1 1 1 1 2 1 2       q q j       . (15.5) Suppose the following equalities 3 1 , 2 1 2 1     (15.6) hold true. If we substitute Eqs. (15.6) into Eq. (15.5) and make tran sformations, as a result we get     0 2 2 1 3 1 2 3 1 1 1         q q j q j . (15.7) Considering that 1 3   j y , Eq. (15.7) takes the form     0 2 2 1 1 2 3 1 1        q q q y y . (15.8) Solving (15.8) relative to y we get the formula 2 2 2 1 1      q q y equivalent to the relation 2 2 2 3 1 1 1       q q j . (15.9) Since according to the theorem statement j > 1 is valid, then for any j and any q > 2 the positive value in the left part of Eq. (15.9) is equal to the negative value in the right part of Eq. (15.9). So, we get the contradiction. Consequently, 3 1 , 2 1 2 1     are not the roots of Eq. (15.4) with any values j > 1 and q > 2. Now let us consider the case when q =2. It easy to see that Eq. (15.8) in this case takes the form 2 = 0, i.e. under the memory coefficients 3 1 , 2 1 2 1     this equation has no solution. So, with any j > 1, q > 1, there are such memory coefficient values under which Eq. (15.4) makes no sense. 
Consequently, anti-stupor memory coefficients do exist. This completes the proof of Theorem 15.5.

Corollary 15.5. For two players the coefficients $\theta_1 = \frac{1}{2}$, $\theta_2 = \frac{1}{3}$ are anti-stupor memory coefficients.

Its proof follows directly from the argumentation given in the proof of Theorem 15.5.

Eq. (15.4) and Corollary 15.5 allow us to forecast the robot's behavior and see whether our robot may get into emotional stupor. Reasoning from what was said above, we can state that a 'resolute' or 'purposeful' robot is a machine for which the alternate selection angle never equals $\pi/4$, or Eq. (15.4) never holds true, or its memory coefficients are anti-stuporous, so that this machine does not get stuporous regarding any component of the education vector.

16. GENERALIZATION OF ROBOT'S EMOTIONAL BEHAVIOR RULES IN CASE THE NUMBER OF PLAYERS INTERACTING WITH THE ROBOT IS ARBITRARY (NOT SPECIFIED)

16.1. FIRST RULE OF ALTERNATE SELECTION

Assume a robot is affected by $n$ players nonsimultaneously. Suppose they initiate only positive emotions and the robot possesses an absolute emotional memory, i.e. its memory coefficients $\theta_{i,j}$ satisfy the identity $\theta_{i,j} \equiv 1$, where $i = 1, \dots, m_j$, $j = 1, \dots, n$. Correspondingly, $m_j$ is the number of subject effects of the $j$-th player.

At the time point $t_{1,k}$ (with $k = 1, \dots, m_1$) the first player initiates the emotion $M_{1,k}$ causing the elementary education

$$R_{1,k} = \int_{0}^{t_{1,k}} M_{1,k}(\tau)\, d\tau$$

and the education $\bar{B}_1 = (R_1, 0, \dots, 0)$ with

$$R_1 = \sum_{l=1}^{m_1} \int_{0}^{t_{1,l}} M_{1,l}(\tau)\, d\tau.$$

At the same time all the other $n - 1$ players initiate zero emotions. At the time $t_{i,k}$, where $k = 1, \dots, m_i$ and $t_{i,k} \neq t_{i_1,k_1}$ for $i \neq i_1$, $k_1 = 1, \dots, m_{i_1}$, the player $i$ initiates the emotion $M_{i,k}$ causing the elementary education

$$R_{i,k} = \int_{0}^{t_{i,k}} M_{i,k}(\tau)\, d\tau$$

and the education $\bar{B}_i = (0, \dots, 0, R_i, 0, \dots, 0)$ with $R_i$ in the $i$-th element, where

$$R_i = \sum_{l=1}^{m_i} \int_{0}^{t_{i,l}} M_{i,l}(\tau)\, d\tau.$$
At the same time all the other $n - 1$ players initiate zero emotions.

Let us introduce the general education vector $\bar{V} = (R_1, R_2, \dots, R_n)$ whose components are the sum educations (resulting from all the players' subjects) obtained over the full course of the effect time $t$, with

$$t = \sum_{l=1}^{n} \sum_{k=1}^{m_l} t_{l,k}.$$

With these designations introduced, the rule of deciding in favor of this or that player can be formulated as follows: the emotional decision is made in favor of the player $i$ for which the minimal angle $\min_{i = 1, \dots, n} (\widehat{\bar{V}, \bar{B}_i})$ between $\bar{V}$ and $\bar{B}_i$ is reached. In case the minimum is reached under several $i$ simultaneously, the emotional selection is not performed and the decision is not made.

The given rule can be generalized to the case when the player's effect initiates not just a single emotion, but a full vector of emotions. Thereby at $t_{i,k}$, with $k = 1, \dots, m_i$ and $t_{i,k} \neq t_{i_1,k_1}$ for $i \neq i_1$, $k_1 = 1, \dots, m_{i_1}$, the player $i$ initiates the robot's emotion vector $\bar{M}_{i,k} = (M^{1}_{i,k}, \dots, M^{r}_{i,k})$, which entails the vector of elementary educations

$$\bar{R}_{i,k} = \left( R^{1}_{i,k}, \dots, R^{r}_{i,k} \right), \qquad R^{j}_{i,k} = \int_{0}^{t_{i,k}} M^{j}_{i,k}(\tau)\, d\tau,$$

and the education $\bar{B}_i = (0, \dots, 0, R^{1}_i, \dots, R^{r}_i, 0, \dots, 0)$, whose nonzero elements occupy the $i$-th block of $r$ coordinates, with

$$R^{j}_i = \sum_{l=1}^{m_i} \int_{0}^{t_{i,l}} M^{j}_{i,l}(\tau)\, d\tau.$$

At the same time all the other $n - 1$ players initiate zero emotions. In this case the general education vector takes the form

$$\bar{V} = (R_1, R_2, \dots, R_n) = \left( R^{1}_1, \dots, R^{r}_1, \dots, R^{1}_n, \dots, R^{r}_n \right).$$

Further reasoning is quite the same as given above for the case when the player's effect initiates a single emotion of the robot.

16.2. SECOND RULE OF ALTERNATE SELECTION

The Second rule of alternate selection is based on comparison of the moduli of the sum education vectors $\bar{B}_i$, $i = 1, \dots, n$.
This rule can be formulated as follows: the emotional decision is made in favor of the player i for which |B_i| reaches its maximum over i = 1, …, n. In case the maximum length is reached for several values of i simultaneously, the emotional selection is not performed and no decision is made.

16.3. ORTHOGONALITY OF EDUCATION VECTORS AND EQUIVALENCE OF ALTERNATE SELECTION RULES

As mentioned above, in this book we use Cartesian rectangular coordinates. According to Theorem 15.2, two vectors which have no common nonzero coordinates are orthogonal. Thus the vectors B_1, …, B_n are pairwise orthogonal.

Theorem 16.1. The First and Second rules of alternate selection are equivalent.

Proof. Let φ_i = ∠(V, B_i), 0 ≤ φ_i ≤ π/2, i = 1, …, n, be the angle between V and B_i. According to the rules of vector algebra and the orthogonality of B_1, …, B_n, the following relation holds true:

cos φ_i = |B_i| / |V|.

Obviously, if min ∠(V, B_i) is reached at i = k, then according to the First rule of alternate selection the decision is made in favor of the player k. From the formula given above it then follows that |B_k| > |B_j| for j ≠ k. Thus max |B_i| is reached at i = k, which corresponds to the Second rule of alternate selection.

On the other hand, if max |B_i| is reached at i = k, then according to the Second rule of alternate selection the decision is made in favor of the player k. In this case |B_k| > |B_j| for j ≠ k holds true, and by the formula given above we get φ_k < φ_j for j ≠ k. From this we conclude that min ∠(V, B_i) is reached at i = k, in accordance with the First rule of alternate selection. This completes the proof.

17. EMOTIONAL SELECTION AND CONFLICTS BETWEEN ROBOTS

It is not difficult to see that Eq.
(15.4) coincides completely with Formula (3.7) obtained while describing a conflict between two robots with equal tantamount emotions. This fact lets us conclude that inner emotional conflicts of a robot can be described by the same formulas as conflicts between different robots; consequently, the theory developed for groups of robots can be used for inner emotional conflicts of a single robot without any alterations. As an example we present the following theorem.

Theorem 17.1. If two uniformly forgetful robots have the same (equal) tantamount emotions, then there are robot memory coefficients under which the robots never get into an education conflict.

Proof. For conflicting robots Eq. (3.7) holds true; if the tantamount emotions are equal, (3.7) is transformed into Eq. (15.4). According to Theorem 15.5 there exist anti-stupor coefficients transforming Eq. (15.4) into a strict inequality. But these anti-stupor coefficients are at the same time the memory coefficients of two different robots and, moreover, with these coefficients the robots would never get into conflict. This completes the proof.

Logic leads us to a new definition.

Definition 17.1. Anti-conflict memory coefficients are memory coefficients of two different robots under which the robots never get into conflict.

Now we can give the following theorem.

Theorem 17.2. Anti-conflict memory coefficients of two uniformly forgetful robots with equal tantamount emotions coincide with anti-stupor coefficients.

The proof is analogous to that of Theorem 17.1.

Corollary 17.2. When the conditions of Theorem 17.2 are valid, the memory coefficients of the two robots θ_1 = 1/2 and θ_2 = 1/3 are anti-conflict.

Proof. According to Corollary 15.5, the anti-stupor coefficients satisfy the equalities θ_1 = 1/2, θ_2 = 1/3. By virtue of Theorem 17.2, anti-stupor coefficients are anti-conflict ones. So the corollary is proved.

18.
DIAGNOSTICS OF EMOTIONAL ROBOTS' "MENTAL DISEASES"

For better understanding of this chapter let us recall the definition of a robot's emotion given in the beginning of the book.

Definition 1.3. The robot's inner emotional experience function M(t) is called an 'emotion' if it satisfies the following conditions:

1. The function domain of M(t) is [0, t_0], t_0 > 0;
2. t_0 ≤ t* (note that this condition corresponds to emotion termination whether the subject effect is already over or not yet over);
3. M(t) is a single-valued function;
4. M(0) = 0;
5. M(t_0) = 0;
6. M(t) is a constant-sign function;
7. The derivative dM(t)/dt exists within the function domain;
8. There is only one point z within the function domain such that 0 < z < t_0 and dM(t)/dt |_{t=z} = 0;
9. dM(t)/dt > 0 for t < z;
10. dM(t)/dt < 0 for t > z.

Let us introduce a couple more definitions.

Definition 18.1. Let us consider a robot "healthy" if its inner emotional experience function is an emotion.

Definition 18.2. Let us consider a robot "ill" if its inner emotional experience function does not satisfy at least one of the conditions in the definition of emotion.

This definition allows us to introduce the concept of seriousness, or severity, of a robot's disease. Since Definition 1.3 includes 10 conditions, the degree of severity of a disease is characterized by the number H, taking integral values from 1 to 10, of conditions which do NOT hold true (those being the conditions under which the inner emotional experience function becomes an emotion). The more severe the disease, the greater H.

Definition 18.3. The vector X of disease symptoms is the vector composed of the numbers of the emotion conditions (given in Definition 1.3) which do not hold true.

Definition 18.4.
A robot's disease with the symptom vector X_1 is a special case of a robot's disease with the symptom vector X_2 if all the elements of the symptom vector X_2 occur among the elements of the symptom vector X_1.

Below we give examples of robots' diseases.

1. Let us take some inner emotional experience function f(t) satisfying all the conditions of being an emotion except Condition #2, i.e. the function differs from an emotion only in that t_0 > t*. Obviously, in this case the disease severity degree equals 1. We consider a robot with such an emotional experience function neurasthenic. It is also obvious that for neurasthenia the disease symptom vector has the form X = (2).

2. Let us take some inner emotional experience function f(t) satisfying all the conditions of being an emotion except Conditions #2, 5, 8, 10; f(t) = t^2 is a good example of such a function. Obviously, in this case the disease severity degree equals 4. A robot whose emotional experience function differs from an emotion in Conditions #2, 5, 8, 10 is psychopathic. For psychopathy the disease symptom vector has the form X = (2, 5, 8, 10).

The forms of the vectors in these examples lead us to conclude that the symptoms of neurasthenia and psychopathy have one thing in common, namely the unsatisfied Condition #2, and, according to Definition 18.4, psychopathy is a special case of neurasthenia.

Sometimes one unsatisfied condition of the emotion definition implies that some other conditions become unsatisfied, too. Let us consider the inner emotional experience function of the form

f(t) = P sin(2πt/t_0 + π/2),  P = const, P > 0, 0 ≤ t ≤ t_0.  (18.1)

At first sight Function (18.1) fails to satisfy only Condition #6, so the disease severity degree equals 1 and the symptom vector contains the single element X = (6). But this is not correct.
Applying mathematical analysis we can conclude that if Condition #6 is unsatisfied for the function f(t), then Conditions #4, 5, 7, 9, 10 are not satisfied as well. I.e. the disease severity degree equals 6, and the symptom vector satisfies the relation X = (4, 5, 6, 7, 9, 10).

The example of Formula (18.1) demonstrates a method (based on mathematical analysis) for detecting the major symptom of an emotional robot disease: the symptom whose elimination directly implies that all the remaining conditions become satisfied. Thus for Function (18.1) the major reason of a rather severe disease is that Condition #6 remains unsatisfied.

19. MODELS OF ROBOT'S AMBIVALENT EMOTIONS

Suppose we have the robot's emotion vector M^j(τ) defining ambivalent emotions. This vector takes the form M^j(τ) = (M_1^j(τ), …, M_n^j(τ)), with n the quantity of emotions displayed in the robot's ambivalent emotion and τ the current time of the emotion effect. If the education goal is known and defined by the vector A = (A_1, …, A_n), where A_i = const, i = 1, …, n, then the goal achievement extent ξ of the education process is specified by the following equality:

ξ(t) = Σ_{i=1}^n R_i^j(t) A_i / Σ_{i=1}^n A_i^2,  (19.1)

with: R_i^j(t) the robot's education obtained as a result of the effect of the i-th emotion (so that R_i^j(t) = r_i^j(t) + θ_i^j(t) R_i^{j−1}(t)); θ_i^j(t) the memory coefficient satisfying the relation θ_i^j(t) ∈ [0, 1]; j the order number of an education time step; t the time of the education process; r_i^j(t) the elementary education satisfying the relation

r_i^j(t) = ∫_0^τ M_i^j(ς) dς,  τ = t − t_{j−1}.

Differentiating (19.1) with respect to t, we obtain

dξ(t)/dt = Σ_{i=1}^n (dR_i^j(t)/dt) A_i / Σ_{i=1}^n A_i^2.
(19.2)

According to Chapter 2 the sum emotion V_i^j(t) satisfies the relation

V_i^j(t) = dR_i^j(t)/dt = dr_i^j(t)/dt + (dθ_i^j(t)/dt) R_i^{j−1}(t) + θ_i^j(t) dR_i^{j−1}(t)/dt.  (19.3)

It is easy to see that for a robot with an absolute memory this formula is equivalent to

V_i^j(t) = dR_i^j(t)/dt = M_i^j(t).

So Eq. (19.2) takes the form

dξ(t)/dt = Σ_{i=1}^n V_i^j(t) A_i / Σ_{i=1}^n A_i^2.  (19.4)

Modern psychologists believe that an emotion is positive if it makes an entity (a person or a robot) approach its preset goal. Thus if dξ(t)/dt > 0 holds true, the ambivalent vectorial emotion is positive; if dξ(t)/dt < 0 holds true, this ambivalent emotion is negative; if dξ(t)/dt = 0 holds true, it has no sign. But modern vector algebra in the general case does not operate with such terms as "positive" or "negative" vectors. Therefore let us advance the hypothesis that there is a unified scalar characteristic of the ambivalent emotion vector which specifies the sign of the ambivalent vectorial emotion. Obviously, this characteristic is the sign of the value dξ(t)/dt.

Let us introduce a series of definitions.

Definition 19.1. The average function ⟨f(t)⟩ of the robot's inner emotional experience is the function of the form

⟨f(t)⟩ = Σ_{i=1}^n V_i^j(t) A_i / Σ_{i=1}^n A_i  (19.5)

under the stipulation that Σ_{i=1}^n A_i ≠ 0 and t ∈ [0, t_0], where t_0 is the minimum of the time steps of the component emotions of the ambivalent emotion vector.

Thus the average function of the robot's inner emotional experience is the special function such that, when it is substituted for all the sum component emotions in the ambivalent emotion vector, the value dξ(t)/dt remains equal to its value without the substitution, i.e.

⟨f(t)⟩ Σ_{i=1}^n A_i / Σ_{i=1}^n A_i^2 = Σ_{i=1}^n V_i^j(t) A_i / Σ_{i=1}^n A_i^2

holds true for this substitution.

Definition 19.2.
The average emotion ⟨M(t)⟩ is an average function of inner emotional experience which is itself an emotion.

Definition 19.3. If an average function of inner emotional experience is not an emotion, the robot is considered mentally ill, and the ambivalent emotion causes the disease.

Definition 19.4. The average elementary education ⟨D⟩ is the value satisfying the relation

⟨D⟩ = ∫_0^{t_0} ⟨M(τ)⟩ dτ.

Definition 19.5. The average education ⟨R⟩ is the value specified by the formula

⟨R⟩ = Σ_{i=1}^n R_i^j A_i / Σ_{i=1}^n A_i,  with Σ_{i=1}^n A_i ≠ 0.

Definition 19.6. The prevailing emotion M_k(t) in the ambivalent emotion vector is the emotion whose order number k in the vector of ambivalent emotions satisfies

|r_k^j − ⟨D⟩| = min_{i=1,…,n} |r_i^j − ⟨D⟩|.

Definition 19.7. The prevailing elementary education is the elementary education corresponding to the prevailing emotion.

Obviously, for each current time step j of the robot's education there exist its own average function of inner emotional experience, average emotion, prevailing emotion, prevailing elementary education, average elementary education, average education and the value characterizing the ambivalent emotion sign.

Let the emotions M_i^j(t), i = 1, …, n, of the vector of ambivalent emotions have the form

M_i^j(t) = P_i^j sin(πt/t_0),  P_i^j = const, i = 1, …, n, 0 ≤ t ≤ t_0.  (19.6)

Now let us prove the theorem.

Theorem 19.1. If Σ_{i=1}^n A_i P_i^j ≠ 0, then for the robot with an absolute memory the average function of inner emotional experience corresponding to Eqs. (19.6) is an emotion.

Proof. It is quite easy to see that with (19.6) valid the value ⟨f(t)⟩ satisfies the relation

⟨f(t)⟩ = (Σ_{i=1}^n A_i P_i^j / Σ_{i=1}^n A_i) sin(πt/t_0).  (19.7)

Obviously (19.7) satisfies the definition of emotion. Quod erat demonstrandum.

Theorem 19.2.
If Σ_{i=1}^n A_i > 0, then the sign of the average emotion coincides with the sign of the ambivalent emotion.

Its proof becomes obvious when we compare Formulas (19.4) and (19.5).

Theorem 19.3. If Σ_{i=1}^n A_i < 0, then the sign of the average emotion and the sign of the ambivalent emotion are opposite.

Its proof is analogous to the proof of Theorem 19.2.

20. ABSOLUTE MEMORY OF ROBOTS

Let us consider robots with memory coefficients satisfying the equalities θ_i = 1, i = 1, …, n, where n is the number of time steps in the education process. Obviously, in this case the education R_n is defined by the formula

R_n = Σ_{i=1}^n r_i,  (20.1)

where r_i is the elementary education corresponding to the i-th time step. According to Eq. (20.1), the infinite education process R can be described by the equality

R = lim_{n→∞} R_n = Σ_{i=1}^∞ r_i.  (20.2)

Now let us formulate the following theorems.

Theorem 20.1. An infinite education process based on tantamount emotions for a robot with an absolute memory diverges.

Proof. Since the emotions are tantamount, the equalities r_i = q ≠ 0, i = 1, …, ∞, are valid. By virtue of them, Relation (20.2) takes the form

R = lim_{n→∞} Σ_{i=1}^n q = lim_{n→∞} nq = ±∞.

The theorem is proved.

Theorem 20.2. If an infinite education process converges, then the elementary educations on which this process is based tend to zero with an infinite increase in the number of time steps.

Proof. Since the education process converges, the series Σ_{i=1}^∞ r_i has a finite sum. Consequently, lim_{i→∞} r_i = 0. The theorem is proved.

Note one more thing: the convergence of the education process corresponds to the presence of education satiety under an increase in the number of time steps. Taking this into account we can rephrase Theorem 20.2 as follows: if an education process is satiated, then the elementary education in the basis of this process tends to zero with an infinite increase in the number of time steps.
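The contrast between a divergent education (Theorem 20.1) and a satiated one (Theorem 20.2) is easy to demonstrate numerically. The following sketch accumulates educations for a robot with an absolute memory; the particular sequences of elementary educations are illustrative assumptions.

```python
def education(elementary):
    """Education of a robot with absolute memory (theta_i = 1):
    by (20.1), R_n is simply the sum of the elementary educations.
    Returns the whole history R_1, R_2, ..., R_n."""
    total, history = 0.0, []
    for r in elementary:
        total += r
        history.append(total)
    return history

# Tantamount emotions (r_i = q = 0.5): the education grows without bound.
divergent = education([0.5] * 100)
# r_i = 1/2^i: the education is satiated (it converges to 1).
satiated = education([0.5 ** i for i in range(1, 100)])
print(divergent[-1], round(satiated[-1], 6))  # 50.0 1.0
```

The second sequence illustrates the necessary condition of Theorem 20.2: its elementary educations tend to zero, and the education process exhibits satiety.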
In Chapter 1 we gave an example of an emotion described by the function

M(t) = P sin(πt/t_0),

where P = const and t_0 is the time step length. Similarly to this example, we define the emotions corresponding to the i-th time step by

M_i(t) = P_i sin(πt/t_{0i}),  (20.3)

where P_i = const and t_{0i} is the length of the i-th time step, i = 1, …, ∞. It is easy to see that the elementary education r_i corresponding to Emotion (20.3) satisfies the equality

r_i = 2 P_i t_{0i} / π.

So, by virtue of Theorem 20.2, if an education converges, the equality lim_{i→∞} P_i t_{0i} = 0 has to be valid; this is the necessary convergence condition.

Let us prove the following theorems.

Theorem 20.3. If lim_{i→∞} t_{0i} = 0, then lim_{i→∞} r_i = 0.

Proof. According to Definition 1.3, |P_i| ≤ L < ∞ is valid. Consequently, the chain of relations

lim_{i→∞} |r_i| = (2/π) lim_{i→∞} |P_i| t_{0i} ≤ (2L/π) lim_{i→∞} t_{0i} = 0

holds true. This completes the proof of Theorem 20.3.

Theorem 20.4. If lim_{i→∞} P_i = 0, then lim_{i→∞} r_i = 0.

Proof. According to Definition 1.3, t_{0i} ≤ S < ∞ is valid. Consequently, the chain of relations

lim_{i→∞} |r_i| = (2/π) lim_{i→∞} |P_i| t_{0i} ≤ (2S/π) lim_{i→∞} |P_i| = 0

holds true. This completes the proof of Theorem 20.4.

As is obvious from the foregoing, the condition necessary for education convergence is satisfied if lim_{i→∞} t_{0i} = 0, or lim_{i→∞} P_i = 0, or lim_{i→∞} P_i t_{0i} = 0 holds true.

The following statement is obvious as well: if the limits of t_{0i} and P_i under an infinite increase in the number of time steps exist, and lim_{i→∞} t_{0i} ≠ 0 and lim_{i→∞} P_i ≠ 0 hold true, then the education process is divergent.

The theorems proved above point out one of the ways of designing robots with an absolute memory and without education satiety.
E.g., in order to develop this kind of robot it is enough to select the sequences of amplitudes P_i and time steps t_{0i} such that their limits under an infinite increase in the number of time steps i are nonzero. According to Theorem 20.1, an example of a divergent education is the education with tantamount emotions, i.e. when the conditions

P_i = P = const ≠ 0,  t_{0i} = t_0 = const,  i = 1, …, ∞,

hold true. To build a robot with a satiated education one may select a predetermined convergent series as the infinite education, then on its basis define the sequences P_i, t_{0i}, i = 1, …, ∞, satisfying Definition 1.3 and the statements of Theorems 20.3 or 20.4, and then, based on this selection, preset the emotions for each of the time steps by Formula (20.3).

Based on Chapter 3 we can state that an education conflict between two robots with an absolute memory occurs by the time point t if the following conditions are satisfied:

Σ_{k=1}^i r_k^[1] = Σ_{k=1}^j r_k^[2],  Σ_{k=1}^i τ_k^[1] = Σ_{k=1}^j τ_k^[2] = t,  (20.4)

where r_k^[1], r_k^[2] are the elementary educations of the first and the second robot, and τ_k^[1], τ_k^[2] are the corresponding education time steps of these robots.

Let the robots get their educations based on tantamount emotions with the corresponding elementary educations r_0^[1] and r_0^[2]. Consequently, reasoning from the conflict relations (20.4) we obtain i r_0^[1] = j r_0^[2], i.e. the conditions for the two robots to get into conflict at the time point t take the form

i r_0^[1] = j r_0^[2],  Σ_{k=1}^i τ_k^[1] = Σ_{k=1}^j τ_k^[2] = t.  (20.5)

If

τ_k^[1] = τ^[1] = const,  τ_k^[2] = τ^[2] = const,  (20.6)

hold true, then Relations (20.5) are equivalent to the formula

r_0^[2] / r_0^[1] = τ^[2] / τ^[1],

which defines the conditions for an education conflict to start between two robots with an absolute memory under tantamount emotions for each robot and equal time steps of these emotions.
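Conditions (20.5)-(20.6) can be checked mechanically. The following sketch enumerates the step counts (i, j) at which two absolute-memory robots with tantamount emotions and constant time steps reach a conflict; all numeric inputs are illustrative assumptions.

```python
def conflict_steps(r1, r2, tau1, tau2, max_steps=1000):
    """Step counts (i, j) at which two robots with absolute memory and
    tantamount emotions get into an education conflict: i*r1 == j*r2
    with equal elapsed times i*tau1 == j*tau2 (conditions (20.5)-(20.6)).
    r1, r2 - constant elementary educations; tau1, tau2 - time steps."""
    hits = []
    for i in range(1, max_steps + 1):
        for j in range(1, max_steps + 1):
            if abs(i * r1 - j * r2) < 1e-12 and abs(i * tau1 - j * tau2) < 1e-12:
                hits.append((i, j))
    return hits

# r2/r1 = tau2/tau1 = 2, so conflicts occur whenever i = 2j.
print(conflict_steps(1.0, 2.0, 0.5, 1.0, max_steps=4))  # [(2, 1), (4, 2)]
```

When the ratio r_0^[2]/r_0^[1] differs from τ^[2]/τ^[1], the list stays empty, in line with the equivalence derived above.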
For the emotions given by Formula (20.3), and considering Eqs. (20.6), we get the following equalities:

i P^[1] t_0^[1] = j P^[2] t_0^[2],  i t_0^[1] = j t_0^[2] = t,  (20.7)

where P^[1], P^[2] and t_0^[1], t_0^[2] are the amplitudes of the emotions and the values of the time steps of the first and the second robot, correspondingly. From (20.7) it is evident that in this case the conflict between the robots emerges only when the conditions P^[1] = P^[2] and i t_0^[1] = j t_0^[2] are valid.

We want to dwell on the relations determining fellowship (see Chapter 4) of robots with an absolute memory. In this case (under tantamount emotions with the elementary education q) the number of education time steps necessary for achieving fellowship (concordance) between two sub-groups with equal fellowship values can be found by solving the following problem: solve for

min_{j≥1} (jq − P_0)  (20.8)

under jq − P_0 ≥ 0, where P_0 is the preset fellowship value. It is easy to see that this problem always has a solution, which means that robots with an absolute memory can at any time be brought to fellowship with any fellowship value preset.

Now let us solve the problem of developing the equivalent educational process (see Chapter 5) for robots with an absolute memory. Obviously, in order to define the elementary education value q corresponding to the equivalent process, we have to solve the following problem: solve for

min_q J(q) = min_q Σ_{j=2}^n (R_j − R_1 − (j−1)q)^2.  (20.9)

Problem (20.9) can be reduced to solving the equation dJ(q)/dq = 0, whose expanded form is

Σ_{j=2}^n (R_j − R_1 − (j−1)q)(j−1) = 0.  (20.10)

It is easy to see that the solution q of Eq. (20.10) is defined by

q = (Σ_{j=2}^n (j−1) R_j − R_1 Σ_{j=2}^n (j−1)) / Σ_{j=2}^n (j−1)^2.

21. ALGORITHM OF EMOTIONAL CONTACTS IN A GROUP OF ROBOTS

In this chapter we suggest a rule of mutual contacts between robots in a group.
In Chapter 2 we showed that the robot's education R_i by the end of the i-th time step is specified by the formula

R_i = r_i + θ_i R_{i−1},  (21.1)

where θ_i is the robot's memory coefficient characterizing memorization of the education R_{i−1} by the end of the i-th education time step.

Suppose robots contacting each other in a group randomly exchange emotions which initiate elementary educations. Let R_i^[L] be the education of the L-th robot by the end of the i-th time step, and let r_i^[L] be the elementary education corresponding to this time step. Similarly, let us introduce the corresponding educations R_i^[j] and elementary educations r_i^[j] for the j-th robot.

Assume both robots are affected by the subject S(t) initiating the emotions M_i^[L] (robot L) and M_i^[j] (robot j). Let us consider that if R_{i−1}^[L] R_{i−1}^[j] > 0 is valid, then M_i^[L] M_i^[j] > 0 holds true, and that R_{i−1}^[L] R_{i−1}^[j] < 0 implies M_i^[L] M_i^[j] < 0. The emotions M_i^[L] and M_i^[j] initiate the elementary educations r_i^[L] and r_i^[j] correspondingly, where

r_i^[L] = ∫_0^{t_i} M_i^[L](τ) dτ,  r_i^[j] = ∫_0^{t_i} M_i^[j](τ) dτ,

and t_i is the length of the i-th time step. Obviously, the sign of the elementary education equals the sign of the emotion generating this education, and vice versa. Let us assume that the sign of the education by the end of the (i−1)-st time step equals the current sign of the emotion during the i-th time step and of the elementary individual education by the end of this time step.

Now let us introduce the following definition.

Definition 21.1. The suggestibility coefficient k_i^[j,L] is the value permitting the emotion i of the robot L to be replaced by the corresponding emotion of the robot j multiplied by the value of this coefficient, if |k_i^[j,L] r_i^[j]| > |r_i^[L]|, with k_i^[j,L] ≥ 0. It is obvious that k_i^[j,j] = 1.
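The replacement condition of Definition 21.1 can be sketched as a small helper; the particular numbers in the example are illustrative assumptions.

```python
import math

def exchanged(r_own, r_other, k):
    """Elementary education a robot actually absorbs in a paired
    contact (Definition 21.1): the partner's elementary education,
    scaled by the suggestibility coefficient k >= 0, replaces the
    robot's own one when it is larger in absolute value."""
    if k * abs(r_other) > abs(r_own):
        # adopt the scaled foreign emotion, keeping the partner's sign
        return math.copysign(k * abs(r_other), r_other)
    return r_own

print(exchanged(2.0, -1.0, 0.5))  # 2.0  (own emotion kept)
print(exchanged(2.0, -3.0, 1.5))  # -4.5 (the partner's emotion wins)
```

With k = 1 the robot simply adopts whichever emotion is stronger, matching the remark that a robot's suggestibility to itself equals one.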
Assume that when two robots come in contact and start communicating, the education of each of them (according to Formula (2.1)) satisfies the relations

R_i^[L] = r̃_i^[L] + θ_i^[L] R_{i−1}^[L],  R_i^[j] = r̃_i^[j] + θ_i^[j] R_{i−1}^[j],

with:

r̃_i^[L] = k_i^[j,L] |r_i^[j]| sign r_i^[j],  if max(|r_i^[L]|, k_i^[j,L] |r_i^[j]|) = k_i^[j,L] |r_i^[j]|;
r̃_i^[L] = r_i^[L],  if max(|r_i^[L]|, k_i^[j,L] |r_i^[j]|) = |r_i^[L]|;

r̃_i^[j] = k_i^[L,j] |r_i^[L]| sign r_i^[L],  if max(|r_i^[j]|, k_i^[L,j] |r_i^[L]|) = k_i^[L,j] |r_i^[L]|;
r̃_i^[j] = r_i^[j],  if max(|r_i^[j]|, k_i^[L,j] |r_i^[L]|) = |r_i^[j]|;

where k_i^[j,L] is the suggestibility coefficient of the j-th robot's emotions for the robot L, k_i^[L,j] is the suggestibility coefficient of the L-th robot's emotions for the robot j, k_i^[j,L] ≥ 0, k_i^[L,j] ≥ 0.

Let us introduce the following definitions.

Definition 21.2. With max(|r_i^[L]|, k_i^[j,L] |r_i^[j]|) = k_i^[j,L] |r_i^[j]| satisfied, the j-th robot is called the agitator.

Definition 21.3. Re-education (re-bringing-up) of a robot is a sign reversal of the robot's individual education.

Obviously, signs of individual educations of robots in a group can reverse only if there are both robots with oppositely signed educations and robot-agitators.

According to Theorem 3.1 proved in Chapter 3, a conflict in a group occurs only if the sum education of this group equals zero. Based on this we obtain the following theorem.

Theorem 21.1. A conflict in a group of robots can occur only if the initial educations of these robots are oppositely signed and there are agitators in this group.

This opens a way to software modeling of the emotional behavior of a closed group of intercommunicating robots.
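Such a simulation might look as follows. The random-emotion model, the seed and all numbers are illustrative assumptions; only the education update and the replacement rule come from the formulas above.

```python
import math
import random

def simulate(thetas, educations, suggest, steps, rng):
    """Sketch of the group-contact algorithm of Chapter 21. At every
    step a random pair (L, j) meets; each robot produces a random
    elementary education whose sign follows its current individual
    education, and a robot adopts the partner's suggestibility-scaled
    emotion when it is stronger in absolute value. suggest[L][j] is
    the suggestibility of robot L to robot j."""
    n = len(thetas)
    educations = list(educations)
    for _ in range(steps):
        L, j = rng.sample(range(n), 2)
        rL = math.copysign(rng.random(), educations[L])
        rj = math.copysign(rng.random(), educations[j])
        # replacement rule: the scaled foreign emotion wins if stronger
        aL = suggest[L][j] * abs(rj)
        aj = suggest[j][L] * abs(rL)
        new_rL = math.copysign(aL, rj) if aL > abs(rL) else rL
        new_rj = math.copysign(aj, rL) if aj > abs(rj) else rj
        # education update R_i = r_i + theta * R_{i-1}, Formula (21.1)
        educations[L] = new_rL + thetas[L] * educations[L]
        educations[j] = new_rj + thetas[j] * educations[j]
    return educations, sum(educations)

edu, total = simulate(
    thetas=[0.5, 0.5, 0.9],
    educations=[1.0, -1.0, 0.2],
    suggest=[[0, 0.8, 0.8], [0.8, 0, 0.8], [0.8, 0.8, 0]],
    steps=100,
    rng=random.Random(1))
print(round(total, 3))  # a sum education near zero would signal a conflict
```

Sweeping the suggestibility and memory coefficients in such runs is exactly the kind of numerical experiment described below for locating critical, conflict-producing values.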
The input parameters of the corresponding modeling software are the memory coefficients of each of the robots in the group, their initial individual educations and the paired suggestibility coefficients. As the software runs, the emotions of the robots are initiated at random, and the corresponding elementary educations occur due to random contacts of the robots. As a result we obtain the computed sum education specifying conflicts in the group, as well as the individual educations of each robot in this group. Due to numerical experiments it is possible to find critical values of suggestibility coefficients and memory coefficients causing conflicts in the group of robots after several paired contacts (contacts between two robots).

An algorithm of robots' behavior in a group with a leader differs from an algorithm of robots' behavior in a group without a leader in that in the first case, while selecting a robot-educator which is the major agitator in the group, it is necessary to find the order number of the greatest value of the robots' individual educations. The robot with this number is supposed to act the part of a permanent agitator-and-leader.

22. ON INFORMATION ASPECTS OF E-CREATURES

Currently U.S. researchers discuss the question concerning creation of an electronic copy of a human being, which can be called an E-creature [1]. We tried to study this idea of our American colleagues in terms of information. Let us make a series of remarks:

1. There is no human being with an absolute memory, i.e. he or she always forgets a part of perceived information, as this is a natural human feature.
2. A human being is able to accumulate information (without forgetting a part of it immediately) by finite portions.

Now let us give the following definitions:

Definition 22.1. A portion is an amount of new information which is remembered completely by a human being.

Definition 22.2. An information time step is the arrival time of a portion.
Let us note one obvious property of the portion: the number of bits s_i in the portion i is limited, i.e. there is a q for which the inequalities

0 ≤ s_i ≤ q < ∞,  i = 0, 1, …,

always hold true. Let us record the following formula according to the methods given in Chapter 2:

S_{i+1} = s_{i+1} + β_{i+1} S_i,  (22.1)

with: i the number of the information time step, i = 0, 1, …, n; s_{i+1} the (i+1)-st portion; S_{i+1} the total amount of information memorized by a human through i+1 information time steps; β_{i+1} the human information memory coefficient (it characterizes the part of the total memorized information received during the i previous information time steps).

Obviously, the human information memory coefficient corresponding to the end of an information time step satisfies the relation β_i ∈ [0, 1], and there is a β* with β_i ≤ β*, i = 0, 1, …, β* ∈ (0, 1). By virtue of the information property, s_i ≥ 0 holds true; consequently all the accumulated information is greater than or equal to zero.

Suppose we have created an electronic copy of a human. Let us prove one of the information properties of this copy.

Theorem 22.1. The total information S* which can be memorized by the processor of the human-like copy is limited.

Proof. Applying the methods given in Chapter 2, the portion properties and Eq. (22.1), we easily obtain the inequality

S_{i+1} ≤ q Σ_{k=0}^{i+1} (β*)^k.  (22.2)

Proceeding to the limit in Ineq. (22.2) with an infinite increase in the number of time steps (the time of existence of an immortal human), we get the chain of relations

S* = lim_{i→∞} S_{i+1} ≤ q lim_{i→∞} Σ_{k=0}^{i+1} (β*)^k = q / (1 − β*).

Thus the theorem is proved.

Corollary 22.1. It is impossible to create an E-creature with a nonabsolute memory which would be able to accumulate information infinitely.

Its proof is evident from the formulation of Theorem 22.1.
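The bound of Theorem 22.1 is easy to observe numerically. The sketch below iterates recurrence (22.1) with a constant memory coefficient (the book allows it to vary per step, so the constant value and the portion size are illustrative assumptions).

```python
def accumulate(portions, beta):
    """Total information of an E-creature with a constant information
    memory coefficient beta, iterating S_{i+1} = s_{i+1} + beta * S_i
    (Formula (22.1)) over the given sequence of portions."""
    S = 0.0
    for s in portions:
        S = s + beta * S
    return S

q, beta = 1.0, 0.8
S = accumulate([q] * 500, beta)
print(S, q / (1 - beta))  # S approaches but never exceeds q/(1-beta) = 5.0
```

Even after 500 maximal portions the accumulated total stays at the geometric-series ceiling q/(1 − β*), which is exactly the limitation the corollary refers to.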
So we can conclude that it is impossible to create a unique, infinitely existing E-creature which would be an evolving copy of a human being (at least in terms of information).

An immortal (infinitely existing) electronic creature able to accumulate information infinitely [1] is possible only in case it has an absolute information storage (information memory), with the conditions β_i = 1, i = 1, …, ∞, satisfied; but this sort of creature would have nothing to do with a human being, forgetful and oblivious; it could be called just a robot unit.

For the infinite information evolution of an E-creature it is necessary that the information from a chip of the "ancestor" E-creature with a nonabsolute memory should be downloaded to a chip of the "successor" E-creature (also with a nonabsolute memory) when the amount of accumulated information becomes close to S*. For the purpose of further data accumulation by the E-creature (which is a copy of a human being with a nonabsolute memory) it is necessary to re-download all the information from the ancestor's chip to the chip of the successor on a regular basis, i.e. s_0 is supposed to be equal to S_k, with k the number of information time steps performed by the E-creature in the full course of its existence.

Let us note one property of the information memory coefficients varying during the information time step, with t ∈ [t_i, t_{i+1}].

Theorem 22.2. β_{i+1}(t_i) = 1.

Proof. Similarly to (22.1), let us write the formula

S_{i+1}(t_i) = s_{i+1}(t_i) + β_{i+1}(t_i) S_i.  (22.3)

But at the initial moment of the information time step the relations

S_{i+1}(t_i) = S_i,  s_{i+1}(t_i) = 0  (22.4)

hold true. Substituting (22.4) into Relation (22.3) and solving the obtained equation for β_{i+1}(t_i), we get β_{i+1}(t_i) = 1, which was to be proved.
Let us define a linear dependence allowing us to describe approximately the change of the information memory coefficient during the information time step. Obviously,

S_{i+1} = s_{i+1} + β_{i+1}(t_{i+1}) S_i.

Consequently,

β_{i+1}(t_{i+1}) = (S_{i+1} − s_{i+1}) / S_i  (22.5)

holds true. Suppose that β_{i+1}(t) = a(t − t_i) + b holds true. By Theorem 22.2 and Formula (22.5) the system of linear equations

β_{i+1}(t_i) = b = 1,  (22.6)
β_{i+1}(t_{i+1}) = a(t_{i+1} − t_i) + b  (22.7)

holds true. Solving the system of equations (22.6)-(22.7) we get

a = (β_{i+1}(t_{i+1}) − 1) / (t_{i+1} − t_i),  b = 1.

Thus we can write down the following formula:

β_{i+1}(t) = 1 + ((S_{i+1} − s_{i+1})/S_i − 1) (t − t_i)/(t_{i+1} − t_i),  t ∈ [t_i, t_{i+1}].

It is easy to see that many propositions and provisions of the emotional robot theory given in the previous chapters can be easily adapted to the aspects of data accumulation by the E-creature. We suggest that you, our dear reader, should do it yourself, as some brain exercise for pleasure at your leisure.

23. SOFTWARE REALIZATION OF SIMPLE EMOTIONAL ROBOT'S BEHAVIOR

In order to illustrate the theory given in Chapter 2, let us set the task of developing software which would model the emotional behavior of a robot receiving and responding to audible cues (sounds) put into this software through a microphone plugged into a computer. Assume this computer program is to execute the following: according to a sound amplitude, the program determines a type of "smile" which is output to a computer monitor as a response (reaction) to the sound effect (so finally we see different "shades" of sad or happy smiles).

23.1. INPUT PARAMETERS OF SOFTWARE

Assume the modeled robot is uniformly forgetful. As the input parameters for the model implemented by this software we use the robot's memory coefficient θ, equal to some constant value from 0 to 1, and the time step.

23.2.
ALGORITHM FOR MODELING ROBOT'S MIMIC EMOTIONAL REACTION

In this section we suggest an algorithm which helps to model the mimic emotional reaction of the robot triggered by a sound (audio signal). This algorithm represents a sequence of steps which makes a robot (software) react emotionally, with mimicry, to sounds produced by a human, an animal, etc. Let us present this algorithm as the following sequence of steps with some explanations:

1. Convert the analog sound signal received from a microphone into a sequence of numbers representing momentary values of the signal amplitude. Analog-to-digital (A/D) converters are quite suitable devices for this purpose, and the conversion method itself is called pulse-code modulation.
2. Collect the data necessary for the following analysis.
3. Analyze and aggregate the collected data.
4. Reveal and evaluate the degree of the predefined emotional stimuli. In other words, specify the values of the subjects affecting the robot (software). Predefined sound characteristics can be used as the emotional stimuli; the collection of the sound characteristic data is to be done at Steps 2 and 3.
5. Compute the momentary emotional characteristics of the robot (software) on the basis of the emotion and education model considered in Chapter 2.
6. Compute the elementary educations on the basis of the momentary emotional characteristics.
7. Compute the education on the basis of the elementary educations and the robot's (software's) memory coefficient, which is to be preset before the algorithm is started.
8. Enjoy a visualization of the robot's (software's) emotions based on the computed education.

Let us consider each step of the algorithm in more detail.

Step 1. In order to go through the first step we need an analog-to-digital converter. Every modern soundcard is usually equipped with one, so in order to get access to it we need to interact with the soundcard driver. This can be done in a variety of ways, some of which we consider below.
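The outcome of Step 1 can be sketched as follows. This is our illustration, not SoundBot's C++ code: instead of interacting with a soundcard driver it unpacks the momentary amplitude values from a pre-recorded mono wav file using Python's standard library (the pulse-code modulation has then already been performed by the recorder).

```python
# Sketch of Step 1 under an assumption: we read a pre-recorded mono wav file
# instead of talking to a soundcard driver as the book's C++ software does.
# A wav file stores the pulse-code-modulated signal, i.e. exactly the
# sequence of momentary amplitude values the algorithm needs.
import struct
import wave

def read_amplitudes(path):
    """Return the sequence of momentary amplitude values of a mono wav file."""
    with wave.open(path, "rb") as wav:
        n = wav.getnframes()
        width = wav.getsampwidth()   # bytes per sample
        raw = wav.readframes(n)
    if width == 1:                   # 8-bit samples are unsigned
        return list(struct.unpack(f"{n}B", raw))
    if width == 2:                   # 16-bit samples are little-endian signed
        return list(struct.unpack(f"<{n}h", raw))
    raise ValueError("unsupported sample width")
```

With the parameters of the book's example (22050 Hz sampling frequency, 8 bits, mono, a 10-second interaction) such a function would return exactly 220500 amplitude values.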
Generally speaking, it is very important to set up the conversion itself, i.e. its characteristics. It is necessary to select and preset the sampling frequency, the signal resolution (bit depth), the number of channels and other characteristics.

Step 2. This step deals with data collection from the soundcard in the course of pulse-code modulation. The data can be stored in a variety of ways, e.g. in files of different formats, or just in an internal data structure. However, here we should take into consideration that the data size may grow pretty big even if the interaction of the stimulant and the robot (software) is very brief. E.g., with a sampling frequency of 22050 Hz, a resolution of 8 bits, a mono channel and a 10-second stimulant-robot interaction, the robot (software) is supposed to receive 220500 bytes from the soundcard.

Step 3. The data is analyzed and aggregated, i.e. certain preset characteristics are computed on the basis of the whole data bulk or just a part of it.

Step 4. The fourth step is matching: on the basis of the values of the computed characteristics, the subjects' values are evaluated. Correct matching is achieved experimentally.

Step 5 is similar to Step 4, only at this step the momentary emotional characteristics are matched to the degrees of the affecting subjects. Correct matching is achieved experimentally as well.

Steps 6 and 7 imply computations based on the mathematical model formulas described in Chapter 2.

At the final step of the algorithm the robot's emotion is to be expressed visually. This can be fulfilled by one of the ways of emotion visualization (e.g. a 'smile').

Also we should note the following. If we want to develop an 'interactive' robot (software), i.e. a robot responding to sounds instantly, then data collection and data processing are to be executed simultaneously. Thus the second step of the algorithm is to be executed simultaneously with Steps 3–8.

23.3.
SoundBot SOFTWARE ARCHITECTURE

Let us examine the architecture of the developed software SoundBot [11], which implements the algorithm given above (Fig. 23.1). The numbers in circles denote the steps of the algorithm.

Fig. 23.1. Architecture of the SoundBot software
[Blocks of the diagram, translated from Russian: sound adapter; analog-to-digital converter; digital-to-analog converter; analog signal; sequence of numbers; SoundBot system; sound module; model implementation module; momentary emotions; education; subjects; data; analysis and aggregation; system reaction; steps 1, 2, 3, 4, 5, 6, 7, 8.]

It is easy to see that the architecture is directly associated with the algorithm given above. It includes two modules:
1. A sound module, which is responsible for interaction with the soundcard and collection of the necessary numerical data.
2. An implementation module, which is responsible for implementing the given mathematical model of emotions and education; it also computes the smile parameters to show the mimic emotional response of the system.

The data is processed, analyzed and aggregated directly between the modules. Both modules function simultaneously so that the system is interactive. Now let us examine the main features, operation principles and visual interface of this software.

23.4. MAIN FEATURES OF SoundBot

This software is written in C++ using the Visual Studio 2008 development environment. It works on IBM PC compatible computers under Windows XP and later versions of the OS. The software also requires .NET Framework 2.0. The exe file size is 100 Kbytes. The major functions of this software are the following:
1. SoundBot is able to detect the main capabilities of PC multimedia devices.
2. SoundBot is able to play wav files.
3. SoundBot is able to record sounds into wav files (mono only).
4. SoundBot can perform an emotional response to the played wav files.
5. SoundBot can respond emotionally, in an interactive mode, to the sounds input via a microphone.

23.5.
SoundBot OPERATION PRINCIPLES

The major operation principles of SoundBot to be viewed in detail are:
1. The sound module operation;
2. The principle of simultaneous operation of both modules;
3. The emotional stimuli considered by the software and the principles of their degree assignment.

As was said before, there are a variety of ways of working with a soundcard. The methods considered below use the system libraries of MS Windows, so these methods can be used only with this OS. The simplest approach is to use the MCI command-string interface or the MCI command-message interface. MCI is a universal interface independent of hardware characteristics. MCI is meant for controlling multimedia devices (soundcards and videocards, CD- and DVD-ROMs) [12, 13]. In most cases the capabilities of this interface meet the needs of any multimedia application used for recording and playing audio or video files. But it has a drawback: the data received from the soundcard cannot be read and processed interactively, so this method will not work here. This drawback of the MCI command-string and command-message interfaces can be overcome if we use a low-level interface.

The low-level interface can be used for playing wav files as follows. First, the wav file header is read and its format is checked, the output device is opened and the sound data format is specified. Next, the audio data blocks are read directly from the wav file, prepared by a special function for output, and then passed to the driver of the output device. The driver puts them out to the soundcard [12, 13]. The application totally controls the playback process because it prepares the data blocks in RAM itself.

The audio data is recorded the same way. First, the input device is to be opened and the audio file format is to be specified to the device. Next, one or more blocks of RAM are to be reserved and the special function is to be called.
After that, as the need arises, the prepared blocks are passed to the input device driver, which fills them with the recorded audio data [12, 13]. For the recorded data to be saved as a wav file, the application has to generate and record the file header and the audio data to the file from the RAM blocks prepared and filled by the input device driver.

The low-level interface requires all the record-and-playback details to be considered very thoroughly, as opposed to the MCI interface, where most of the parameters are just taken by default. These extra efforts are compensated by pretty good flexibility and the opportunity to work with the audio data in real time [12, 13].

To provide the interactive mode of SoundBot, i.e. make it interact with a user in real time, its modules have to operate simultaneously. Each of SoundBot's modules is executed as a separate thread, which makes the following possible:
1. The software can simultaneously receive new data from the soundcard and analyze it for further computing of the education, which reflects the emotional state.
2. The software can simultaneously play, record and select the audio data for its analysis.

Besides, the visualization of the mimic emotional response is also executed as a separate thread to make it be drawn as fast as possible. Still, SoundBot considers only one emotional stimulus (subject), which is the amplitude of the affecting audio signal. Every audio signal count generates stimulations in the SoundBot system and initiates momentary emotions according to the sine-shaped emotion function. Subjects are matched to emotions by value ranges specifying which subjects initiate positive emotions and which initiate negative ones.

23.6. SoundBot VISUAL INTERFACE

The main window includes two tabs: the first one deals with playback and training of the SoundBot system on .wav samples (Fig. 23.2).

Fig. 23.2. First tab of the main window of the SoundBot software.
The second tab is used for recording wav files and interactive communication with the SoundBot system (Fig. 23.3).

Fig. 23.3. Second tab of the main window of the SoundBot software.

Besides, the main window shows a smile expressing the emotional response of the modeled robot and the current values of the momentary emotion and the education. In the main menu we may set the major parameters (parameters of the mathematical model of emotions, parameters of the operation principles and parameters of the audio data processing). Below we show a couple of dialogue windows for setting up different parameters (Fig. 23.4 and 23.5).

Fig. 23.4. Model parameters
Fig. 23.5. Record parameters

To find out the characteristics of pulse-code conversion supported by the soundcard, we are to select the option "Info" -> "Driver parameters..." in the main menu (this is strongly recommended for correct record parameter settings, especially when the software is run for the first time). After you submit the settings you will see a window containing the description of the multimedia hardware (Fig. 23.6).

Fig. 23.6. Multimedia hardware parameters

The suggested algorithm can be used for building emotional robots. But the input audio data should be analyzed more thoroughly to single out as many stimuli as possible. That is why the SoundBot system can be considered a first approximation of emotional robot software. It should also be taken into account that both the algorithm and SoundBot itself are meant for interaction with only one user; interaction with several users requires another, much more complicated mathematical model. The described software can be applied, for instance, for communication with and rehabilitation of hearing-impaired patients, or used by actors for voice training outside an opera house. This software can also be used for predicting the emotional reaction of other people to the user's behavior (the software response shows the possible reaction of the surrounding people).
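The processing chain described above (amplitude as the single stimulus, a sine-shaped momentary emotion, the education of a uniformly forgetful robot, and a "shade" of smile) can be sketched as follows. The particular sine-shaped function, the education recurrence and the smile thresholds below are our illustrative assumptions, not SoundBot's actual constants, which are set in its parameter dialogs.

```python
# A minimal sketch of SoundBot's chain for its single stimulus (the signal
# amplitude).  The concrete sine-shaped emotion function and the smile
# thresholds are illustrative assumptions, not SoundBot's actual constants.
import math

THETA = 0.5  # memory coefficient of a uniformly forgetful robot, 0 < THETA < 1

def momentary_emotion(amplitude, max_amplitude=32767.0):
    """Map an amplitude to [-1, 1] with a sine-shaped function:
    quiet counts give negative emotions, loud ones positive."""
    x = min(1.0, abs(amplitude) / max_amplitude)
    return math.sin(math.pi * (x - 0.5))

def update_education(education, emotion, theta=THETA):
    """New education = elementary education + theta * previous education."""
    return emotion + theta * education

def smile(education):
    """Pick a 'shade' of smile from the current education."""
    if education > 0.5:
        return ":D"
    if education > 0.0:
        return ":)"
    if education > -0.5:
        return ":("
    return ":'("
```

In an interactive run, `momentary_emotion` and `update_education` would be applied to each incoming amplitude count in the analysis thread, while the visualization thread keeps redrawing `smile(education)`.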
CONCLUSION

We hope you have managed to read this book through. The authors have made an attempt to build up and describe a virtual reality of emotional robots. Concerning the real mental processes of living organisms, it is not easy to define the dependencies between emotions and time, and, perhaps, in the general case this problem is unsolvable. But in the process of building robots a roboticist can preset the mathematical functions of emotions altering with time (as well as the memory coefficients and the derivatives of the emotion functions). In this case the theory given in this book allows designing robots with preset psychological characteristics, with further analysis and computation of the emotional behavior of the robots on the basis of numeric data read from their memory.

As an example, below we give a description of a closed chaotic virtual reality of emotional robots based on a software implementation of the mathematical models shown in this book. In this description we use the terms defined above. Let the virtual reality include some finite number of robots. Each robot has its own memory with its special individual memory coefficients. In their virtual reality, robots affect each other with different subjects in a random way to initiate emotions and alter each other's educations. The robot-educator (the one whose emotions are passed to the educatee) is the one with the greatest education modulo. Concordance groups, 'fellowships' of robots, occur as a result of emotional contacts between robots: the greater their fellowship value, the more united the group. Some groups may get into conflicts with each other; these conflicts emerge when the sum educations of the groups become equal to zero. Each robot has a goal which is common for their reality as a whole. As a result of the presence of this goal, in the course of time leaders may appear, which are the robots with the greatest willpower and the best abilities.
The education effectiveness of each robot is characterized by the education process efficiency coefficient. Finding their efficiency coefficients can help us select the robots whose natural characteristics make them the most educationally inclined. Some robots feature satiated education: when these robots reach a certain level of satiety, the emotional effect of other robots upon them stops. If there are robots which do not have education satiety in this virtual reality, then the other robots educate them in the most active way, and this causes leaders to appear in the robots' community. Based on the equivalent processes developed for each robot, with further ranking of the limit educations, a leader of the robots' community defines its distant successor to be the new leader in the future. The robots may get ill due to software faults or computer virus attacks. A physician in this robots' community heals its ill inhabitants by correcting their emotions. As the robots, members of this community, keep communicating and interacting with each other, their educations alter with the course of time. This causes the leaders to change and new fellowships and conflicting groups to occur. This is the way emotional robots live in their virtual reality.

This book appeared as a result of the investigations described in [3, 11–33]; it includes new results and prepares a basis for new problems. We hope that this book is useful for roboticists and program developers designing software for emotional robots and their groups. Any ideas and opinions about this book are welcome. Please feel free to e-mail the authors at ogpensky@mail.ru or kirillperm@yandex.ru.

REFERENCES

1. Bolonkin A.A. URL: http://Bolonkin.narod.ru.
2. Druzhinin V.N. Experimental Psychology. St. Petersburg: Piter, 2004. 320 p. (in Russian)
3. Pensky O.G., Zonova P.O., Muravyev A.N., Ozhgibesova Yu.S., Pronichev A.A., Chechulin V.L. Hypotheses and Algorithms of the Mathematical Theory of Emotion Calculus: monograph; ed. by O.G. Pensky. Perm: Perm State University Press, 2009. 152 p. (in Russian)
4. Breslav G.M. Psychology of Emotions. Moscow: Smysl: Academia, 2004. 544 p. (in Russian)
5. Andreeva E.V., Bosova L.P., Falina I.N. Mathematical Foundations of Informatics. Moscow: Binom. Laboratoriya Znaniy, 2005. 238 p. (in Russian)
6. Brandon V., Volkova A.N. Molecular Physics and Thermodynamics. Moscow: Patrice Lumumba Peoples' Friendship University Press, 1975. 295 p. (in Russian)
7. Zamkov O.O., Tolstopyatenko A.V., Cheremnykh Yu.N. Mathematical Methods in Economics. 4th ed. Moscow: Delo i Servis, 2004. 368 p. (in Russian)
8. Okulov S. Programming in Algorithms. Moscow: Binom. Laboratoriya Znaniy, 2002. 341 p. (in Russian)
9. Mandel I.D. Cluster Analysis. Moscow: Finansy i Statistika, 1988. 224 p. (in Russian)
10. Kochin N.E. Vector Calculus and Elements of Tensor Calculus. 9th ed. Moscow: Nauka, 1965. 350 p. (in Russian)
11. Pensky O.G. Mathematical Models of Emotional Robots: monograph. Perm: Perm State University Press, 2010. 192 p. (in Russian and in English)
12. Muravyev A.N., Pensky O.G. A mathematical model of willpower // Proceedings of the III All-Russian Conference of Students, Postgraduates and Young Scientists "Artificial Intelligence: Philosophy, Methodology, Innovations", Moscow, MIREA, November 11-13, 2009. Moscow: Svyaz-Print, 2009. P. 338-339. (in Russian)
13. Pensky O.G., Zonova P.O., Muravyev A.N. Mathematical methods and algorithms of forming emotional groups // Perm University Herald. Mathematics. Mechanics. Computer Science. No. 7(33), 2009. Perm: Perm State University Press. P. 53-56. (in Russian)
14. Pensky O.G. Mathematical models of emotional education // Perm University Herald. Mathematics. Mechanics. Computer Science. No. 7(33), 2009. Perm: Perm State University Press. P. 57-60. (in Russian)
15. Pensky O.G., Kameneva S.V. On a mathematical approach to computing some emotional characteristics // Electronic journal "Issledovano v Rossii", 228, pp. 2183-2188, 2006. URL: http://zhurnal.ape.relarn.ru/articles/2006/228.pdf. Moscow: MIPT. (in Russian)
16. Pensky O.G., Kameneva S.V. Basic definitions of the general mathematical theory of emotions // Proceedings of the International Scientific and Methodological Conference "Actual Problems of Mathematics, Mechanics, Computer Science", devoted to the 90th anniversary of higher mathematical education in the Urals. Perm, 2006. P. 128-129. (in Russian)
17. Pensky O.G., Kameneva S.V. "Piar": a program for computing the optimal PR strategy during pre-election activities. Certificate of industry registration No. 7134, 31.10.2006. (in Russian)
18. Pensky O.G., Kameneva S.V. "Group": a program for revealing conflict groups in a collective and computing the educations of its members. Certificate of industry registration No. 6869, 8.09.2006. (in Russian)
19. Pensky O.G., Zonova P.O. Applying the mathematical theory of emotions to forming student groups // Proceedings of the International Scientific and Methodological Conference "University in the System of Lifelong Education", October 14-15, 2008. Perm: Perm State University Press, 2008. (in Russian)
20. Pensky O.G., Zonova P.O. A program for ranking members of a collective by psychological characteristics. Certificate of industry registration No. 10517, 29.04.2008. State registration number 50200800879. (in Russian)
21. Zonova P.O., Pensky O.G. Models of psychological temperament // Proceedings of the International Scientific and Technical Conference "Perspective Technologies of Artificial Intelligence", Penza, July 1-6, 2008. P. 50-52. (in Russian)
22. Pensky O.G. On applying the fundamentals of vector algebra to solving some problems of emotion calculus // Electronic journal "Issledovano v Rossii", pp. 1031-1034, 2007. URL: http://zhurnal.ape.relarn.ru/articles/2007/099.pdf. Moscow: MIPT. (in Russian)
23. Pensky O.G., Kameneva S.V. First results of the applied mathematical theory of emotion calculus // Philosophical and Methodological Problems of Artificial Intelligence. Perm: Perm State Technical University Press, 2007. P. 143-149. (in Russian)
24. Zonova P.O., Pensky O.G. "Mathemoutions": a program for determining the optimal sequence of subjects for achieving maximum education. Certificate of industry registration No. 8999, 9.06.2007. State registration number 50200701909. (in Russian)
25. Pronichev A.A., Levchenko E.V., Burdakova E.V., Pensky O.G. "PsiX-1": a program complex for determining the emotional state of a subject. Certificate of industry registration No. 7666, 14.02.2007. State registration number 50200700313. (in Russian)
26. Pensky O.G., Rusakov V.S. "Pramoutions": a program for computing emotions by their external manifestations. Certificate of industry registration No. 7290, 27.11.2006. (in Russian)
27. Chernikov K.V. SoundBot: a program modeling the mimic emotional reaction of a robot. Rospatent certificate of state registration of a computer program No. 2010612670, 24.02.2010. (in Russian)
28. Chernikov K.V., Pensky O.G. Mathematical models of contacts of emotional robots // Electronic scientific journal "Universitetskie Issledovaniya". URL: http://www.uresearch.psu.ru, 2010. P. 1-5. (in Russian)
29. Pensky O.G., Muravyev A.N., Chernikov K.V. A mathematical model of talent // Perm University Herald. Mathematics. Mechanics. Computer Science. No. 1(1). Perm: Perm State University Press, 2010. P. 81-84. (in Russian)
30. Pensky O.G., Chernikov K.V. A generalization of the emotional education model // Perm University Herald. Mathematics. Mechanics. Computer Science. No. 2(2). Perm: Perm State University Press, 2010. P. 55-57. (in Russian)
31. Chernikov K.V. Rules of emotional behavior of robots: a generalization to the case of an arbitrary number of people interacting with the robot // Electronic scientific journal "Universitetskie Issledovaniya". URL: http://www.uresearch.psu.ru, 2010, 63_75761.doc. P. 1-4. (in Russian)
32. Chernikov K.V., Pensky O.G. A program modeling emotional contacts in a group of robots. Certificate of industry registration of an electronic resource No. 15375, 24.02.2010. VNTITs state registration number 50201000355. (in Russian)
33. Pensky O.G. Mathematical models of emotional robots: monograph. URL: http://www.scribd.com, 2010. 96 p.