Article Title [English]
Social robots, built to interact with humans and to assist them in domains such as education and healthcare, need interactive behavior similar to that of humans. One of the most important human interactive behaviors is social eye gaze. Eye gaze is significantly more important than other nonverbal signals; eyes have been shown to be special cognitive stimuli, with dedicated hardwired pathways in the brain for their interpretation. Reviewing the literature, we found that previous research on controlling social robots' gaze behavior investigated human gaze behavior only in a limited set of situations, such as two- or three-way conversation, in order to extract the pattern of this behavior. Increasing the variety of studied social situations is therefore one way to fill this gap. Designing a gaze control system for a social robot requires detailed knowledge of human gaze behavior. The purpose of this research is to propose an empirical motion-time pattern for human gaze behavior in a number of different social situations. These situations comprise scenes with two to four people in a prepared video, in which the people in the scene show the social behaviors of "talking", "waving", "pointing", "entering the scene", and "exiting the scene" in a structured way. Fifteen typical adults (mean age: 24 years, SD: 3.3 years) watched this video while their gaze position was recorded with an eye tracker (SR Research EyeLink 1000 Plus). Using a genetic algorithm (an optimization procedure), we then extracted the relative coefficient of each of the mentioned social behaviors in our proposed model. The participants' gaze reconstructed on the test data closely matches the subjects' real gaze behavior. Finally, the implementability of the model was successfully demonstrated on a Nao robot, and its positive performance was confirmed using a survey. The model showed significant differences between the two studied situations on 3 of the survey's 10 questions.
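The abstract describes fitting a relative coefficient per social behavior with a genetic algorithm, but the paper's actual gaze model, feature encoding, and fitness function are not given here. The following is a minimal illustrative sketch only, under assumptions that are not from the source: each video frame is represented by hypothetical binary behavior indicators per person, the gaze model picks the person with the highest weighted behavior score, and the GA evolves the weight vector to maximize agreement with recorded gaze targets.

```python
import random

# Behavior labels from the abstract; the binary-indicator encoding is an assumption.
BEHAVIORS = ["talking", "waving", "pointing", "entering", "exiting"]

def predict_target(weights, frame):
    """Assumed gaze model: look at the person with the highest weighted score.

    frame: list of per-person dicts mapping behavior name -> 0/1 indicator.
    """
    scores = [sum(w * person.get(b, 0) for w, b in zip(weights, BEHAVIORS))
              for person in frame]
    return scores.index(max(scores))

def fitness(weights, frames, observed):
    """Fraction of frames where the model matches the recorded gaze target."""
    hits = sum(predict_target(weights, f) == o for f, o in zip(frames, observed))
    return hits / len(frames)

def evolve(frames, observed, pop_size=30, generations=40, seed=0):
    """Toy GA: elitist selection, one-point crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in BEHAVIORS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(w, frames, observed), reverse=True)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(BEHAVIORS))
            child = a[:cut] + b[cut:]          # one-point crossover
            i = rng.randrange(len(BEHAVIORS))
            child[i] = max(0.0, child[i] + rng.gauss(0, 0.1))  # mutate one gene
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda w: fitness(w, frames, observed))
```

To try it, one can generate synthetic frames with a hidden "true" weight vector, label each frame with the resulting gaze target, and check that `evolve` recovers weights whose `fitness` on that data is high. This stands in for the real pipeline, where the labels would come from the EyeLink recordings.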