11/2016 - 01/2017
Robots are increasingly becoming part of our daily lives, which makes human-robot interaction an important topic to design for. In this course, we explored the technological possibilities of a domestic drone building a relationship with a human through emotion recognition. To determine emotion from the robot's perspective, we created a neural network that classified facial data captured by a webcam and extracted with an existing SDK. Trained on examples of different emotions, the network could distinguish between negative and positive emotion live. The user saw a screen on which the drone's eyes responded differently to their emotion, depending on their relationship level. With this research, we are one step closer to drones becoming part of our daily lives.
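The core idea, mapping facial features from an SDK to a positive/negative emotion judgment with a trained network, can be sketched minimally. The project itself was built in Processing; the following is an illustrative Python version, and the feature names (`smile`, `brow_furrow`) and toy training values are assumptions, not the project's real data.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy training data: [smile, brow_furrow] feature pairs in a 0..1 range,
# as a facial-expression SDK might report them.
# Label 1 = positive emotion, 0 = negative. Values are illustrative.
samples = [
    ([0.9, 0.1], 1), ([0.8, 0.0], 1), ([0.7, 0.2], 1),
    ([0.1, 0.9], 0), ([0.0, 0.8], 0), ([0.2, 0.7], 0),
]

random.seed(1)
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias = 0.0
lr = 0.5

# Train a single sigmoid unit (the simplest possible "neural network")
# with plain gradient descent on the toy examples above.
for epoch in range(2000):
    for features, label in samples:
        z = sum(w * f for w, f in zip(weights, features)) + bias
        err = sigmoid(z) - label
        weights = [w - lr * err * f for w, f in zip(weights, features)]
        bias -= lr * err

def classify(features):
    """Classify a live feature vector as 'positive' or 'negative'."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return "positive" if sigmoid(z) > 0.5 else "negative"

print(classify([0.85, 0.05]))  # a broad smile -> positive
print(classify([0.05, 0.85]))  # a furrowed brow -> negative
```

In the actual project the input came frame by frame from the webcam, so the same `classify` step simply ran in the draw loop on each new feature vector.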
Among other things, I have previously tried to create "smart" products that would react in specific ways to combinations of various inputs and, in some cases, adapt to new contexts and their users. In hindsight, I can conclude that my history of heuristic intelligence barely scratched the surface of what intelligence in systems really is. I chose this course because its description promised an introduction to a next level of intelligence, and I can confidently say it has been one of the most valuable courses for me personally in recent years: mainly because it allowed me to use and apply an intelligent algorithm for the first time, but also because it showed me tools for doing so in a simple way. After all, I am a designer, and I intend to apply these techniques with the least effort required for quick iterations. Because of this, we successfully accomplished our intended goals regarding the application of a neural network, and were left with spare time to incorporate relationship building, be it on a more heuristic level.
Apart from the general concept development, my specific tasks mainly concerned the Processing aspects of this course. To come up with proper eyes for the drone's emotion expression, I made a sketch that randomly generated eyes. Through a simple user interface, people could choose the emotion, at a certain level, that they thought fitted each expression best. This data was exported to an Excel file for use as a lookup table. Furthermore, I made the sketch for the demo, which posed challenges in importing data from the affective SDK and feeding it into the trained network. Overall, I believe I have acquired practical skills that can be directly applied in future projects with intelligence aspects.
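The lookup-table step can be illustrated with a short sketch: group the participant-labeled eyes by (emotion, level) and average their shape parameters. This is a Python illustration, not the original Processing code, and the column names (`openness`, `curve`) and example rows are hypothetical stand-ins for the exported spreadsheet data.

```python
from collections import defaultdict

# Hypothetical rows as they might appear in the exported spreadsheet:
# each randomly generated eye's shape parameters plus the emotion label
# and intensity level a participant assigned to it.
rows = [
    {"emotion": "happy", "level": 3, "openness": 0.8, "curve": 0.6},
    {"emotion": "happy", "level": 3, "openness": 0.9, "curve": 0.7},
    {"emotion": "sad",   "level": 2, "openness": 0.3, "curve": -0.4},
]

def build_lookup(rows):
    """Average the eye parameters of all sketches that participants
    tagged with the same (emotion, level) pair."""
    groups = defaultdict(list)
    for r in rows:
        groups[(r["emotion"], r["level"])].append(r)
    table = {}
    for key, group in groups.items():
        table[key] = {
            "openness": sum(g["openness"] for g in group) / len(group),
            "curve": sum(g["curve"] for g in group) / len(group),
        }
    return table

table = build_lookup(rows)
print(table[("happy", 3)])  # averaged eye parameters for "happy, level 3"
```

At runtime, the demo could then look up the eye parameters for the emotion and intensity it wanted to express and draw the eyes from those values.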