StyleArm: a style-transferring robot arm

Author: Vernon Stanley Albayeros Duarte

Virtual Room: https://eu.bbcollab.com/guest/9b6e16af7c894e3a8e0ac8f5abc4fdf3

Date & time: 22/09/2021 – 11:00 h

Session name: Human modelling

Supervisor: Fernando Vilariño

Abstract:

Although the paradigm of human-computer interaction is shifting toward increasingly intelligent systems that provide services without direct user input, most of our interactions with machines are still “active”: the computer requires some form of direct input before delivering its content. In this master’s thesis we build a physical proof of concept that applies computer vision algorithms to support interaction: a self-built robotic arm, driven by a Raspberry Pi (a small single-board computer), that carries a camera. The proof of concept streams images over a client-server connection to a more powerful machine, which returns frames processed by a fast Style Transfer GAN in near real time, switching the style applied according to the user’s facial expression. The goal of this robot is to give the user an accessible way to interact with computer vision processes they may find useful, such as Style Transfer or image stitching, for storytelling and publication on social media. Initially, the project aimed to optimize a style transfer GAN to run entirely on the Raspberry Pi, making the system fully autonomous, but we were unable to achieve real-time performance on the device.
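
Purely as an illustration of the client-server relay described above (not the thesis code), the sketch below shows one way the server side of such a frame relay could look in Python, assuming length-prefixed JPEG frames over a plain TCP socket. The stylize() function is a hypothetical placeholder for the fast Style Transfer GAN, and the host/port values are invented:

```python
# Minimal sketch of a frame-relay server, assuming length-prefixed JPEG
# frames over TCP. stylize() is a placeholder for the Style Transfer GAN;
# HOST and PORT are assumed values, not taken from the thesis.
import socket
import struct

import cv2
import numpy as np

HOST, PORT = "0.0.0.0", 5001  # assumed address/port


def stylize(frame: np.ndarray) -> np.ndarray:
    """Placeholder for GAN inference; here just an identity pass."""
    return frame


def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket or raise if it closes."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf


def serve() -> None:
    with socket.socket() as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while True:
                # Each frame arrives as a 4-byte big-endian length
                # followed by that many JPEG bytes.
                (size,) = struct.unpack("!I", recv_exact(conn, 4))
                jpeg = recv_exact(conn, size)
                frame = cv2.imdecode(
                    np.frombuffer(jpeg, np.uint8), cv2.IMREAD_COLOR
                )
                out = stylize(frame)
                ok, enc = cv2.imencode(".jpg", out)
                payload = enc.tobytes()
                # Reply with the stylized frame in the same framing.
                conn.sendall(struct.pack("!I", len(payload)) + payload)


if __name__ == "__main__":
    serve()
```

A matching client on the Raspberry Pi would encode each camera frame to JPEG, send the 4-byte length followed by the bytes, and read the reply the same way; switching styles by facial expression would amount to sending an expression label alongside each frame and selecting the GAN style accordingly.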

Committee:

– President: Pau Rodriguez (UAB)
– Secretary: Guillem Arias (UAB)
– Vocal: Ramon Baldrich Caselles (UAB)