In a robot's shoes

Amira Chalbi, a PhD student of Tunisian origin with the Mjolnir team at the Inria Lille - Nord Europe center, was the first student volunteer to take part, via a mobile robot, in CHI 2017, the most prestigious conference in the field of human-computer interaction, held last May in Denver, USA. Here she tells us about this experience and how it enriched her research work.
Amira Chalbi - Photo: M. Blasquez

In December 2016, I was fortunate to be chosen as a student volunteer for the CHI 2017 conference, which took place from 6-11 May 2017 in Denver, USA. My thesis lies at the intersection of human-computer interaction and information visualization, and my research work aims to understand and design animations in graphical user interfaces.

The CHI 17 student volunteer program is very competitive, so being selected was a tremendous opportunity for me. Student volunteers receive free registration to the conference and are entrusted with various organizational, coordination and information duties for the 2,800 registered visitors.

In March 2017, following the Trump administration's announcements of restrictions on entry to the United States, I had very carefully prepared all of the necessary documents - and more - to support my visa application. What should have been a slightly tedious formality turned into a huge disappointment. On the day of my appointment at the embassy, after just a few evasive questions and without any verification of the supporting documents I had brought with me, my application was refused without any clear explanation as to why - even though my country of origin is not on the list of nationalities barred from entering the United States.

For the first time in my young career as a researcher, I was going to miss an amazing opportunity to meet the members of my community, develop my network and discuss my research projects. Once the anger, incomprehension and disappointment had passed - and with the encouragement of my supervisors and the agreement of the conference officials - I decided to take part in the experience by telepresence, and thereby prove that discriminatory political restrictions will never prevent the scientific community from interacting and sharing knowledge and human values.

The CHI conference has been experimenting with a telepresence system since 2016, primarily to give people with reduced mobility the opportunity to attend the conference remotely via a robot called Beam (after the brand that commercializes it). The Beam is a remotely steered mobile robot equipped with a screen, a microphone, a speaker and two cameras: one for the general view and the other to help with driving the robot.

In 2017, because of the Travel Ban and its repercussions on visa policy in general, researchers from several countries across the world found themselves unable to attend the conference. The CHI 17 officials adapted the telepresence program to enable them to attend remotely and even present their work.

Mine was the first case of a student volunteer using a Beam that the conference officials and the telepresence program had faced. They thought about the tasks I could carry out remotely, and so my mission was to move around with my Beam during the coffee breaks and announce the forthcoming sessions by displaying the information on my robot's screen. In total, 11 scientists and three journalists (including Emily Dreyfuss from WIRED, with whom I was able to exchange views and who published an article about her experience) took part in the telepresence experiment.

How do you get into a robot's shoes?

It is a rather simple process. After creating an account on the platform managing the connection to the Beams, I received quite a comprehensive guide with all of the practical information needed to find my bearings in the conference center (maps of the site, estimated times to reach certain areas, parking spaces and instructions on how to park the Beam in the presentation halls) as well as the schedule of the other Beams. I also had the option of customizing my Beam (T-shirt, skirt, baseball cap, etc.), something I obviously did without hesitation with the help of my counterparts on-site.

Although the robot is controlled via a computer or a smartphone with a WiFi connection, a Beam sometimes needs human help to deal with unexpected situations: for example, when the robot loses its network connection, when the image becomes blurred (a quick wipe of the camera is most welcome), when you want to take the elevator, or quite simply when you have trouble finding your way around the vast conference center.

This physical presence was efficiently provided by those in charge of the telepresence scheme, with the help of a team of student volunteers (my counterparts in Denver). We were connected with all of the volunteers via a Slack channel (a collaborative communication platform that operates like an instant messenger) throughout the conference day, which on average ran from 8.30 am to 6 pm local time. The eight-hour time difference with Lille was sometimes difficult to manage - a significant constraint that prevented me from attending all of the presentations - but I am rather proud of having been present at the conference for four full days.

What were the interactions between the humans and the robots?

During my daily duties as a student volunteer and my trips around the conference center, I had the opportunity to meet and talk with many people who came up to me quite spontaneously. They often introduced themselves by showing their badges to the camera, which turned out to be quite an effective way of making contact. The discussions were relatively short, mainly because of the ambient noise, which is very disruptive when you are in a Beam's shoes, but also for the person talking to you. The acoustic experience was clearly one of the main limitations of telepresence, both during direct discussions and during the presentations. The other, more surprising but ultimately just as instructive, aspect of my experience in a robot's shoes concerned the behavior of certain people, who interacted with me as they would with an object or a piece of decor. Imagine being in the middle of making your announcement during a coffee break when someone comes and poses next to you to take a selfie without saying a word to you.

From a relational perspective, taking part in CHI in a robot's shoes was a very rich human and social experience for me. Meeting new faces and hearing people express their joy at being able to attend via telepresence was very enjoyable and encouraging. A somewhat shy person, I gained an enormous amount of self-confidence and self-assurance moving around among the thousands of visitors along with my 13 other robot counterparts.

The tremendous curiosity we aroused brought out some very spontaneous and positive behavior towards me. I think a large number of visitors will remember me, which - in the context of my research activities - is obviously a very good thing.

I recommend this experience to anyone who has the opportunity; however, it does not replace real-life encounters. One day I hope to return to CHI to present my research work, and I will always pay special attention to those who choose to take part by telepresence.