Description (in English)

"U" is a fictional video, adapted from interviews with data monitors from Songdo, South Korea. Beyond being one of the largest and most expensive development projects in the world, Songdo advertises itself as a "ubiquitous" or "U-city," implementing smart technologies at every tier of urban design, from public infrastructure to residential units. Its city planners, in short, aim to build a metropolis-as-information network.
During the summer of 2014, I lived in Songdo, where I conducted interviews with members of the technology and real estate companies developing the master plan, as well as with employees of the Integrated Operations Center, which monitors the surveillance and sensor data generated by the city. At the time I visited the Center, several employees were experiencing psychological duress from the relentless monitoring work, and their supervisor was in the process of introducing therapy sessions.

My video departs from this anecdote, imagining a contemporary form of Gestalt psychotherapy, which shares affinities with the employees' work. Eschewing any depth model, Gestalt focuses on the "here and now": on what is apprehensible in the present moment through sense experience. The therapist is thus akin to a surveillance camera, scanning for modulations in a patient's body, language, and tone of voice. From my perspective, this resonates with smart city planning, wherein the sense data aggregated by sensor and surveillance technologies suffice as tools of governance. Life is managed by bandwidth; big data rewrites the law. Gestalt is also relevant because patients are asked to assume the voices of the people, objects and emotions in their dreams. The self is thus conceived as an expansive network, within which a patient can learn to map near and distant psychical points. This approach is reminiscent of "The Internet of Things," the rhetoric smart cities use to suggest that everything (a human, a sensor, a television, etc.) is equal within the network.
Beyond these references, my video tells a character-driven story. Over the course of ten fragmentary sessions, three characters find different ways of negotiating a challenging situation, in which the lines between personal therapy and professional performance are nearly impossible to draw. Technology's remaking of the world, the video implies, extends from the scale of the built environment to the most intimate notions of self.

Source: https://www.tylercoburn.com/u.html


Situation machine vision is used in
Here are some notes linked to my interpretation of the work:

0:33 >>>Alex tries to be the camera but feels that she can’t do it. Therefore, I tagged “Alien” because she does not yet realize that she is part of the system. This will change throughout the therapy session, which I read as a metaphor for a machine learning session.>>>

Therapist (T): So, you are surveilling the floor?
Feel a bit like a camera?

Alex (A):
Sort of. Punch in, pan left, zoom out?
It is not so different.

T: What if I would ask you to become one of the two cameras in the room?
Could you do that?

A: Well, I have no idea what a camera would say?

T: Do you think you could invent something for the camera to say?

A: I don’t think so.

8:50 >>>In my reading, Rita is the most human; she is trying to become more robot-like, but she is “overwhelmed” by the amount of information.>>>

Rita (R): (talking about her work in the control room)
R: I am going total robot here.
T: How does it feel to do that the whole day?
R: I don’t know, it sucks.
R: We only sense what the operation center wants us to sense, which is a fuck load.

09:40 >>>The situation is implied to be hostile; they are not sure what their supervisor/therapist/the system wants from them. However, Rita tries to stay calm, convincing the others that they are in a safe space.>>>
R: Oh, Lord, he thinks I am trying to get him in trouble. Call, this is a safe space.

Overwhelming, Laborious
9:50 >>>Back to Rita’s ability to become robot-like, to manage the system>>>
T: Play the operation center…
R: I am big, big data
R: I got a fuck load of boring stuff on my screens, every inch of the city, I am a perfect copy, no, no I am realer than real.

R: In this scene I am looking at a single screen with one surveillance feed.
Then 2, 4, 16…
(subsequent scenes are described as becoming harder and harder to follow)
T: It looks like it is hard to do…

>>>Rita admits that she has not become a total robot>>>

Oppression (?)
19:00 >>>The therapist references Augusto Boal’s games to help people recognize oppression in their lives. We carry oppressions in our bodies; when we adapt to our jobs, routines and social expressions, our senses can suffer. Boal’s games were designed to de-mechanize these routines.>>>

The last two scenes: Call, the Therapist and Alex, implying they might be something other than human (Timecode: 33:00 >>)

Call (C): We sense what the sensors sense.
We let the big data govern; we are rewriting the world as a crime scene.
>>>… Call is stopped by the therapist for creating a difficult environment>>>

>>>Call goes on: he knows why everyone is here:>>>
Who can become the best calibrated monitor?
Who can pry open the innermost self and teach it to behave?
>>>(Teach, train — machine learning?)>>>

C: I am a fuck up….

C: … I am just a link in a chain of fucked-up decisions…
C: Who decided to make sensors and surveillance cameras the measure of all things and why are we the ones monitoring them?

C: Try to imagine what it is like to watch, to wait, to spend your entire working life anticipating the crime, the accident, the crisis that may never come.
At first you dread the possibility, then you find yourself wanting it…

C: You, human error, and on the other side this great inhuman force stitching it to reality.
(decreased agency)

C: Tell me, what are you?
T: It is not that simple, and the way you ask the question, I think you already have an answer. (This raises the question: is the therapist a trainer of AI?)

>>>The last clip with Alex and the Therapist implies that the therapist is finalizing the training of the sensing network. Alex is the chosen one, while Rita is too human and Call questions too much. Human and technical consciousness intertwine?>>>

I also tagged "intimate" and "intrusive" in the overall sentiment because, in the therapy sessions, the participants reveal very personal feelings and vulnerabilities, and the therapist is very intrusive in trying to reveal the true nature of the attendants.

In the general description of the operator I chose "human" and "machine". Even if all the characters are tagged as human, I think the end of the video implies that there is more to it: human and technical consciousness intertwine.

Authored by