Wednesday, November 1, 2017

My body is not your empathy gimmick

echolaliachamber writes My body is not your empathy gimmick at echolaliachamber.

Today is Autistics Speaking Day 2017. It is the day we flash-blog in an attempt to tease out higher SEO rankings than the anti-autistic “autism warrior,” “I love my child but I hate my child’s autism” establishment of autism “advocacy” – a stunt to try to snag more eigenbuddies than our oppressors so that parents of autistic children might find us instead of them. A prayer that they’ll find acceptance instead of fear.

Today is also just a few days shy of a month since the 2017 Grace Hopper Celebration of Women in Technology. I attended this year, along with 18,000 other people. It was overwhelming, of course. There were many good sessions that focused on diversity, inclusion, and intersectionality (Kimberlé Crenshaw was actually credited for her term in one of them, which was astonishing all by itself). As a whole, the conference organizers made a concerted effort to make Grace Hopper about women of all backgrounds, not just white cis women. There was still a disappointing presence of trans-exclusionary, genital-essentialist language, particularly against trans women and femmes. And though I did see other disabled people around, and was grateful to connect with a few, there were many panels that were intensely ableist – some of which I had to walk out of for my own well-being.

First there was the panel on Accessible Technology, which featured five visibly able-bodied panelists on a stage that a person with a physical disability would not have been able to access. Being myself an alphabet soup of invisible disabilities, both cognitive and physical, I will not claim that no one on the panel was disabled; however, they were definitely presenting from a position of able-bodiedness and positions of authority over disabled people. This isn’t good enough. We need disabled perspectives on panels about technology accessibility and accessible interfaces. We need this not just because we should be listening to disabled people about what accessible tech means to them, but because disabled developers exist. Accessible technology isn’t just about helping citizens with disabilities. Accessible technology is also about diversity in computing professions. We’re here. We use eye-tracking software to code (among many other things). Our efficiency is limited by your gatekeeping, not our ability.

The panel covered some good topics. Disability studies often leaves aging populations out as a footnote, despite the fact that, if we’re lucky, we’ll all live long enough to be disabled, so I really appreciated the panel highlighting the aging population as a critical consumer group for accessible technology. I also had a deep appreciation for the segment on user-centered design, and for a panelist asserting that when you ask people with disabilities to contribute their knowledge and experience to your project, you should compensate them for their time. I would also like to see more disabled collaborators truly included as project members with full credit. I did not appreciate the appeals to the economics of accessible technology. The economy of disability, spearheaded by our nation’s favorite Super Crip, FDR himself, was and continues to be a capitalist ploy that keeps us segregated in sheltered workshops and special programs. It isn’t a good look. Economies of any axis of diversity are inherently othering, and I’d like to see people drop this narrative entirely in favor of more compelling and dignified appeals to inclusion.

I was genuinely looking forward to the panel on Virtual Humanity. I have worked in video games, augmented reality, and virtual reality for many years, and compelling and empathetic virtual agents are critical to meaningful virtual experiences. I was not prepared for what I experienced.

A presenter from Facebook was describing the new Oculus project, Facebook Spaces. Spaces allows users to connect with friends in fully virtual chat rooms. With an Oculus head-mounted display and motion sensors, users are teleported into these spaces, riding around in a cartoon avatar designed to look like them, generated from their favorite profile photo. The presenter spoke about finding team meetings held in Facebook Spaces much more engaging, and said that connection with her team members came more easily in this shared virtual space than in the fragmented, distorted windows of a video conference. I found myself in agreement with the presenter: there’s always something about the quality of connection in an online video conference that somehow manages to combine all the most difficult aspects of information processing. Yet I was surprised to find myself pushing back feelings of panic, wiping my palms on the knees of my pants, rocking as subtly as I could manage in the audience, surrounded by strangers, struggling to breathe. What was upsetting me so much?

Eye contact. The presenter explained that eye contact was the reason that Spaces was better for team meetings than video conferencing. Eye contact was lauded as so crucial to meaningful connection that it is actually procedurally enforced within Facebook Spaces. All avatars maintain eye contact with their conversational partners at all times.

Let that sink in. Eye contact is enforced procedurally, at all times. And more: body gesticulations are automatically interpreted into standard facial expressions.

Do you see what I see? The panel is called Virtual Humanity and the first presentation defines humanity as explicitly exclusive of autistic mannerisms.

Let’s pull back.

Virtual Reality has been heralded as the empathy machine. Various and voluminous perspective-taking experiences have been built, claiming to hold the key to breaking down bias and fostering human connection. Never mind the fact that disability scholars have already roundly covered the ways in which simulation exercises actually contribute to stigma and othering – developers are attempting to ‘solve’ the ‘problem’ of disability by letting people ‘walk in our shoes.’

And yet.

And yet.

You will not even allow us the dignity of existing in our own skin within these virtual worlds.

Someone from Microsoft (whom I deeply respect but will not name without permission) boldly pointed out that Facebook Spaces doesn’t allow for non-normative bodies. There were no accessibility aids to choose from for your avatar. There were no options to adjust the presentation of limbs. There was no mention of accommodating sensory disabilities, which, despite popular perceptions, headsets are actually uniquely suited to do. When given the opportunity to respond to this criticism, the panel responded with the predictable but spectacularly disappointing refrain: “It’s so hard, because you have to get it right.” “It takes so long to make models for those sorts of things.” “Well if you don’t have an arm or something, the Kinect just won’t drive that limb.” (!!??)

This is beyond not good enough. It’s embarrassing. First of all, as someone who has worked in 3D modeling and animation for over a decade: no. Nice try. Modeling mobility aids for users to personalize their avatars is not cost prohibitive. To match the art style presented in Spaces, it would take, at a generous estimate, a total of 8 work hours to produce a 3D model of a futuristic, fictional, all-purpose mobility aid suitable for a prototype like Spaces, including integration and deployment to the Spaces system. I’m not saying it would all happen in a single work day, but it wouldn’t take longer than a week with someone devoting only a small portion of each day to it. And to allow users to customize limbs? You already let them choose from a handful of different eyebrows. Give them access to some joint scalars. It does not require unique avatar models to throw someone a bone. Pun intended.
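
To make the joint-scalar point concrete, here is a minimal sketch of what per-limb avatar customization could look like. To be clear, none of this is Facebook Spaces’ actual data model; the structure, field names, and joint names below are hypothetical stand-ins. The point is only that resizing or hiding a limb on one shared rig, or attaching one shared mobility-aid model, is a configuration problem, not a per-user modeling problem.

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical per-limb settings layered onto one shared avatar rig.
    # A hidden limb is simply not rendered; a scale tweaks proportions.
    @dataclass
    class LimbSettings:
        scale: float = 1.0    # uniform scale applied to the limb's joint chain
        visible: bool = True  # whether the limb's mesh is drawn at all

    @dataclass
    class AvatarCustomization:
        limbs: dict = field(default_factory=lambda: {
            "left_arm": LimbSettings(), "right_arm": LimbSettings(),
            "left_leg": LimbSettings(), "right_leg": LimbSettings(),
        })
        mobility_aid: Optional[str] = None  # e.g. "wheelchair" or "cane": one shared model each

    def apply_customization(rig_joints: dict, custom: AvatarCustomization) -> None:
        """Write the per-limb scalars into a joint table (a stand-in for a real rig)."""
        for limb, settings in custom.limbs.items():
            for joint_name in rig_joints:
                if joint_name.startswith(limb):
                    rig_joints[joint_name]["scale"] = settings.scale if settings.visible else 0.0

    # One line of user preference, no bespoke avatar model required.
    me = AvatarCustomization(mobility_aid="wheelchair")
    me.limbs["right_arm"].visible = False

The plumbing really is about this thin; the work is in the one shared asset, which is where the eight-hour estimate goes.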

But let’s get back to the excessively normalized body language. Hi. *waves autistically* Please do not force virtual-me to make eye contact or generate normative facial expressions. Thank you. These are literally points of conflict for us our entire lives. Other people spend way too much of their time and effort trying to normalize our gaze and our external expression of internal states. As the developers of this new virtual communication medium, you have the power to force us into normative skins in order to connect with friends, family, and coworkers. Please don’t. It’s absolutely horrifying to know that if I were to connect with Facebook Spaces, my autistic way of moving and being would be literally and deliberately erased by your platform.
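
And to be concrete about how small an ask this is: forced eye contact is one branch of a user preference, not a law of the medium. The sketch below is hypothetical (none of these names come from the Oculus SDK or Facebook Spaces), but it shows the shape of an opt-out.

    from dataclasses import dataclass
    from enum import Enum

    class GazeBehavior(Enum):
        MIRROR_TRACKING = "mirror"  # drive the avatar's gaze from the user's own head/eye tracking
        FORCED_CONTACT = "forced"   # the behavior described on the panel: always lock onto the partner

    @dataclass
    class ExpressionSettings:
        gaze: GazeBehavior = GazeBehavior.MIRROR_TRACKING
        synthesize_facial_expressions: bool = False  # off by default: don't invent expressions from gestures

    def avatar_gaze_target(settings: ExpressionSettings, partner_position, tracked_gaze):
        """Choose where the avatar looks, respecting the user's preference."""
        if settings.gaze is GazeBehavior.FORCED_CONTACT:
            return partner_position
        return tracked_gaze  # pass through what the user is actually doing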

Empathy machine? How are you going to cultivate empathy if you make us all look exactly the same? How are you going to cultivate empathy by building toys that let others put us on like a costume, while denying us access to navigating virtual worlds in our own skins?

Read the original post here.

1 comment:

  1. I am rather surprised by your harsh criticism. VR is still a very young tech and still developing. Of course it is going to be developed for the majority and for the studied and defined aspects of society. It is also going to be developed for mass appeal. I understand the stress of something like eye contact, but by the same token, others suffer stress because of a lack of eye contact. “Everyone just get along” is not a solution to communication; compromise and communicating is the solution to communication. As time goes on, I believe more cultural and behavioral types will be developed. In fact, outside of business applications, you are going to see demand alone generate different features, because people will want them, just like in other online worlds and games. I think you are reading a bit too much into this early introduction. It reminds me of when voice recognition was first introduced versus today (though that still needs more improvement too); it has come a long way in a relatively short time for a complex tech.
