
How do mobile devs make privacy work?

April 30, 2017

I recently published a pair of articles with Katie Shilton exploring how mobile app developers help each other learn what privacy means and how to build that abstract value into their software. Katie and I analyzed hundreds of forum conversations about privacy among iOS and Android developers, and compared the different development cultures and privacy values that arose around each platform.

Our first piece, in the Journal of Business Ethics, explores the work practices that trigger privacy conversations in different development ecosystems. We found that

The rules, regulations, and cultural norms that govern each ecosystem impact day-to-day work practices for mobile developers. These differing work practices in turn shape the ethical deliberations engaged in by forum participants, addressing the question of why privacy is debated—and ultimately designed for—so differently between the two ecosystems.

Ultimately, even though the prompts for ethical debates among developers differed widely between iOS and Android, the tactics they used to convince each other that privacy was an important problem for design were remarkably similar. This means that Apple and Google are important privacy regulators, because the way they structure their development environments influences when and how app developers think about privacy.

Our second piece, in New Media & Society, takes up exactly that question: how Google and Apple regulate the thousands of app developers who do not work for them, and how platform values become developer values. Platforms need apps to attract users, but developers have to play by platforms’ rules and learn their values in order to get to market. Our close study of developer cultures helped answer an important question in privacy research, namely: why do Android apps leak so much more consumer data than iOS apps? We show that Apple’s close but, to developers, somewhat capricious regulation of submitted apps leads to an ‘invisible fence’ effect, where developers are constantly working together to figure out what Apple means by ‘privacy’ so that they can get their apps approved and into the market. In contrast, Android’s relatively open and lax development environment leads to a Wild West atmosphere where anything goes and where developers work together with highly skilled users to build defensive measures against perceived threats, including Google. While average users might be left out of this digital arms race, that’s more of a feature than a bug for Android app developers:

For devs and skilled hobbyists, Android enabled access to privacy-enhancing applications, limited only by skill and literacy. The charms—but also the underlying inaccessibility—of Android privacy were summed up by mavenz in a 2013 thread about privacy problems in the Facebook app: “Whatevs. This is why i <3 android. I can just hack something better.”

As researchers interested in ethical challenges in new media environments, we also reflected on the ethics of researching public forums. And we hope we can help move that important methodological conversation forward as well.

FATE postdoctoral position opens at MSR NYC

March 20, 2017

Exciting news! The FATE group (Fairness, Accountability, Transparency, and Ethics) at Microsoft Research New York City (MSR NYC) is looking for a postdoctoral researcher to start in July 2017. This one-year position is an ideal opportunity for an emerging scholar whose work focuses on the social impacts of machine learning and AI. Application deadline: April 3, 2017.

Postdoctoral researchers receive a competitive salary and benefits package, and are eligible for relocation expenses. Candidates must have completed their PhD, including submission of their dissertation, prior to joining MSR NYC (i.e., PhD submitted and preferably conferred by July 2017). We encourage candidates with tenure-track job offers from other institutions to apply, provided they are able to defer their start date to accept our position.

Microsoft Research provides a vibrant multidisciplinary research environment, with an open publications policy and close links to top academic institutions around the world. Postdoctoral researcher positions provide emerging scholars with an opportunity to develop their research career and to interact with some of the top minds in the research community. Postdoctoral researchers define their own research agenda. Successful candidates will have a well-established research track record, evidenced by top-tier journal or conference publications, as well as a strong service record (e.g. participation on program committees, editorial boards, and advisory panels).

While each of the Microsoft Research labs has openings in a variety of disciplines, this position with the FATE group at MSR NYC is specifically intended for researchers who are interested in challenges related to fairness, accountability, transparency, and ethics in machine learning and AI. The FATE group includes Kate Crawford, Hanna Wallach, and Solon Barocas, among others. For a sampling of recent publications, see their respective websites.

We will consider candidates with a background in a technical field (such as machine learning, AI, or NLP) as well as candidates who study socio-technical questions in the fields of anthropology, media studies, sociology, science and technology studies, and related fields.

The ideal candidate should have a demonstrated interest in the social impacts of machine learning and AI, and be interested in working in a highly interdisciplinary environment that includes computer scientists, social scientists, critical humanists, and economists.

To apply, please submit an online application on the Microsoft Research Careers website: https://careers.research.microsoft.com/

In addition to your CV and names of three or more referees (including your dissertation advisor), please upload the following materials:

* Two journal/conference publications, articles, thesis chapters, or equivalent work samples (uploaded as two separate attachments).

* A research statement (maximum length three pages) that 1) outlines your research agenda (~one page); 2) provides a description and, if appropriate, a chapter outline of your dissertation (~one page); and 3) offers an explanation of how your research agenda relates to fairness, accountability, transparency, and ethics (~one page).

Please indicate that your location preference is “New York” and include “Kate Crawford” as the name of your Microsoft Research contact (you may include additional contacts as well). Note: if you do not do this, it is *very unlikely* that we will receive your application.

After you submit your application, a request for letters will be sent to your referees on your behalf. You may wish to alert your referees in advance so that they are ready to submit their letters by April 3, 2017. You can check the progress of individual letter requests by clicking the “status” tab within your application page.

Microsoft is committed to building a culturally diverse workforce and strongly encourages applications from women and minorities.

 

Introducing our SMC interns for summer 2017!

March 8, 2017

We get the sharpest, most impressive crop of applicants for our Social Media Collective internship, and it is no easy task to turn away so many extremely promising PhD students. But it is a pleasure to introduce those we did select. (Keep in mind that we offer these internships every summer; if you will be an advanced graduate student in our field in the summer of 2018, keep an eye on this blog or on this page for updates about the next deadline.) For 2017, we are proud to have the following young scholars joining us:

At Microsoft Research New England:

Ysabel Gerrard is a PhD Candidate in the School of Media and Communication, University of Leeds. Her doctoral thesis examines teen drama fans’ negotiations of their (guilty) pleasures in an age of social media. In addition to her research and teaching, Ysabel is the Young Scholars’ Representative for ECREA’s Digital Culture and Communication section, and is currently co-organising the Data Power Conference 2017 (along with two others). She has published in the Journal of Communication Inquiry and has presented her work at numerous international conferences, such as ECREA (European Communication Research and Education Association) and Console-ing Passions. Ysabel will be investigating Instagram and Tumblr’s responses to public discourses about eating disorders.

 

Elena Maris is a PhD Candidate at the Annenberg School for Communication at the University of Pennsylvania. Her research examines the ways media industries and audiences work to influence one another, with a focus on technological strategies and the roles of gender and sexuality. She also studies the ways identity is represented and experienced in popular culture, often writing about race, gender and sexuality in television, fandom and Hip-Hop. Her work has been published in Critical Studies in Media Communication and the European Journal of Cultural Studies.

 

At Microsoft Research New York City:

Aaron Shapiro is a PhD candidate at the University of Pennsylvania’s Annenberg School for Communication. He also holds an M.A. in Anthropology and a Graduate Certificate in Urban Studies. Aaron previously worked as a field researcher and supervisor at NO/AIDS Task Force in New Orleans, conducting social research with communities at high risk for HIV. His current research addresses the cultural politics of urban data infrastructures, focusing on issues of surveillance and control, labor subjectivities, and design imaginaries. His work has been published in Nature, Space & Culture, Media, Culture & Society, and New Media & Society. He will be working on a study about bias in machine learning.

What’s queer about the internet now?

February 28, 2017

This past month, I organized the Queer Internet Studies Workshop with my longtime friend and collaborator, Jack Gieseking, and Anne Esacove at the Alice Paul Center at UPenn. This was the second QIS (the first was in 2014 at Columbia), and our plan was to organize a day-long series of conversations, brainstorming sessions, panels, and chats dedicated to broadening thinking about the internet. Rather than a formal conference of people presenting their research, QIS is intended (1) to identify what a queer internet might look like, (2) to give a sense of the research being done in this area, and (3) to collaborate on artistic, activist, and academic projects.

We’ve been lucky to have folks post some terrific blog posts about the event, but here’s a quick recap. After opening the day with group discussions about what queer internet studies might be and how (or whether) we could study it, a carefully curated group of researchers and activists shared their expertise in a facet of queerness and media:

  • Mia Fischer talked about intersections between trans people and surveillance studies.
  • Oliver Haimson described his work on trans identity and social media.
  • Carmen Rios spoke about online communities and feminist politics.
  • Adrienne Shaw shared her work about building an LGBT games archive.
  • Mitali Thakor shared her work on digital vigilante-ism against sex trafficking.

Artist and academic TL Cowan led a participatory workshop called “Internet of Bodies/Internet of Bawdies.” Part theoretical inquiry, part brainstorming session, and part rapid-prototyping exercise, the workshop offered an embodied means of working through sexuality, performativity, and technological change.

Rather than a traditional keynote, we asked Katherine Sender to act as an interlocutor for Shaka McGlotten. Their dialogue ranged from racism and desire to sped-up and slowed-down experiences of intimacy, from surveillance and performativity to social media platform politics. In this freeform conversation, Sender and McGlotten both addressed and reworked themes that had surfaced throughout the day around queerness, technology, and desire.

We closed the day by breaking into groups to talk about outcomes, which included pooling resources to develop syllabuses and course materials, collaborating on a special issue, and developing best practices around respecting privacy and ownership of online content. I’m excited to see where these plans and provocations end up in the coming months. A huge thanks to my co-organizers, the attendees and speakers, and our sponsors. In 2017, it’s clear that we need spaces for queerness and media provocation more than ever, and it’s my hope that QIS can continue to be a space for those connections and creativity, both as a physical meetup and as a chance to build enduring social ties.

Learn It, Buy It, Work It! Performing Pregnancy on Instagram

January 20, 2017

Katrin Tiidenberg (Aarhus University, Denmark and Tallinn University, Estonia) and SMC Principal Researcher Nancy Baym (Microsoft Research, New England) have recently published an article in Social Media + Society that analyzes how pregnancy is performed on Instagram. According to Tiidenberg and Baym,

‘Pregnancy today is highly visible, intensely surveilled, marketed as a consumer identity, and feverishly stalked in its celebrity manifestations. This propagates narrow visions of what a “normal” pregnancy or “normal” pregnant woman should be like.’

Drawing on Tiidenberg’s work during her Ph.D. internship with the SMC (2014), the article asks:

‘[W]hether they [women] rely on and reproduce pre-existing discourses aimed at morally regulating pregnancy, or reject them and construct their own alternatives.’

You can read their findings here.

“Just how artificial is Artificial Intelligence?”

January 9, 2017

SMC member Mary L. Gray (Microsoft Research, New England; Berkman Klein Center for Internet and Society) and colleague Siddharth Suri (Microsoft Research, New York) have published an article for the Harvard Business Review asking, “just how artificial is Artificial Intelligence?”

Whether it is Facebook’s trending topics; Amazon’s delivery of Prime orders via Alexa; or the many instant responses of bots we now receive in response to consumer activity or complaint, tasks advertised as AI-driven involve humans, working at computer screens, paid to respond to queries and requests sent to them through application programming interfaces (APIs) of crowdwork systems. The truth is, AI is as “fully-automated” as the Great and Powerful Oz was in that famous scene from the classic film, where Dorothy and friends realize that the great wizard is simply a man manically pulling levers from behind a curtain.

For Gray and Suri, the mythos of “full-automation” is akin to the Great and Powerful Oz, famously depicted as a man “manically pulling levers from behind a curtain” in the classic American film.

This blend of AI and humans, who follow through when the AI falls short, isn’t going away anytime soon. Indeed, the creation of human tasks in the wake of technological advancement has been a part of automation’s history since the invention of the machine lathe.

Full text of the article is available here.

Amplifying the Presence of Women in STEM

December 8, 2016

December 7-13th is Computer Science Education Week!

Recently, feminist media scholars have demanded we take seriously the dearth of women and people of color in computing fields. This week presents the opportunity to broadcast professional role models to inspire young minority techies in pursuit of their STEM dreams, both in industry and in academia.


Source: Microsoft Corporate Blogs

Mary L. Gray, senior researcher at the Social Media Collective, was recently featured in Microsoft’s “17 for ’17: Microsoft researchers on what to expect in 2017 and 2027,” which sought to work against this gap by highlighting 17 women from within its global research organization.

Mary offers insights on the digital world we should anticipate over the next decade and where to position ourselves as scholars.