Well, after a truly exciting spell of reviewing an AMAZING set of applications for our 2015 PhD Internship Program, we had the absolutely excruciating task of selecting just a few from the pool (note: this is our Collective’s least favorite part of the process).
Without further ado, we are pleased to announce our 2015 Microsoft Research SMC PhD interns:
At MSR New England:
Aleena Chia is a Ph.D. Candidate in Communication and Culture at Indiana University. Her ethnographic research investigates the affective politics and moral economies of participatory culture, in the context of digital and live-action game worlds. She is a recipient of the Wenner-Gren Dissertation Fieldwork grant and has published work in American Behavioral Scientist. Aleena will be working with Mary L. Gray, researching connections between consumer protests, modularity of consumer labor, and portability of compensatory assets in digital and live-action gaming communities.
Stacy Blasiola is a Ph.D. Candidate in the Department of Communication at the University of Illinois at Chicago and also holds an M.A. in Media Studies from the University of Wisconsin at Milwaukee. Stacy uses a mixed-methods approach to study the social impacts of algorithms. Using big data methods, she examines how news events appear in newsfeeds, and using qualitative methods she investigates how the people who use digital technologies understand, negotiate, and challenge the algorithms that present digital information. As a recipient of a National Science Foundation IGERT Fellowship in Electronic Security and Privacy, her work includes approaching algorithms, and the databases that enable them, from a privacy perspective. Stacy will be working with Nancy Baym and Tarleton Gillespie on a project that analyzes the discursive work Facebook performs with regard to its social newsfeed algorithm.
Nathan Matias is a Ph.D. Student at the MIT Media Lab Center for Civic Media, a fellow at the Berkman Center for Internet and Society, and a DERP Institute fellow. Nathan researches technology for civic cooperation, activism, and expression through qualitative action research with communities, data analysis, software design, and field experiments. Most recently, Nathan has been conducting large-scale studies and interventions on the effects of gender bias, online harassment, gratitude, and peer thanks in social media, corporations, and creative communities like Wikipedia. Nathan was a MSR Fuse Labs intern in 2013 with Andrés Monroy Hernández, where he designed NewsPad, a collaborative technology for neighborhood blogging. Winner of the ACM’s Nelson Prize, Nathan has published data journalism, technology criticism, and literary writing for the Atlantic, the Guardian, and PBS. Before MIT, he worked at technology startups Texperts and SwiftKey, whose products have reached over a hundred million people worldwide. At MSR, Nathan will be working with Tarleton Gillespie and Mary L. Gray, studying the professionalization of digital labor among community managers and safety teams in civic, microwork, and peer economy platforms. He will also be writing about ways that marginalized communities use data and code to respond and reshape their experience of harassment and hate speech online.
At MSR New York City:
Ifeoma Ajunwa is a Paul F. Lazarsfeld Fellow in the Sociology Department at Columbia University. She received her MPhil in Sociology from Columbia University in 2012. She was the recipient of the AAUW Selected Professions Fellowship in law school, after which she practiced business law, international law, and intellectual property law. She has also conducted research for such organizations as the NAACP, the United Nations Human Rights Council, the ACLU of NY (the NYCLU), and UNESCO. Her independent research before graduate school includes a pilot study at Stanford Law School in which she interrogated the link between stereotype threat and the intersecting dynamics of gender, race, and economic class in relation to Bar exam preparation and passage. Ifeoma’s writing has been published in the NY Times and the Huffington Post, and she has been interviewed for Uptown Radio in NYC. She will be working with Kate Crawford at MSR-NYC on data discrimination.
On March 16, Facebook updated its “Community Standards” in ways that were both cosmetic and substantive. The version it replaced, though it had enjoyed minor updates, had been largely the same since at least 2011. The change comes on the heels of several other sites making similar adjustments to their own policies, including Twitter, YouTube, Blogger, and Reddit – and after months, even years, of growing frustration and criticism on the part of social media users about platforms and their policies. This frustration and criticism cuts in two directions: sometimes it targets restrictions that are overly conservative, picky, vague, or unclear; but sometimes it charges that these policies fall far short of protecting users, particularly from harassment, threats, and hate speech.
“Guidelines” documents like this one are an important part of the governance of social media platforms; though the “terms of service” are a legal contract meant to spell out the rights and obligations of both the users and the company — often to impose rules on users and indemnify the company against any liability for their actions — it is the “guidelines” that are more likely to be read by users who have a question about the proper use of the site, or find themselves facing content or other users that trouble them. More than that, they serve a broader rhetorical purpose: they announce the platform’s principles and gesture toward the site’s underlying approach to governance.
Facebook described the update as a mere clarification: “While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.” Most of the coverage among the technology press embraced this idea (like here, here, here, here, here, and here). But while Facebook’s policies may not have changed dramatically, so much is revealed in even the most minor adjustments.
First, it’s revealing to look not just at what the rules say and how they’re explained, but how the entire thing is framed. While these documents are now ubiquitous across social media platforms, it is still a curiosity that these platforms so readily embrace and celebrate the role of policing their users – especially amidst the political ethos of Internet freedom, calls for “Net neutrality” at the infrastructural level, and the persistent dreams of the open Web. Every platform must deal with this contradiction, and they often do it in the way they introduce and describe guidelines. These guidelines pages inevitably begin with a paragraph or more justifying not just the rules but the platform’s right to impose them, including a triumphant articulation of the platform’s aspirations.
Before this update, Facebook’s rules were justified as follows: “To balance the needs and interests of a global population, Facebook protects expression that meets the community standards outlined on this page.” In the new version, the priority has shifted from protecting speech to ensuring that users “feel safe”: “Our goal is to give people a place to share and connect freely and openly, in a safe and secure environment.” I’m not suggesting that Facebook has stopped protecting speech in order to protect users. All social media platforms struggle to do both. But which goal is most compelling, which is held up as the primary justification, has changed.
This emphasis on safety (or, more accurately, the feeling of safety) is also evident in the way the rules are now organized. What were, in the old version, eleven rule categories are now fifteen, grouped into four broad categories – the first of which is “keeping you safe.” This is indicative of the effect of the criticisms of recent years: that social networking sites like Facebook and Twitter have failed users, particularly women, in the face of vicious trolling.
As for the rules themselves, it’s hard not to see them as the aftermath of the many controversies that have dogged the social networking site over the years. Facebook’s Community Standards increasingly look like a historic battlefield: while it may appear to be a bucolic pasture, the scars of battle remain visible, carved into the land, thinly disguised beneath the landscaping and signage. Some of the most recent skirmishes are now explicitly addressed: a new section on sexual violence and exploitation includes language prohibiting revenge porn. The rule against bullying and harassment now includes a bullet point prohibiting “Images altered to degrade private individuals,” a clear reference to the Photoshopped images of bruised and battered women that were deployed (note: trigger warning) against Anita Sarkeesian and others in the Gamergate controversy. The section on self-injury now includes a specific caveat that body modification doesn’t count.
In this version, Facebook seems extremely eager to note that contentious material is often circulated for publicly valuable purposes, including awareness raising, social commentary, satire, and activism. A version of this appears again and again, as part of the rules against graphic violence, nudity, hate speech, self-injury, dangerous organizations, and criminal activity. In most cases, these socially valuable uses are presented as a caveat to an otherwise blanket prohibition: even hate speech, which is prohibited almost entirely and in the strongest terms, now has a caveat protecting users who circulate examples of hate speech for the purposes of education and raising awareness. It is clear that Facebook is ever more aware of its role as a public platform, where contentious politics and difficult debate can occur. Now it must offer to patrol the tricky line between the politically contentious and the culturally offensive.
Oddly, in the rule about nudity, and only there, the point about socially acceptable uses is not a caveat, but part of an awkward apology for imposing blanket restrictions anyway: “People sometimes share content containing nudity for reasons like awareness campaigns or artistic projects. We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes.” Sorry, Femen. On the other hand, apparently it’s okay if it’s cartoon nudity: “Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes.” A nod to Charlie Hebdo, perhaps? Or just a curious inconsistency.
The newest addition to the document, and the one most debated in the press coverage, is the new way Facebook now articulates its long-standing requirement that users use their real identity. The rule was recently challenged by a number of communities eager to use Facebook under aliases or stage names, as well as by communities (such as Native Americans) who find themselves on the wrong side of Facebook’s policy simply because the traditions of naming in their culture do not fit Facebook’s. After the 2014 scuffle with drag queens about the right to use a stage identity instead of or alongside a legal one, Facebook promised to make its rule more accommodating. In this update, Facebook has adopted the phrase “authentic identity,” its way of allowing adopted performance names while continuing to prohibit duplicate accounts. The update is also a chance for the company to re-justify its rule: at more than one point in the document, and in the accompanying letter from Facebook’s content team, this “authentic identity” requirement is presented as assuring responsible and accountable participation: “Requiring people to use their authentic identity on Facebook helps motivate all of us to act responsibly, since our names and reputations are visibly linked to our words and actions.”
There is also some new language in an even older battle: for years, Facebook has been removing images of women breastfeeding as a violation of its rules against nudity. This has long angered a community of women who strongly believe that sharing such images is not only their right, but important for new mothers and for the culture at large (only in 2007, 2008, 2010, 2011, 2012, 2013, 2014, 2015…). After years of disagreements, protests, and negotiations, in 2014 Facebook published a special rule saying that it would allow images of breastfeeding so long as they did not include an exposed nipple. This was considered a triumph by many involved, though reports continue to emerge of women having photos removed and accounts suspended despite the promise. This assurance reappears in the new version of the community standards just posted: “We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring.” The Huffington Post reads this as (still) prohibiting breastfeeding photos if they include an exposed nipple, but if the structure of this sentence is read strictly, the promise to “always” allow photos of women breastfeeding seems to me to trump the previous phrase about exposed nipples. I may be getting nitpicky here, but only as a result of years of back and forth about the precise wording of this rule, and Facebook’s willingness and ability to honor it in practice.
In my own research, I have tracked the policies of major social media platforms, noting both the changes and continuities, the justifications and the missteps. One could dismiss these guidelines as mere window dressing — as a performed statement of coherent values that do not in fact drive the actual enforcement of policy on the site, which so often turns out to be more slapdash or strategic or hypocritical. I find it more convincing to say that these are statements of both policy and principle that are struggled over at times, are deployed when they are helpful and can be sidestepped when they’re constraining, and that do important discursive work beyond simply guiding enforcement. These guidelines matter, and not only when they are enforced, and not only for lending strength to the particular norms they represent. Platforms adjust their guidelines in relation to each other, and smaller sites look to the larger ones for guidance, sometimes borrowing them wholesale. The rules as articulated by Facebook matter well beyond Facebook. And they perform, and therefore reveal in oblique ways, how platforms see themselves in the role of public arbiters of cultural value. They are also by no means the end of the story, as no guidelines in the abstract could possibly line up neatly with how they are enforced in practice.
Facebook’s newest update is consistent with changes over the past few years on many of the major sites, a common urge to both impose more rules and use more words to describe them clearly. This is a welcome adjustment, as so many of the early policy documents, including Facebook’s, were sparse, abstract, and unprepared for the variety and gravity of questionable content and awful behavior they would soon face. There are some laudable principles made explicit here. On the other hand, adding more words, more detailed examples, and further clarifications does not – cannot – resolve the other challenge: these are still rules that must be applied in specific situations, requiring judgment calls made by overworked, freelance clickworkers. And, while it is a relief to see Facebook and other platforms taking a firmer stand on issues like misogyny, rape threats, trolling, and self-harm, these firmer stands are often accompanied by ever more restriction not just of bad behavior but of questionable content, where the mode of ‘protection’ means something quite different, much more patronizing. The basic paradox remains: these are private companies policing public speech, often intervening according to a culturally specific or a financially conservative morality. It is the next challenge for social media to strike a better balance in this regard: more effectively intervening to protect users themselves, while intervening less on behalf of users’ values.
This is cross-posted on the Culture Digitally blog.
We wanted to post a quick update on the status of the 2015 SMC PhD Internship Program. The application season closed January 31 and we ended up with more than 240 stellar candidates to the program. Thank you for your patience with our application process and please forgive the delays in sending an update.
The SMC was humbled and tickled pink by the quality of the applications that we received for the PhD internship this year. It’s always hard to let go of such a range of incredible work in our midst, and that made it very difficult to reach even a short list of applicants to interview, let alone select three final candidates. We have reached out to finalists and are in the thick of finalizing offers. If you are reading this message and have not heard from us until now, I’m afraid that means we could not place you with us this year. And, due to the large number of applications, we cannot offer reviews of individual applications.
We will announce the 2015 PhD intern recipients in June here on the Social Media Collective blog. The 2016 PhD internship and Postdoc application rounds will open, again, in Fall 2015 with an announcement on the SMC blog.
Please know that this was an extremely competitive pool. You all are doing a LOT of amazing work out there! We very much appreciate the applications, welcome the opportunity to learn about your work, and encourage you to try, again, next year if you fit the criteria. Your applications leave us very excited about the direction of social media scholarship.
We look forward to crossing paths with you at conferences, in journal pages, and online.
Mary L. Gray (on behalf of the SMC)
In the fall of 2009, I sent the manuscript for my book Personal Connections in the Digital Age off to press. I’ve just finished the index for a second edition, which will be published by Polity in mid-2015 (in time for fall classes!). The list of terms I added provides a fun little peek into the last 5 years of digital media and social life. What would you have added? What words would you expect to see in a third edition?
FourSquare (also: Swarm)
* APPLICATION DEADLINE: JANUARY 31, 2015 *
Microsoft Research New England (MSRNE) is looking for advanced PhD students to join the Social Media Collective for its 12-week 2015 Summer Intern Program. The Social Media Collective scholars at MSRNE bring together empirical and critical perspectives to address complex socio-technical issues. Our research agenda draws on a social scientific/humanistic lens to understand the social meanings and possible futures of technology. The ideal candidate may be trained in any number of disciplines (including anthropology, communication, information studies, media studies, sociology, science and technology studies, or a related field), but should have a strong social scientific or humanistic methodological, analytical, and theoretical foundation, be interested in questions related to media or communication technologies and society or culture, and be interested in working in a highly interdisciplinary environment that includes computer scientists, mathematicians, and economists.
MSRNE internships are 12-week paid internships in Cambridge, Massachusetts. PhD interns are expected to be on-site for the duration of their internship. Primary mentors for this year will be Nancy Baym and Mary L. Gray, with additional guidance offered by our lab postdocs and visiting scholars.
PhD interns at MSRNE are expected to devise and execute a research project (see project requirements below), based on their application project proposals, during their internships. The expected outcome of an internship at MSRNE is a publishable scholarly paper for an academic journal or conference of the intern’s choosing. Our goal is to help the intern advance their own career; interns are strongly encouraged to work towards a publication outcome that will help them on the academic job market. Interns are also expected to collaborate on projects or papers with full-time researchers and visitors, contribute to the SMC blog, give short presentations, attend the weekly lab colloquia, and contribute to the life of the community through weekly lunches with fellow PhD interns and the broader lab community. While this is not an applied program, MSRNE encourages interdisciplinary collaboration with computer scientists, economists, and mathematicians.
We are looking for applicants to focus their proposals on one of the following six areas:
1) The politics of big data, algorithms, and computational culture
2) Affective, immaterial, and other frameworks for understanding digital labor
3) The social and political consequences of popular computing folklore
4) Personal relationships and digital media
5) How online technologies shape countercultures and communities of alterity
6) Histories of computing and the internet that focus on the experiences of people from marginalized social, economic, racial, or geographic groups
Applicants should have advanced to candidacy in their PhD program by the time they start their internship (unfortunately, there are no opportunities for Master’s students or early PhD students at this time). Interns will benefit most from this opportunity if there are natural opportunities for collaboration with other researchers or visitors currently working at MSRNE. Applicants from historically marginalized communities, underrepresented in higher education, and students from universities outside of the United States are encouraged to apply.
PEOPLE AT MSRNE SOCIAL MEDIA COLLECTIVE
The Social Media Collective at New England comprises researchers, postdocs, and visitors. Current members include:
– Principal Researcher Nancy Baym (http://www.nancybaym.com/)
– Senior Researcher Mary L. Gray (http://marylgray.org/)
– Postdoctoral Researcher Kevin Driscoll (http://kevindriscoll.info/)
– Postdoctoral Researcher Jessa Lingel (http://jessalingel.tumblr.com/)
For a complete list of all permanent researchers and current postdocs based at the New England lab, see: http://research.microsoft.com/en-us/labs/newengland/people/bios.aspx
Previous MSRNE interns in the collective have included Amelia Abreu (UWashington, information), Jed Brubaker (UC-Irvine, informatics), Jade Davis (University of North Carolina, communication), Brittany Fiore-Silfvast (University of Washington, communication), Scott Golder (Cornell, sociology), Germaine Halegoua (U. Wisconsin, communications), Tero Karppi (University of Turku, media studies), Airi Lampinen (HIIT, information), Jessa Lingel (Rutgers, library and information science), Joshua McVeigh-Schultz (University of Southern California, interactive media), Alice Marwick (NYU, media culture communication), Jolie Matthews (Stanford, learning sciences), Tressie McMillan Cottom (Emory, sociology), Laura Noren (NYU, sociology), Jaroslav Svelch (Charles University, media studies), Katrin Tiidenberg (Tallinn University, Institute of International and Social Studies), Shawn Walker (UWashington, information), Omar Wasow (Harvard, African-American studies), Sarita Yardi (GeorgiaTech, HCI), and Kathryn Zyskowski (University of Washington, anthropology).
For more information about the Social Media Collective, visit our blog: http://socialmediacollective.org/
To apply for a PhD internship with the social media collective:
1. Fill out the online application form: https://research.microsoft.com/apps/tools/jobs/intern.aspx
Make sure to indicate that you prefer Microsoft Research New England and “social media” or “social computing.” You will need to list two recommenders through this form. Make sure your recommenders respond to the request for letters so that their letters are also submitted by the deadline.
You will need to include:
a. A brief description of your dissertation project.
b. An academic article-length manuscript that you have written (published or unpublished) that demonstrates your writing skills.
c. A copy of your CV.
d. A pointer to your website or other online presence (if available).
e. A short description (no more than 2 pages, single spaced) of 1 or 2 projects that you propose to do while interning at MSRNE, independently and/or in collaboration with current SMC researchers. The project proposals can be related to but must be distinct from your dissertation research. Be specific and tell us: 1) What is the research question animating your proposed project? 2) What methods would you use to address your question? 3) How does your research question speak to the interests of the SMC? and 4) Who do you hope to reach (who are you engaging) with this proposed research? This is important – we really want to know what it is you want to work on with us and we need to know that it is not, simply, a continuation of your dissertation project.
We will begin considering internship applications on Feb 1, and will not consider late applications.
PREVIOUS INTERN TESTIMONIALS
“The internship at Microsoft Research was all of the things I wanted it to be – personally productive, intellectually rich, quiet enough to focus, noisy enough to avoid complete hermit-like cave dwelling behavior, and full of opportunities to begin ongoing professional relationships with other scholars who I might not have run into elsewhere.”
— Laura Noren, Sociology, New York University
“If I could design my own graduate school experience, it would feel a lot like my summer at Microsoft Research. I had the chance to undertake a project that I’d wanted to do for a long time, surrounded by really supportive and engaging thinkers who could provide guidance on things to read and concepts to consider, but who could also provoke interesting questions on the ethics of ethnographic work or the complexities of building an identity as a social sciences researcher. Overall, it was a terrific experience for me as a researcher as well as a thinker.”
— Jessica Lingel, Library and Information Science, Rutgers University
“Spending the summer as an intern at MSR was an extremely rewarding learning experience. Having the opportunity to develop and work on your own projects as well as collaborate and workshop ideas with prestigious and extremely talented researchers was invaluable. It was amazing how all of the members of the Social Media Collective came together to create this motivating environment that was open, supportive, and collaborative. Being able to observe how renowned researchers streamline ideas, develop projects, conduct research, and manage the writing process was a uniquely helpful experience – and not only being able to observe and ask questions, but to contribute to some of these stages was amazing and unexpected.”
— Germaine Halegoua, Communication Arts, University of Wisconsin-Madison
“Not only was I able to work with so many smart people, but the thoughtfulness and care they took when they engaged with my research can’t be stressed enough. The ability to truly listen to someone is so important. You have these researchers doing multiple, fascinating projects, but they still make time to help out interns in whatever way they can. I always felt I had everyone’s attention when I spoke about my project or other issues I had, and everyone was always willing to discuss any questions I had, or even if I just wanted clarification on a comment someone had made at an earlier point. Another favorite aspect of mine was learning about other interns’ projects and connecting with people outside my discipline.”
— Jolie Matthews, Education, Stanford University