Today at the Berkman Center, our summer PhD Interns gave a series of short talks describing our research and asking for feedback from the Berkman community. This liveblog summarizes the talks and the Q&A (special thanks to Willow Brugh for collaborating on this post).
Mary Gray, senior researcher at Microsoft Research, opened up the conversation by sharing more about the PhD internship. “We need folks who can do bridge work, who can work between university and industry settings.” Each student’s project is taking a tack that is less common; there’s mostly a social-critical approach. Our group is particularly focused on showing the value of methodologies that are less familiar in industry settings. It’s a twelve-week program, and it doesn’t always happen in the summer. “We’re always interested in people who want to take a more critical/qualitative approach. We have labs all over the world, and each lab accepts up to 100 PhD students to do this kind of work,” Mary says.
Microsoft Research is (sadly) unique in that everything a student does is open for public consumption, says Mary. PhD students are encouraged to do work that feeds academic conversations while also potentially connecting with product groups that could benefit from that insight.
Quantified Self: The Hidden Costs of Knowledge
What are the privacy ramifications of our voracious appetite for data, what are the challenges of interpreting it, and how might data be employed to widen inequality? Ifeoma Ajunwa is a 5th year PhD candidate in Sociology at Columbia University. Recurring themes in her research include inequality, data discrimination and emerging bioethics debates arising from the exploitation of Big Data.
“Almost everything we do generates data,” Ifeoma says, quoting Gary Wolf’s WIRED magazine article on the quantified self. And yet this kind of data collection can be a form of surveillance; companies can often crawl this data from the Internet and use it to feed algorithms that influence our lives. Against this backdrop, people are also collecting data about themselves through the Quantified Self movement, data that could also be captured by these companies and used for purposes beyond our consent.
How can our data be used against us? Kate Crawford noted in a recent Atlantic article that this data has been used in courtrooms. Ifeoma also expresses worries that this data could be used against people as companies use it to limit their own risk. The “quantified self” has a dual meaning. On one hand, it refers to the self knowledge that comes from that data. On the other, this idea could turn against people as institutions set policies based on that data that widen inequality.
Ifeoma describes criminal records as a “modern-day scarlet letter” that ensures that people are shut out of opportunities. In genetics, Ifeoma describes the idea of “genetic coercion.”
In her summer research with Kate Crawford at MSR, Ifeoma is looking at the quantification of work. Unlike Taylorism, where the focus was on breaking down the job task itself, the focus now is on “the individual worker’s body” and “inducing the worker to master their own body for the benefit of the corporation.” In this “surveillance-innovation complex,” companies try to evade regulation by seeking protections for innovation. Ifeoma and Kate are looking specifically at workplace wellness programs that address health, diet, and exercise, tracking weight, spending habits, and more. Ifeoma is asking what companies track and how the interpretation of this data can impact the workers it’s generated from.
She concludes by asking us, “How can we make technology work for us, rather than against us? Could we harness large and small data without it increasing divides and discrimination?”
News Feed: Created By You?
How do people enact privacy? That’s the question Stacy Blasiola usually asks in her research. When you’re posting something online while at a bar, are you thinking about who sees it? But this framing misses the role that platforms play in shaping privacy, a gap she’s taking on at MSR.
Stacy Blasiola is a PhD candidate at the University of Illinois at Chicago and a National Science Foundation IGERT Fellow in Electronic Security and Privacy.
This summer, Stacy will be looking at the Facebook NewsFeed algorithm. She talks about the Facebook Tips page, where Facebook provides information on how to find out who your friends are and how the NewsFeed works. Stacy shows us several videos that Facebook posted under “News Feed: created by you.” These videos were promoted by Facebook to its users and received millions of views.
Tim: “I made my News Feed about wellness, nutrition and living my best.” Create a News Feed that inspires you.
Posted by Facebook Tips on Tuesday, December 2, 2014
Stacy has been looking at the relationship between the videos and the comments. In the videos, users describe their feeds in aspirational terms: “Surrounding myself with… knowledge and expertise. I want to know what you know.” “I look forward to seeing my best self every day.”
According to Facebook, Tim is solely responsible for what he sees in his feed. Stacy has been looking at the discourse used by users and platforms to ask, “how do the platforms matter to the users?” When users commented on these videos, Facebook often posted official comments.
One user said: “This leads me to believe I have control over my own feed. I don’t. FB is constantly making things disappear and rearranging the timeline.”
The company’s response changes depending on the type of question asked. For example: “Why do I keep getting old posts?” “Well, people are posting a lot on it, so it resurfaces.” Facebook uses linguistic gymnastics to avoid saying “we’re doing this.”
Stacy is at the very beginning stages of this project, and hopes to carry out the following kinds of analysis:
- How do users discuss the news feed algorithm?
- How does Facebook position the news feed to these users? In particular, where does it place responsibility?
- How do users talk about the news feed to each other?
What Does It Mean to Hold Crowds Accountable To The Public?
Nathan Matias is a PhD Candidate at the MIT Center for Civic Media/MIT Media Lab, and a Berkman fellow, where he designs and researches civic technologies for cooperation and expression.
I was onstage at this point, but here’s a basic summary of the talk. After posing the question “How do we hold crowds accountable to the public?” I described common mechanisms that we imagine as forms of accountability: pressure campaigns, boycotts, elections, legislation, etc. I then described three cases where these mechanisms seemed unable to address forms of collective power we see online:
In the case of Peer Production, people sometimes petition Jimmy Wales, somehow believing that he has the power to change things. Other times, op-ed writers make public appeals to “Wikipedia” or “Wikipedians” to address some systemic problem. I described my work with Sophie Diehl on Passing On, a system that uses infographics to appeal to public disappointment and then channels that disappointment into productive change on Wikipedia (more in this article).
.@natematias: gender imbalance in nytimes and Wikipedia. How do people hold platforms accountable? #berkman pic.twitter.com/TFDdvz7EPK
— Berkman Center (@berkmancenter) June 16, 2015
In the case of Social Networks, we sometimes criticize companies for things that are also partly attributable to whom we accept as friends or what we personally choose. This debate is especially strong in discussions of information diversity. I shared an example from Facebook’s recent study on exposure to diverse information, outlining their attempt to differentiate between available media sources, friend recommendations, personal choices, and the NewsFeed algorithm. I also described my work with Sarah Szalavitz on FollowBias, a system for measuring and addressing these more social influences.
Finally, I described work on distributed decision-making, namely the decisions of digital laborers who do the work of content moderation online. I described my recent collaboration on a research project examining the process of reviewing, reporting, and responding to harassment online. I also described upcoming work to study the work of moderators on Reddit.
How Do Gaming Communities Make Sense of Their Personal Place and Role in Massive Worlds?
What is it like to be an EVE Online player? Aleena opens up by showing us videos of massive space battles in this massively multiplayer online game.
Aleena Chia is a Ph.D. Candidate in Communication and Culture at Indiana University currently interning at Microsoft Research, where she investigates the affective politics and moral economics of participatory culture, in the context of digital and live-action game worlds.
Aleena’s research on consumer culture tends to focus on gaming activities. How do gamers make sense of their role in these massive worlds? Her argument is that players make sense of their experience through the alignment of spectacle, alcohol, and experience at brand fests with the gameplay experience itself. They feel that they’re truly a part of something larger than themselves.
How do they make sense of the hours they spend on this? They spend hours and hours each week building up empires. This experience is made sensible through reward and reputation systems, sometimes designed by the companies and sometimes by the communities themselves. How do players make sense of the time they’ve invested in their identities as gamers, in EVE but also beyond? They make sense of this through conversations about work-life balance, as well as through the recognition by others that their work has cultural, economic, and social value.
At the heart of this are compensatory drives: using things to add up and even out, to get what’s coming to them. (Re)compense is connected to an idea of balance, a moral equilibrium. These “compensatory forces” give people a connection to the intangible world, a sense of fairness and justice, and a sense of aesthetic, economic, and social legitimacy.
EVE Online is a hypercapitalist world with no governments — warfare, murder, and theft are sanctioned if you can get away with it. But there is also a democratically elected player council, consultants to the game company, who talk to developers and the company. The savage world is managed through civilized processes.
Can player representatives effectively consult with the company? Players have very micro concerns, while developers often have macro-level concerns for all the players. Within these systems, there are some mechanisms of accountability: council members who don’t do well won’t get re-elected. Players often complain to them on forums, over email, and at meet-ups. But the representatives also don’t have much power.
To understand this, Aleena will be looking at minutes from meetings between the council members and the company, as well as between the council members and the player base. She’ll also be looking at town hall meeting logs, election campaign materials, and responses to them. She’ll be asking: how do the council and the company see their roles in relationship to each other? She’ll also look at how players learn to be council members over their terms of office by examining meeting minutes. Finally, Aleena will be mapping feedback channels, mechanisms, directions, and ruptures, both formal and informal. Feedback doesn’t just run up the chain from players to developers through the consultants; it also runs down. Consultants have a job (either implicit or explicit) to advocate for the company and “spread goodwill to the masses.”
The election of player councils is one example of a democratic process between audiences and brands (perhaps related to reality TV shows with audience feedback; now we have tribunals). Is this market populism (neoliberalism at work, a replacement of authentic democratic engagement)? Might it instead be consumer co-creation (customer relations commoditized into a pleasurable and branded experience, aimed not just at making the product better, but at improving your experience as a consumer)? Lastly, designers often say that users don’t know what they want, a stance that discounts popular will.
Finally Aleena is asking, “how are these democratic mechanisms changing the means and meanings of consumption?”
Questions, Answers, Comments
Ethan: How do crowds develop, or how are they simply different from users or patients or…? Wikipedia has a crowd, but how do you distinguish it from other groups?
- Nathan: You’ve likely thought of this in your dispute resolution research. We might think of individuals or institutions. I use “crowd” as a placeholder for something where we don’t quite know where to apply the lever to change things: the cumulative effect of the social choices and friendships we have in a network. Or it might be something more identifiable.
Rebecca: What is your model of the genuine civic engagement which neoliberalism has supplanted?
- Aleena: My utopia is a participatory democracy. But my focus is not official political systems; rather, how can the media open up space for the public to participate? Engagement with the media via certain mechanisms creates real decision-making power over the system and the content.
Tarleton: You each ask how people think about their role in relation to a community. But the world is meant to be something: EVE is clear about this, as is Wikipedia, and Facebook is sort of getting there. Is this not just a governance problem, but the narrative claim of the institution masking or distorting the style of engagement?
- Stacy: My project set against Nathan’s shows two different aspects of the same problem. People treat “the algorithm” as a single thing to be tweaked to fix everything, but that is not something I know. Transparency is seen as severely lacking. How does Facebook present itself in order to shape that reality? It says “we” in publicity and “you” in user interactions, depending on the audience it’s speaking to.
- Nathan: I draw inspiration from Hochschild’s research on airline attendants. There was a clear corporate brand identity concern influencing how airline attendants were trained to respond to tough situations. Training wasn’t just about what to do, but how to be. Like Hochschild, I’m also looking at the process of learning to be a worker. There are job boards on Reddit where people apply and chat. Reddit has basic rules overall, but each /r also has special rules. I’m looking at how moderators view their roles in their /r as well as at Reddit.com.
- Aleena: The council works between the corporation and users: classic customer feedback. Filter it, see what makes sense, incorporate it. But the company also wants to persuade users that everyone is on the same page, with no “us vs. them.” It’s not just about the bottom line; the company wants there to be engagement. EVE doesn’t just want you to be happy. They want you to strive and have troubles.
- Ifeoma: Is the governance of wellness programs actually voluntary? People aren’t voting on the shape the wellness program will take, only on the fact that it will exist. It’s about shifting responsibility onto the individual worker, with no real discussion of whether the work infrastructure could be shaped to achieve the same thing. The corporation abdicates its responsibility for a healthier workplace, putting it on the worker. We’re worried about what that means, given the structural constraints inhibiting workers. What will the new workplace discrimination be? It’s perfectly okay for your employer to fire you for being a smoker outside the workplace. There is a level of coercion: up to 30% or 50% of program costs can be covered for smoking cessation.
Mary Gray: Across these talks, there has been an implicit appeal to a social need or desire that operates outside of market demands. The market demands that corporations keep players playing, keep the newsfeed functioning in a certain way, and so on, which may be external to what the players/users want. Why are we seeking corporate good? What sends us to the corporation to fix these things, seeing it as the path of recourse?
- Stacy: There are inherent expectations from users that Facebook be “truthful.” Christian Sandvig did work showing that users feel confused and lied to: “I thought that person just didn’t like me.” Users rely on FB to maintain social connections. When those assumptions aren’t met, there is anger, with comments of “why are you doing this to me?” Other people say, “I have to be friends with someone because of business or whatever, but I don’t want to see their posts.” Gatekeeping is not new, but we don’t know how Facebook is doing this process.
- Aleena: Players look to EVE for social justice because the company thinks it makes good sense. There’s a difference between taking votes at face value and adapting to mass player will. The company does have to come up with something new, even if it’s not the thing that was asked for.
- Nathan: Wikipedia and Reddit are counterexamples of sorts to EVE or Facebook, in that participants and active contributors may feel that the platform is a public good. Wikimedia, as a nonprofit, is funded by donations, has elected board members, and can be thought of as accountable to its participants. But when people who are not contributors are affected by its power, they may take traditional routes (such as petitioning Jimmy Wales). Reddit is more complex. When Reddit started banning subreddits for specific behavior, instead of following its traditional hands-off model, this question of interests started to crack open. Advertising and Reddit Gold are perhaps competing income models: when a fundraising goal is met, they buy a new server. But Reddit the company is also starting to take more top-down responsibility for what its users do, which makes it look more like other corporations.
- Ifeoma: Relinquishing the rights of social good to corporations has to do with complex problems and simple solutions. Wellness programs try to address unhealthy American lifestyles (sitting, eating, smoking, not working out), a problem that is complex both in lifestyle and in infrastructure. Trying to fix it with something as simple as a wellness program won’t have the intended results, and it has unintended results (discrimination especially). Laws don’t cover obesity or smoking, which are stigmatized, so these programs encroach on the rights of the worker.
Nick Seaver: What is happening to audiences, citizens, workforces, etc. as different publics? Are you helping to define what each of those things means? My own work has been undermined because I didn’t define that.
- Aleena: What is the value of comparing it to things which have come before? Something IS different in connected space. It’s not just how democracy and society are changing, but how the meanings of consumption are changing. Even in video games, you can do so many more things: you’re supposed to buy things and make friends and so on. “What kind of player are you?” Identities are tied to this. How does this jigsaw piece connect to the rest of it?
- Ifeoma: Historical context is really important, especially in defining what a worker is. A defining factor is the technology available. Workplace surveillance isn’t new. What is new are the advances in technology, letting companies surveil and track the worker in ways that weren’t previously possible, collapsing the line between work and non-work. One woman was fired (and sued) over deleting an app on her phone which couldn’t be turned off and was tracking her (how fast she was driving, where she was over the weekend, etc.). These are unforeseen issues; we need to redefine what it means to be a worker.
- Nathan: Kevin Driscoll and I have this debate about BBSs. We talk about them with a set goal and solid definitions. People do the same for Twitter and Reddit, etc. And yet, in moderation work, there are common experiences, expectations, and tools. These moderators have to figure out how to work at the intersection between their communities and what a company with fewer than 100 employees defines as the space of their work. Postigo explored this somewhat in his research on AOL community leaders: the more AOL did to control, track, and standardize community leaders’ work, the more those folks thought of themselves as a collective of unpaid workers. So I’m still looking for the language and theories to describe this.
- Stacy: I’m interested in the idea of When Old Technologies Were New: how people interacted with tech when things were new. The more we do with research, the more we realize nothing is new. In that sense, my dissertation isn’t just about Facebook, but about algorithms at large. Looking at society from a larger view: how do we understand the mediations happening?