Architecture or Minecraft?

(or, “Hopefully 4chan Won’t Hear About This Contest”)

The social-media-ification of everything continues. If you’ve got time for some late-summer procrastination, thanks to the Internet you can choose the design of my house.

As you may have read here two weeks ago, I’m crowdsourcing it. The first competition is over and I received 16 entries — above average for the site. That means anyone on the Internet can now help pick a winner. I’d say there are some great designs and many awful ones.

My needs attracted designers from Nigeria, Bulgaria, Ukraine, Romania, Vietnam, Mexico, and Indonesia. But also London, Texas, and my very own town of Ann Arbor, Michigan. Submissions are anonymous, but Arcbazar maps their self-reported locations:
arcbazar map.png

Anyone can submit — no credentials required. So far I don’t think it’s “the worst thing to happen to architecture since the Internet started,” but there’s still plenty of time for this to go sideways on me. The next step is voting.

In Ayn Rand’s The Fountainhead, the young architect Howard Roark says, “I don’t intend to build in order to have clients. I intend to have clients in order to build.” Like Rand’s protagonist, I think some of my designers refused to compromise their unique vision. To give you the flavor, here are some comments my friends made about the low points:

“This house looks like the head of a Minecraft pig”:



For reference:



We asked for a barn-like building with a gambrel roof. That was a requirement. (“Gambrel” is a word I had to look up just to write the requirement.) Google says:

gambrel roof

I think some of the designers really struggled with it! A friend said: “It looks like this building fell down and broke its spine.”

broken roof


“This appears to be a car dealership”:

arcbazar concept


You can help choose the winner here: (You need to sign up for a free login.)

There are two separate things to do at this link — voting and commenting. Anyone with an arcbazar login can vote: it’s a numerical rating in five categories.

To vote, click “Vote now!” while viewing a particular entry. Votes affect the rankings.

arcbazar vote now link


To comment and to read other people’s comments, click the word “Evaluations” when you are looking at a particular entry. You need a Facebook login to add a comment.

arcbazar evaluations link


Stay tuned. More updates here as the process unfolds.


I crowdsourced the design of my house

(or, “The Social-Media-ification of Everything”)

The architecture crowdsourcing Web site Arcbazar has been called “The Worst Thing to Happen To Architecture Since the Internet Started.” The site also got some press recently by running a parallel, unauthorized architecture competition for the “people’s choice” for the design of the Obama Presidential Library.

arcbazar screen shot home page
The arcbazar welcome page. (click to enlarge)

I’ve decided to use arcbazar to run two architectural competitions for my house. My competitions started yesterday (links below), in case you want to see this play out in real time.

Most of the attention given to arcbazar has been about labor, safety, and value, and discussion has centered on possible changes to the profession of architecture. Does it lower standards? Will it put architecture jobs and credentials in jeopardy?

Yet as a social media researcher the part of arcbazar that has my attention is what I would call the “social media-ification of everything.”

Anyone with a free arcbazar account can submit a design or act as a juror for submitted designs, and as the Web site has evolved it has added features that evoke popular social media platforms. Non-architects are asked to vote on designs, and the competitions use familiar social media features and metaphors like a competition “wall.”

Here are my competitions. You need a free account to look at them.

This means YOU could design my house, so please choose wisely. (One friend said: “You realize your house is going to be renamed Housey McHouseFace.”) Keep your fingers crossed for me that this works out well. Some of the submitted designs for past competitions are a little… odd…

obama building shaped like obamas name
Who wouldn’t want a house in the shape of their own name? (click to enlarge)

Three flawed assumptions the Daily Beast made about dating apps

Image from @Cernovich

Last week, the Daily Beast published an article by one of its editors who set out to report on how dating apps were facilitating sexual encounters in Rio’s Olympic Village. Instead, his story focused mainly on athletes using Grindr, an app for men seeking men, and included enough personal information about individuals to identify and out them. After the article was criticized as dangerous and unethical across media outlets and social media, the Daily Beast replaced it with an apology. However, decisions to publish articles like this are made based on assumptions about who uses dating apps and how people share information on them. These assumptions are visible not only in how journalists act but also in the approaches that researchers and app companies take when it comes to users’ personal data. Ethical breaches like the one made by the Daily Beast will continue unless we address the following three (erroneous) assumptions:

Assumption 1. Data on dating apps is shareable like a tweet or a Facebook post

 Since dating apps are a hybrid between dating websites of the past and today’s social media, there is an assumption that the information users generate on dating apps should be shared. Zizi Papacharissi and Paige Gibson[1] have written about ‘shareability’ as the built-in way that social network sites encourage sharing and discourage withholding information. This is evident within platforms like Facebook and Twitter, through ‘share’ and ‘retweet’ buttons, as well as across the web as social media posts are formatted to be easily embedded in news articles and blog posts.

Dating apps provide many spaces for generating content, such as user profiles, and some app architectures are increasingly including features geared toward shareability. Tinder, for example, provides users with the option of creating a ‘web profile’ with a distinct URL that anyone can view without even logging into the app. While users determine whether or not to share their web profiles, Tinder also recently experimented with a “share” button allowing users to send a link to another person’s profile by text message or email. This creates a platform-supported means of sharing profiles to individuals who may never have encountered them otherwise.

The problem with dating apps adopting social media’s tendency toward sharing is that dating environments construct particular spaces for the exchange of intimate information. Dating websites have always required a login and password to access their services. Dating apps are no different in this sense – regardless of whether users log in through Facebook authentication or create a new account, dating apps require users to be members. This creates a shared understanding of the boundaries of the app and the information shared within it. Everyone is implicated in the same situation: on a dating app, potentially looking for sexual or romantic encounters. A similar boundary exists for me when I go to the gay bar; everyone I encounter is also in the same space, so the information of my whereabouts is equally implicating for them. However, a user hitting ‘share’ on someone’s Tinder profile and sending it to a colleague, family member, or acquaintance removes that information from the boundaries within which it was consensually provided. A journalist joining a dating app to siphon users’ information for a racy article flat-out ignores these boundaries.

Assumption 2. Personal information on dating apps is readily available and therefore can be publicized

 When the Daily Beast’s editor logged into Grindr and saw a grid full of Olympic athletes’ profiles, he likely assumed that if this information was available with a few taps of his screen then it could also be publicized without a problem. Many arguments about data ethics get stuck debating whether information shared on social media and apps is public or private. In actuality, users place their information in a particular context with a specific audience in mind. The violation of privacy occurs when another party re-contextualizes this information by placing it in front of a different audience.

Although scholars have pointed out that re-contextualization of personal information is a violation of privacy, this remains a common occurrence even across academia. We were reminded of this last May when 70,000 OkCupid users’ data was released without permission by researchers in Denmark. Annette Markham’s post on the SMC blog pointed out that “the expectation of privacy about one’s profile information comes into play when certain information is registered and becomes meaningful for others.” This builds on Helen Nissenbaum’s[2] notion of “privacy in context” meaning that people assume the information they share online will be seen by others in a specific context. Despite the growing body of research confirming that this is exactly how users view and manage their personal information, I have come across many instances where researchers have re-published screenshots of user profiles from dating apps without permission. These screenshots are featured in presentations, blog posts, and theses with identifying details that violate individuals’ privacy by re-contextualizing their personal information for an audience outside the app. As an academic community, we need to identify this as an unethical practice that is potentially damaging to research subjects.

Dating app companies also perpetuate the assumption that user information can be shared across contexts through their design choices. Recently, Tinder launched a new feature in the US called Tinder Social, which allows users to join with friends and swipe on others to arrange group hangouts. Since users team up with their Facebook friends, activating this feature lets you see everyone else on your Facebook account who is also on Tinder with this feature turned on. While Tinder Social requires users to ‘unlock’ its functionality from their Settings screen, its test version in Australia automatically opted users in. When Australian users updated their app, this collapsed a boundary between the two platforms that previously kept the range of family, friends, and acquaintances accumulated on Facebook far, far away from users’ dating lives. While Tinder seems to have learned from the public outcry about this privacy violation, the company’s choice to overlap Facebook and Tinder audiences disregards how important solid boundaries between social contexts can be for certain users.

Assumption 3. Sexuality is no big deal these days

 At the crux of the Daily Beast article was the assumption that it was okay to share potentially identifying details about people’s sexuality. As others have pointed out, just because same-sex marriage and other rights have been won by lesbian, bisexual, gay, trans, and queer (LGBTQ) people in some countries, many cultures, religions, and political and social groups remain extremely homophobic. Re-contextualization of intimate and sexual details shared within the boundaries of a dating app not only constitutes a violation of privacy, it could expose people to discrimination, abuse, and violence.

In my research with LGBTQ young people, I’ve learned that a lot of them are very skilled at placing information about their sexuality where they want it to be seen and keeping it absent from spaces where it may cause them harm. For my master’s thesis, I interviewed university students about their choices of whether or not to come out on Facebook. Many of them were out to a certain degree, posting about pro-LGBTQ political views and displaying their relationships in ways that resonated with friendly audiences but eluded potentially homophobic audiences like coworkers or older adults.

In my PhD, I’ve focused on how same-sex attracted women manage their self-representations across social media. Their practices are not clear-cut since different social media spaces mean different things to users. One interviewee talked about posting selfies with her partner to Facebook for friends and family but not to Instagram where she’s trying to build a network of work and church-related acquaintances. Another woman spoke about cross-posting Vines to friendly LGBTQ audiences on Tumblr but keeping them off of Instagram and Facebook where her acquaintances were likely to pick fights over political issues. Many women talked about frequently receiving negative, discriminatory, and even threatening homophobic messages despite these strategies, highlighting just how important it was for them to be able to curate their self-representations. This once again defies the tendency to designate some sites or pieces of information as ‘public’ and others as ‘private.’ We need to follow users’ lead by respecting the context in which they’ve placed personal information based on their informed judgments about audiences.

Journalists, researchers, and app companies frequently make decisions based on assumptions about dating apps. They assume that since the apps structurally resemble other social media then it’s permissible to carry out similar practices tending toward sharing user-generated information. This goes hand-in-hand with the assumption that if user data is readily available, it can be re-contextualized for other purposes. On dating apps, this assumes (at best) that user data about sexuality will be received neutrally across contexts and at its worst, this data is used without regard for the harm it may cause. There is ample evidence that none of these assumptions hold true when we look at how people create bounded spaces for exchanging intimate information, how users manage their personal information in particular contexts, and how LGBTQ people deal with enduring homophobia and discrimination. While the Daily Beast should not have re-contextualized dating app users’ identifying information in its article, this instance provides an opportunity to dispel these assumptions and change how we design, research, and report about dating apps in order to treat users’ information more ethically.


 [1] Papacharissi, Z., & Gibson, P. L. (2011). Fifteen minutes of privacy: Privacy, sociality and publicity on social network sites. In S. Trepte & L. Reinecke (Eds.), Privacy Online (pp. 75–89). Berlin: Springer.

[2] Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford, CA: Stanford University Press.

How machine learning can amplify or remove gender stereotypes

TLDR: It’s easier to remove gender biases from machine learning algorithms than from people.

In a recent paper, Saligrama, Bolukbasi, Chang, Zou, and I stumbled across some good and bad news about Word Embeddings. Word Embeddings are a wildly popular tool of the trade among AI researchers. They can be used to solve analogy puzzles. For instance, for man:king :: woman:x, AI researchers celebrate when the computer outputs x = queen (normal people are surprised that such a seemingly trivial puzzle could challenge a computer). Inspired by our social scientist colleagues (esp. Nancy Baym, Tarleton Gillespie and Mary Gray), we dug a little deeper and wrote a short program that found the “best” he:x :: she:y analogies, where best is determined according to the embedding of common words and phrases in the most popular publicly available Word Embedding (trained using word2vec on 100 billion words from Google News articles).

The program output a mixture of x-y pairs ranging from the definitional, like brother-sister (i.e., he is to brother as she is to sister), to the stereotypical, like blue-pink or guitarist-vocalist, to the blatantly sexist, like surgeon-nurse, computer programmer-homemaker, and brilliant-lovely. There were also some humorous ones, like he is to kidney stone as she is to pregnancy, sausages-buns, and WTF-OMG. For more analogies and an explanation of the geometry behind them, read on or see our paper, Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings.
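As a sketch of how such analogy puzzles are solved (the tiny three-dimensional vectors below are toy assumptions of mine; the real embedding has hundreds of dimensions and millions of words), the puzzle a:b :: c:x is answered by finding the vocabulary word whose vector is closest to b - a + c:

```python
import numpy as np

# Toy word vectors. A real embedding (e.g., word2vec trained on Google News)
# would have ~300 dimensions and millions of words.
vocab = {
    "man":   np.array([1.0, 0.0, 0.1]),
    "woman": np.array([-1.0, 0.0, 0.1]),
    "king":  np.array([1.0, 1.0, 0.2]),
    "queen": np.array([-1.0, 1.0, 0.2]),
}

def solve_analogy(a, b, c):
    """Return the word x such that a:b :: c:x, via the vector b - a + c."""
    target = vocab[b] - vocab[a] + vocab[c]
    best, best_sim = None, -np.inf
    for word, vec in vocab.items():
        if word in (a, b, c):  # standard practice: exclude the query words
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(solve_analogy("man", "king", "woman"))  # prints: queen
```

The same arithmetic that recovers queen here is what surfaces surgeon-nurse and programmer-homemaker in the real embedding.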

Bad news: the straightforward application of Word Embeddings can inadvertently *amplify* biases. These Word Embeddings are being used in a growing number of applications. Among the countless papers that discuss Word Embeddings for use in searching the web, processing resumes, chatbots, etc., hundreds mention the king-queen analogy, while none notice the blatant sexism also present.

Say someone searches for computer programmer. A nice paper has shown how to improve search results using the knowledge in Word Embeddings that the term computer programmer is related to terms like javascript. Using this, search results containing these related terms can bubble up, and it was shown that the average results of such a system are statistically more relevant to the query. However, it also happens that the name John has a stronger association with programmer than the name Mary does. This means that, between two identical results differing only in the names John/Mary, John’s would be ranked first. This would *amplify* the statistical bias that most programmers are male by moving the few female programmers even lower in the search results.
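To make the ranking effect concrete, here is a toy sketch (the two-dimensional vectors and the cosine-similarity reranker are illustrative assumptions of mine, not the system from the paper cited above):

```python
import numpy as np

# Toy vectors in which 'John' sits closer to 'programmer' than 'Mary' does,
# mimicking the statistical bias an embedding absorbs from news text.
emb = {
    "programmer": np.array([1.0, 1.0]),
    "John": np.array([0.9, 0.8]),
    "Mary": np.array([0.1, 0.9]),
}

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Two otherwise-identical search results, differing only in the name.
# Reranking by embedding similarity to the query decides between them:
results = ["John", "Mary"]
ranked = sorted(results, key=lambda name: -cosine(emb[name], emb["programmer"]))
print(ranked)  # ['John', 'Mary']: the name alone decides the order
```

Nothing in the reranker mentions gender, yet the gendered association baked into the vectors determines the outcome.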

Now you might think that we could solve this problem by simply removing names from embeddings – but there are still subtle indirect biases: the term computer programmer is also closer to baseball than to gymnastics, so as you can imagine, removing names wouldn’t entirely solve the problem.

Good news: biases can easily be reduced or removed from word embeddings. At the touch of a button, we can remove all gender associations between professions, names, and sports in a word embedding. In fact, the word embedding itself captures these concepts, so you only have to give a few examples of the kinds of associations you want to keep and the kinds you want to remove, and the machine learning algorithms do the rest. Think about how much easier this is for a computer than for a human. Both men and women have been shown to hold implicit gender associations. And the Word Embeddings also surface shocking gender associations implicit in the text on which they were trained.
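A minimal sketch of the “neutralize” step behind this (the toy two-dimensional vectors and the single she-he pair are my simplifying assumptions; the method in our paper estimates a gender subspace from several definitional pairs):

```python
import numpy as np

def gender_direction(pairs, emb):
    """Estimate a unit gender direction from definitional pairs like (she, he)."""
    diffs = [emb[f] - emb[m] for f, m in pairs]
    g = np.mean(diffs, axis=0)
    return g / np.linalg.norm(g)

def neutralize(vec, g):
    """Remove the component of vec that lies along the gender direction g."""
    return vec - (vec @ g) * g

# Toy embedding: 'programmer' leans toward 'he' before debiasing.
emb = {
    "he":  np.array([1.0, 0.0]),
    "she": np.array([-1.0, 0.0]),
    "programmer": np.array([0.6, 0.8]),
}
g = gender_direction([("she", "he")], emb)
debiased = neutralize(emb["programmer"], g)
# After neutralizing, 'programmer' is orthogonal to the gender direction,
# i.e., equally associated with 'he' and 'she'.
```

Words you want to keep gendered (like brother and sister) are simply left out of the neutralize step, which is why a few examples of each kind suffice.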

People can try to ignore these associations when doing things like evaluating candidates for hiring, but it is a constant uphill battle. A computer, on the other hand, can be programmed once to remove associations between different sets of words, and it will carry on with its work effortlessly. Of course, we machine learning researchers still need to be careful: depending on the application, biases can creep in other ways. I should also mention that we are providing tools that others can use to define, remove, negate, or even amplify biases as they choose for their applications.

As machine learning and AI become ever more ubiquitous, there have been growing public discussions about the social benefits and possible dangers of AI. Our research gives insight into a concrete example where a popular, unsupervised machine learning algorithm, trained over a large corpus of text, reflects and crystallizes the stereotypes in the data and in our society. Widespread adoption of such algorithms can greatly amplify such stereotypes, with damaging consequences. Our work highlights the importance of quantifying and understanding such biases in machine learning, and shows how machine learning algorithms may themselves be used to reduce bias.

Future work: This work focused on gender biases, specifically male-female biases, but we are now working on techniques for identifying and removing all sorts of biases such as racial biases from Word Embeddings.

Why I Am Suing the Government — Update

[This is an old post. SEE ALSO: the most recent blog post about this case.]

Last month I joined other social media researchers and the ACLU to file a lawsuit against the US Government to protect the legal right to conduct online research. This is newly relevant today because a community of devs interested in public policy started a petition in support of our court case. It is very nice of them to make this petition. Please consider signing it and sharing this link.

PETITION: Curiosity is (not) a crime

For more context, see last month’s post: Why I Am Suing the Government.