I recently published a pair of articles with Katie Shilton exploring how mobile app developers help each other learn what privacy means and how to build that abstract value into their software. Katie and I analyzed hundreds of forum conversations about privacy among iOS and Android developers, and compared the different development cultures and privacy values that arose around each platform.
Our first piece, in the Journal of Business Ethics, explores the work practices that trigger privacy conversations in different development ecosystems. We found that
The rules, regulations, and cultural norms that govern each ecosystem impact day-to-day work practices for mobile developers. These differing work practices in turn shape the ethical deliberations engaged in by forum participants, addressing the question of why privacy is debated—and ultimately designed for—so differently between the two ecosystems.
Ultimately, even though the prompts for ethical debates among developers differed widely between iOS and Android, the tactics they used to convince each other that privacy was an important problem for design were remarkably similar. This means that Apple and Google are important privacy regulators, because the way they structure their development environments influences when and how app developers think about privacy.
Our second piece, in New Media & Society, is more interested in exactly that question: how Google and Apple regulate the thousands of app developers who do not work for them, and how platform values become developer values. Platforms need apps to attract users, but developers have to play by platforms’ rules and learn their values in order to get to market. Our close study of developer cultures helped answer an important question in privacy research, namely: Why do Android apps leak so much more consumer data than iOS apps? We show that Apple’s close but, to developers, somewhat capricious regulation of submitted apps leads to an ‘invisible fence’ effect, in which developers constantly work together to figure out what Apple means by ‘privacy’ so that they can get their apps approved and into the market. In contrast, Android’s relatively open and lax development environment leads to a Wild West atmosphere where anything goes, and where developers collaborate with highly skilled users to build defensive measures against perceived threats—including Google. While average users might be left out of this digital arms race, that’s more of a feature than a bug for Android app developers:
For devs and skilled hobbyists, Android enabled access to privacy-enhancing applications, limited only by skill and literacy. The charms—but also the underlying inaccessibility—of Android privacy were summed up by mavenz in a 2013 thread about privacy problems in the Facebook app: “Whatevs. This is why i <3 android. I can just hack something better.”
As researchers interested in ethical challenges in new media environments, we also reflected on the ethics of researching public forums, and we hope our work helps move that important methodological conversation forward as well.