The Facebook “It’s Not Our Fault” Study

Today in Science, members of the Facebook data science team released a provocative study about adult Facebook users in the US “who volunteer their ideological affiliation in their profile.” The study “quantified the extent to which individuals encounter comparatively more or less diverse” hard news “while interacting via Facebook’s algorithmically ranked News Feed.”*

  • The research found that a user’s click rate on hard news is affected by where the filtering algorithm positions the content on the page. The same link placed at the top of the feed is about 10-15% more likely to get a click than a link at position #40 (figure S5).
  • The Facebook news feed curation algorithm, “based on many factors,” removes hard news from diverse sources that you are less likely to agree with, but it does not remove the hard news that you are likely to agree with (S7). They call news from a source you are less likely to agree with “cross-cutting.”*
  • The study then found that the algorithm filters out 1 in 20 (5%) of the cross-cutting hard news stories that a self-identified conservative would otherwise see, and 1 in 13 (8%) of those that a self-identified liberal would otherwise see.
  • Finally, the research showed that “individuals’ choices about what to consume” further limit their “exposure to cross-cutting content.” Conservatives will click on only a little less than 30% [corrected from 17%] of cross-cutting hard news, while liberals will click a little more than 20% [corrected from 7%] (figure 3).

My interpretation in three sentences:

  1. We would expect that people who are given the choice of what news they want to read will select sources they tend to agree with: more choice leads to more selectivity and polarization in news sources.
  2. Increasing political polarization is normatively a bad thing.
  3. Selectivity and polarization are happening on Facebook, and the news feed curation algorithm acts to modestly accelerate selectivity and polarization.

I think this should not be hugely surprising. After all, what else would a good filter algorithm be doing other than filtering for what it thinks you will like?

But what’s really provocative about this research is the unusual framing. This may go down in history as the “it’s not our fault” study.

Facebook: It’s not our fault.

I carefully wrote the above based on my interpretation of the results. Now that I’ve got that off my chest, let me tell you how the Facebook data science team interprets these results. To start, my assumption was that news polarization is bad. But the end of the Facebook study says:

“we do not pass judgment on the normative value of cross-cutting exposure”

This is strange, because there is a wide consensus that exposure to diverse news sources is foundational to democracy. Scholarly research about social media has–almost universally–expressed concern about the dangers of increasing selectivity and polarization. But it may be that you do not want to say that polarization is bad when you have just found that your own product increases it. (Modestly.)

And the sources cited just after this quote sure do say that exposure to diverse news sources is important. But the Facebook authors write:

“though normative scholars often argue that exposure to a diverse ‘marketplace of ideas’ is key to a healthy democracy (25), a number of studies find that exposure to cross-cutting viewpoints is associated with lower levels of political participation (22, 26, 27).”

So the authors present reduced exposure to diverse news as a “could be good, could be bad” but that’s just not fair. It’s just “bad.” There is no gang of political scientists arguing against exposure to diverse news sources.**

The Facebook study says its findings are important because:

“our work suggests that individuals are exposed to more cross-cutting discourse in social media than they would be under the digital reality envisioned by some.”

Why so defensive? If you look at what is cited here, this quote is saying that this study showed that Facebook is better than a speculative dystopian future.*** Yet the people referred to as “some” never provided the sort of point estimates that would allow specific comparisons. On the subject of comparisons, the study goes on to say that:

“we conclusively establish that…individual choices more than algorithms limit exposure to attitude-challenging content.”

and that:

“compared to algorithmic ranking, individuals’ choices about what to consume had a stronger effect”

Alarm bells are ringing for me. The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking. Probably while they are in the coal mine. What I mean to say is that there is no scenario in which “user choices” vs. “the algorithm” can be traded off, because they happen together (Fig. 3 [top]). Users select from what the algorithm already filtered for them. It is a sequence.**** I think the proper statement about these two things is that they’re both bad — they both increase polarization and selectivity. As I said above, the algorithm appears to modestly increase the selectivity of users.
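To make the sequence point concrete, here is a tiny sketch in Python (mine, not the study’s; the numbers are invented for illustration, loosely echoing the percentages above) of how the two stages compose rather than trade off:

    # Illustrative only: invented numbers, not the study's data.
    # Cross-cutting stories flow through two stages in sequence:
    # the ranking algorithm filters first, then the user chooses.
    cross_cutting_shared = 100   # hypothetical stories shared by friends
    algorithm_pass_rate = 0.92   # algorithm removes ~8% (the liberal figure above)
    user_click_rate = 0.20       # user clicks ~20% of what survives

    shown = cross_cutting_shared * algorithm_pass_rate   # stories that reach the feed
    clicked = shown * user_click_rate                    # stories actually read

    # The user only ever chooses from what the algorithm already passed,
    # so the two reductions multiply: they are stages of one pipeline, not
    # independent alternatives that can be traded off against each other.
    print(round(shown), round(clicked))   # 92 18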

The only reason I can think of that the study is framed this way is as a kind of alibi. Facebook is saying: It’s not our fault! You do it too!

Are we the 4%?

In my summary at the top of this post, I wrote that the study was about people “who volunteer their ideological affiliation in their profile.” But the study also describes itself by saying:

“we utilize a large, comprehensive dataset from Facebook.”

“we examined how 10.1 million U.S. Facebook users interact”

These statements may be factually correct, but I found them misleading. Reading quickly, I took them to mean that out of the at least 200 million Americans who have used Facebook, the researchers had selected a “large” sample that was representative of Facebook users, although not of the US population. The “limitations” section discusses the demographics of “Facebook’s users,” as would be the normal thing to do if they had sampled. There is no information about the selection procedure in the article itself.

Instead, after reading down into the appendices, I realized that “comprehensive” is used in the survey-research sense of “complete”: this was a non-probability, non-representative sample that started from everyone on the Facebook platform. But out of hundreds of millions of users, we end up with a study of 10.1 million, because users were excluded unless they met these four criteria:

  1. “18 or older”
  2. “log in at least 4/7 days per week”
  3. “have interacted with at least one link shared on Facebook that we classified as hard news”
  4. “self-report their ideological affiliation” in a way that was “interpretable”

That #4 is very significant. Who reports their ideological affiliation on their profile?

[Screenshot: the Facebook profile field inviting users to “add your political views”]

It turns out that only 9% of Facebook users do that. Of those who report an affiliation, only 46% reported it in a way that was “interpretable.” That means this is a study about the 4% of Facebook users unusual enough to want to tell people their political affiliation on the profile page. That is a rare behavior.
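For what it’s worth, the funnel arithmetic is easy to check (a back-of-the-envelope sketch in Python using the percentages just quoted):

    # Back-of-the-envelope: how the ~4% figure falls out of the stated shares.
    report_affiliation = 0.09   # share of users who list political views at all
    interpretable = 0.46        # share of those whose listing was "interpretable"

    eligible = report_affiliation * interpretable
    print(f"{eligible:.1%}")    # 4.1%, before the other three criteria cut further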

More important than the frequency, though, is the fact that this selection procedure confounds the findings. We would expect the small minority who publicly identify an interpretable political orientation to behave quite differently from the average person with respect to consuming ideological political news. The research claims just don’t stand up against the selection procedure.

But the study is at pains to argue that (italics mine):

“we conclusively establish that _on average in the context of Facebook_, individual choices more than algorithms limit exposure to attitude-challenging content.”

The italicized portion is incorrect because the appendices explain that this is actually a study of a specific, unusual group of Facebook users. The study is designed in such a way that the selection for inclusion in the study is related to the results. (“Conclusively” therefore also feels out of place.)

Algorithmium: A Natural Element?

Last year there was a tremendous controversy about Facebook’s manipulation of the news feed for research. In the fracas, one of the controversial study’s co-authors revealed that, based on the feedback received after the event, many people didn’t realize that the Facebook news feed was filtered at all. We also recently presented research with similar findings.

I mention this because when the study states it is about selection of content, who does the selecting is important. There is no sense in this study that a user who chooses something is fundamentally different from the algorithm hiding something from them. Yet in fact the filtering algorithm is driven by user choices (among other things), and users don’t understand the relationship their choices have to the outcome.

[Image: meme captioned “not sure if i hate facebook or everyone i know”]
In other words, the article’s strange comparison between “individuals’ choices” and “the algorithm” should be read as “things I choose to do” vs. the effect of “a process Facebook has designed without my knowledge or understanding.” Again, they can’t be compared in the way the article proposes because they aren’t equivalent.

I struggled with the framing of the article because the research talks about “the algorithm” as though it were an element of nature, or a naturally occurring process like convection or mitosis. There is also no sense that it changes over time or that it could be changed intentionally to support a different scenario.*****

Facebook is a private corporation with a terrible public relations problem. It is periodically rated one of the least popular companies in existence. It is currently facing serious government investigations into illegal practices in many countries, some of which stem from the manipulation of its news feed algorithm. In this context, I have to say that it doesn’t seem wise for these Facebook researchers to have spun these data so hard in this direction, which I would summarize as: the algorithm is less selective and less polarizing. Particularly when the research finding in their own study is actually that the Facebook algorithm is modestly more selective and more polarizing than living your life without it.

Update: (6pm Eastern)

Wow, if you think I was critical, have a look at these. It turns out I am the moderate one.

Eszter Hargittai from Northwestern posted on Crooked Timber that we should “stop being mesmerized by large numbers and go back to taking the fundamentals of social science seriously.” And (my favorite): “I thought Science was a serious peer-reviewed publication.”

Nathan Jurgenson from Maryland and Snapchat wrote on Cyborgology (“in a fury”) that Facebook is intentionally “evading” its own role in the production of the news feed. “Facebook cannot take its own role in news seriously.” He accuses the authors of using the “Big-N trick” to intentionally distract from methodological shortcomings. He tweeted that “we need to discuss how very poor corporate big data research gets fast tracked into being published.”

Zeynep Tufekci from UNC wrote on Medium that “I cannot remember a worse apples to oranges comparison” and that the key take-away from the study is actually the ordering effects of the algorithm (which I did not address in this post). “Newsfeed placement is a profoundly powerful gatekeeper for click-through rates.”

Update: (5/10)

A comment helpfully pointed out that I used the wrong percentages in my fourth point when summarizing the piece. Fixed it, with changes marked.

Update: (5/15)

It’s now one week since the Science study. This post has now been cited/linked in The New York Times, Fortune, Time, Wired, Ars Technica, Fast Company, Engadget, and maybe even a few more. I am still getting emails. The conversation has fixated on the <4% sample, often saying something like: “So, Facebook said this was a study about cars, but it was actually only about blue cars.” That’s fine, but the other point in my post is about what is being claimed at all, no matter the sample.

I thought my “coal mine” metaphor about the algorithm would work, but it has not always worked. So I’ve clamped my webcam to my desk lamp and recorded a four-minute video to explain it again, this time with a drawing.******

If the coal mine metaphor failed me, what would be a better metaphor? I’m not sure. Suggestions?

Notes:

* Diversity in hard news, in their study, would be a self-identified liberal who receives a story from FoxNews.com, or a self-identified conservative who receives one from HuffingtonPost.com, where the stories are about “national news, politics, [or] world affairs.” In more precise terms, “cross-cutting content” was defined for each user as stories that are more likely to be shared by partisans who do not share that user’s self-identified ideological affiliation.

** I don’t want to make this even more nitpicky, so I’ll put this in a footnote. The paper’s citations to Mutz and Huckfeldt et al. to mean that “exposure to cross-cutting viewpoints is associated with lower levels of political participation” is just bizarre. I hope it is a typo. These authors don’t advocate against exposure to cross-cutting viewpoints.

*** Perhaps this could be a new Facebook motto used in advertising: “Facebook: Better than one speculative dystopian future!”

**** In fact, algorithm and user form a coupled system of at least two feedback loops. But that’s not helpful to measure “amount” in the way the study wants to, so I’ll just tuck it away down here.
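(For the curious, here is a toy simulation of those loops in Python; the dynamics are entirely invented, for intuition only.)

    # Toy model of the two feedback loops (invented dynamics, illustration only).
    # Loop 1: clicks inform the algorithm's estimate of what the user likes.
    # Loop 2: that estimate decides what is shown, which bounds future clicks.
    estimate = 0.5                    # belief that the user likes cross-cutting news
    for _ in range(10):
        shown = estimate              # ranking follows the current estimate
        clicked = shown * 0.4         # user clicks a fraction of what is shown
        estimate = 0.9 * estimate + 0.1 * clicked   # estimate drifts toward clicks
    print(round(estimate, 2))         # 0.27: the coupled loops reinforce each other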

***** Facebook is behind the algorithm, but they are trying to publish peer-reviewed research about it without disclosing how the algorithm works, even though the algorithm is a key part of the study. There is also no way to reproduce the research (or do a second study on a primary phenomenon under study, the algorithm) without access to the Facebook platform.

****** In this video, I intentionally conflate (1) the number of posts filtered and (2) the magnitude of the bias of the filtering. I did so because the difficulty with the comparison works the same way for both, and I was trying to make the example simpler. Thanks to Cedric Langbort for pointing out that “baseline error” is the clearest way of explaining this.

(This was cross-posted to multicast and Wired.)

85 thoughts on “The Facebook ‘It’s Not Our Fault’ Study”

  1. Christian, perhaps longer comments later, but I want to reply to “The paper’s citations to Mutz and Huckfeldt et al. to mean that ‘exposure to cross-cutting viewpoints is associated with lower levels of political participation’ is just bizarre.” This is neither a typo nor bizarre; it is accurate.

    It turns out that exposure to (and consideration of) diverse ideas can make people less likely to participate in the political process through giving or voting. I don’t recall all of the proposed mechanisms, but two include: (a) acting doesn’t matter, both sides are fine or (b) it’s just too hard to decide.

    One common question on my own work about increasing exposure to diverse views has been “well, might this just make people apathetic?” I think it’s a possibility — it does seem that energizing people politically is at least sometimes at odds with presenting them with diverse, considered viewpoints.

    + citation(s) when I’m not on the bus.

    1. Christian Sandvig

      This footnote was meant to be about the citations. I wrote that I took “the paper’s citations to Mutz and Huckfeldt et al. to mean that ‘exposure to cross-cutting viewpoints is associated with lower levels of political participation’ [as] just bizarre.” Check out those articles, then tell me that you agree that they found that exposure to cross-cutting viewpoints is associated with lower levels of political participation. You won’t agree, I bet.

      1. Ah, sorry – I misunderstood your criticism in the footnote. There are hints of potential negatives of cross-cutting exposure in both works — Mutz acknowledges that increasing cross-cutting exposure may require tradeoffs with other values; Huckfeldt found less interest in the campaign but no effect on turnout — but, yeah, that is maybe not the sentence + citations I would have written.

        That said, I wouldn’t be so quick to criticize the authors for not taking a stance or at least acknowledging that some have questions about the value of cross-cutting exposure (the non-footnote part of this comment), since there’s always someone who raises the question “what if seeing diverse views is actually bad?”

      2. Christian Sandvig

        I dunno. This phrase in the Facebook study particularly got to me because when I read it quickly it seemed balanced, but in fact there isn’t anyone going around saying “let’s give people more news they agree with to be sure we don’t suppress voter turnout.” That’s what this sentence seems to be implying. And polarization is in fact seen as a huge problem, with the fingers pointed at social media (read: Facebook). So this attempt to review the literature seems pretty weird (and the citations are wrong).

        Let me double-down on my claim. I’ll bet a $1 bill that if we poll a representative group of political scientists about the state of the art, they won’t back an “increased filtering / decreased diversity” proposal as a good idea. In fact I’ll bet an additional $1 it’s unanimous (but I’m not as sure about that).

        That’s all I’m trying to say. I don’t think this is a controversial thing to say, but I could be wrong.

  2. Yotam Shmargad

    Regarding whether diverse exposure is good or bad, Flaxman, Goel, and Rao find that online networks both expose people to diverse perspectives and increase ideological polarization. My reading of this is that Facebook has an intimacy problem – it can’t provide the proper context for people to engage in proper discussion with the other side. A better understanding / control of who sees what you post / comment on might partially resolve this by providing a less public space for cross-cutting political *discussion* rather than mere exposure. https://5harad.com/papers/bubbles.pdf

    1. Christian Sandvig

      Thanks for the paper! I like the way the paper you shared describes the magnitude of the polarization effects (“modest”), as it is the same word I used to describe the magnitude of the effects found in this Facebook study. I also said the effects claimed in this FB study are unsurprising. Given the amount of scrutiny the Facebook data science team must be under when they publish something, I’m still surprised that they decided to write up these data in this way. We see a fair literature review in the first paragraph of the Facebook study, but then the way the findings were interpreted seems extremely strange (“we do it less than you!”).

      1. Yotam Shmargad

        It’s also kind of odd that the reviewers didn’t anticipate the framing issues, given the clearly high-profile subject matter. At the same time, blind review is sort of lost when the data basically identify the research team. I’ll be at the ICWSM workshop… hope we can talk more then!

  3. The real problem of this study lies somewhere else entirely. Exposure to “hard news” on Facebook is fundamentally different from other kinds of exposure (say, news on TV). Even though you are exposed to “cross-cutting” content at a rate that does not look really dramatic in numbers, the context of this exposure is left out completely. In my experience, most cross-cutting hard news is essentially a biased experience. Even if these links might appear to an analyst of the dataset as representing a different opinion, as ways to educate yourself about other views, they more often than not function like a public pillory, a reinforcement of the “filter bubble.” A look at some of the more extreme communities on Facebook can easily reveal a widespread political opinion if you just look at the news links posted, but the people of that community might still be a raging hatemob of the worst kind.

    1. Christian Sandvig

      I noticed Diana Mutz made a related point in the New York Times coverage of this study:

      “People who are really into politics expose themselves to everything,” said Diana C. Mutz, a political scientist at the University of Pennsylvania. “So they will expose to the other side, but it could be to make fun of it, or to know what they’re saying to better argue against it, or just to yell at the television set.”

  4. It is also deeply flawed in the sense that people use Facebook to keep up to date with their friends and have no interest in any “news” it wants to share. (I and many others use AdBlock to remove their news feature anyway.)

    1. Christian Sandvig

      A clarification: I think the “hard news” in the study refers to links shared by friends, not to any other news-related features on the site (“Related Stories” etc.).

  5. diane

    I’m curious as to what Grant Funding may have been used. Interesting that it’s not noted.

    The Facebook/DC revolving door has been disturbing from Facebook’s inception as a “publically traded” company; as is it’s Thiel/Palantir cousin’s Defense Department relationship.

      1. diane

        Whoa, care to go into any reasoning why I would be a Troll (whatever that widespread sickening write off of other human beings, many of whom have been utterly battered by our current “Techno Utopia,” actually means anymore since it’s been used so stunningly abusively on the web) when I didn’t fling anything that could even slightly be considered an insult at the author, but validly questioned whether the study was funded, at least in part, by government grants? As someone with cancer I frequently access research papers and there are, almost always, references to funding, when there are not, I find it vaguely disturbing (Shoot Me, for that?????).

        Are you upset that I noted the revolving DC/Facebook door? There certainly is, and has been a blatant Facebook/DC Revolving Door. For just one very recent example:

        03/19/15 Obama names top Facebook engineer director [David Recordon , with the “I love Robots” shirt. And haven’t billions of Human non techies suffered enough yet from Bot Love?] of White House IT, creates Presidential IT Committee http : // e-pluribusunum.org / 2015 / 03 / 19 / obama-names-top-facebook-engineer-director-of-white-house-it-creates-presidential-it-committee / (delete blank spaces to link to the piece, I didn’t want the comment to get snagged in moderation)

        And, oh my just used your link to see how I could have possibly irritated you so personally, that you’d fling that Troll word at me. So, you’re in the University of Pennsylvania’s Department of Pyschology Department and indiscriminately flinging out that troll word write off of another? Then again the APA (American Psychologist’s Association), certainly has a rather stained record of late with that Guantanamo (et al), torture involvement, so perhaps one should not rely on someone studying psychology to also have a capacity to treat another human with anything approaching kindness; as a majority understand kindness when they are the recipient of it.

        Lastly, since things are being wildly flung (to see if they stick?), I do feel I have now acquired the right to comment that your photo, at your linked page, exactly matches up with your stunningly smug and arrogant insult.

      2. diane

        Addendum to my last, above, comment:

        By the way, Mister Bishop, speaking of Government Grants, thank you – what with your Human Psychology specialty, and all – for reminding me of the following heartfelt commentary re Government Grants and the DOD ‘Minerva Research Initiative’ (again, delete blank spaces from url to link to it, bolding mine):

        06/12/14 Pentagon preparing for mass civil breakdown – Social science is being militarized to develop ‘operational tools’ to target peaceful activists and protest movements .[ http : // www . theguardian . com /environment /earth-insight/2014/jun/12/pentagon –mass –civil –breakdown ]

        (Oh, and I am all for a Government for and RUN BY The People, ALL OF THEM.)

      3. diane

        (Oh and re what your user name, Mr. Bishop, links to (outside of that horridly smug photo) The Good Judgment Project, oh please, good judgment according to who, … you (none of those billions unable to access the web has any say in that? …Despite that those stunningly poverty ridden, sans “The Web [Inter Net]” humans are every bit as wise as you, and likely, far more?)? …despite the fact that your photo, combined with your PROCLAIMED high and mighty occupation (The Good Judgment Project), gives out a quite clear symbol that you’re rather off the hook (that’s being polite, on my part), and way, ….way, .waaayyyyyyyyy…. too impressed with yourself as such a MONETARILY FAVORED and used (pawned) Spokes Person as a consequence of your: young age; gender and paleness.)

      1. diane

        Well, when I was much younger, I would have gladly accepted your ‘apology,’ but, by this point in my life I’ve come to realize that apologies should come directly after the damage and hurt is done. They aren’t really apologies when they come only after the victim musters up the nerve to defend themselves. It was clear, given the 27 hour timing between your slur and my response, you would not have ‘apologized’ if I hadn’t called you on your slur.

        Some victims kill themselves (i.e. facebook, etcetera, bullying) instead of doing ‘battle’ against their On Line name callers. Words can be deadly to those they’re directed towards, something I used to expect Psych Majors to be the first to understand. Do keep that in mind while working on your Government Funded, DARPA/CIA associated, Good Judgement Project.

  6. I think the OP misread one aspect of the paper. “Finally, the research then showed that “individuals’ choices about what to consume” further limits their “exposure to cross-cutting content.” Conservatives will click on only 17% of cross-cutting hard news, while liberals will click 7%.”

    After reading the paper and looking at the figures I think the correct claim to attach to those numbers is that conditional on being exposed to a given set of news stories, conservatives (liberals) click on 17% (7%) fewer of the cross-cutting items than the non-cross-cutting ones.

    1. Christian Sandvig

      Crud. OK, what I was trying to do in that bullet point was to speak in percentages for a general audience about the effect of individual selection while removing the “less than” comparison to the algorithm because (as I say later in the blog post) I find it problematic. So I definitely quoted percentages from the wrong place. I think the percentages I am after are in the right-most column of Fig. 3. I’ve updated that. Thanks so much for noticing this.

  7. diane

    I regret, in retrospect of my first comment posting (above), that I did not include, the rest of the Sly Con Valley/Pacific Northwest” (Amazon/MicroSoft) ‘Tech’ Oligarchies (MicroSoft, Oracle, Google, Amazon, eBay, Tesla, Twitter, Uber, et al, and those OLDER, now much quieter deeders from the EAST COAST, such as MIT (Thoroughly Militarized Massachusetts Institute of Technology), and CMU/AI (Pittsburgh, Pennsylvania’s Thoroughly Militarized Carnegie Mellon/Artificial Intelligence Branch (why would anyone in their right mind desire ARTIFICIAL intelligence??????) , as to those swinging, Fascistly Corporatized, DC/Pentagon Doors.

    1. diane

      (oh woopsie, and how could I possibly forget Stanford University (and that ugly DARPA FUNDED birthing of that Above any Moral Law of PRIVACY, Google Oligarchy, for just one horrid, recent example), in that equation.)

    2. diane

      oh my, speaking of MIT ( (Thoroughly Militarized Massachusetts Institute of Technology), who knew?????? that there has been a recent “rash” of suicides, among the ‘fresh recruits’ there:

      05/14/15 4:52 PM ET After Suicides, MIT Works To Relieve Student Pressure (it might have been helpful had NPR also noted that those suicides were not at all made common knowledge; but, then again: NPR [National Public Radio [not] ™]™ only seems to want to comment when it becomes impossible to hide things anymore behind a closed curtain:

      The event comes at a difficult time for MIT. Six students have committed suicide in the last 14 months. And MIT’s suicide rate surpassed the national average both last year and this year. …

      ….

      1. diane

        addendum to (yes, all of us (even when we never could, or, cannot barely afford to stay “On Line”) are ‘researchers’) my last comment, directly above:

        …. It might have been helpful had NPR also noted that those suicides were not at all made common knowledge; but, then again: NPR [National Public Radio [not] ™]™ only seems to want to comment when it becomes impossible to hide things anymore behind a closed curtain

        even worse, when NPR™ finally does admit there is a stunningly gut wrenching issue at hand, they still want to sugar coat (fake and deadly ‘sweet’: faux glucose prescription) it:

        “There’s actually no empirical evidence at this point that schools that are more competitive or more pressured actually have higher rates of suicide deaths than other colleges,” says Victor Schwartz, medical director of the JED Foundation, which helps colleges improve their suicide prevention programming.

        “With undergraduates, the information we have suggests more that suicidal behavior is more often associated with relationship or family problems,” Schwartz says.

        Among the MIT students who most recently committed suicide, one had a disease that caused debilitating chronic pain, according to published reports. Another had sought help from an MIT psychiatrist for troubling thoughts about death he had never revealed to his parents, and another was devastated by her mother’s sudden death, their families tell us.

        talk about a straw human[s] being held responsible (it’s the student and their parents, not the MIT Culture, responsible for MIT’s rash of suicides?; and, truly, [Capitalized/Free Market!] Empirical Evidence™ Required?????????? to acknowledge what is indisputably happening all over this planet earth?)
