Everyone could see this coming…
The Register frames it as a “consent” problem. To me it is not a consent problem, although having explicit consent would have helped. The issue is how the data is made available to the researchers. Did they get anonymized data? And, more importantly, how was it anonymized?
I think anonymized data was provided, so how it was anonymized is the more important question. It does not matter whether the research was conducted on Facebook’s computers or the researchers’ computers; what matters is what the researchers see. Not being well versed in law, I am not sure whether “what the researchers see” extends to their computer programs. In these days where computers sift through the data, I believe it should.
Anonymizing the data opens a gigantic can of worms. The research sifts through so much data that personal details inevitably slip through; it is difficult to completely stop non-anonymized data from getting through. For example, if I write “ctrambler is an idiot”, the data is not anonymized, simply because ctrambler is my handle on this blog and it points to a person, i.e., me. Recognizing the impossibility of complete anonymization, one simply has to demonstrate that one took the utmost care to minimize leakage as much as humanly possible. However, do you want to stand in front of the Data Commissioner trying to convince him you tried your best? I will avoid it if I can.
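To make the point concrete, here is a minimal sketch (entirely hypothetical, not anything Facebook actually runs) of a naive scrubber that redacts obvious identifiers such as email addresses and phone numbers. A handle like “ctrambler” matches no pattern, so it slips through to the researchers untouched:

```python
import re

# Hypothetical naive scrubber: strips the obvious identifiers
# (emails, phone-like numbers) before posts reach researchers.
PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),            # email addresses
    re.compile(r"\b\d{3}[-\s]?\d{3,4}[-\s]?\d{4}\b"),  # phone-like numbers
]

def scrub(post: str) -> str:
    """Replace every pattern match with a redaction marker."""
    for pat in PATTERNS:
        post = pat.sub("[REDACTED]", post)
    return post

print(scrub("mail me at joe@example.com"))  # email is caught
print(scrub("ctrambler is an idiot"))       # handle leaks: no pattern knows it
```

No finite list of patterns can cover every nickname, in-joke, or indirect description that points back to a person, which is why “we took the utmost care” is the most one can honestly claim.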
Having explicit consent would alleviate this concern a lot. One can write the consent form to tell participants that it is inevitable that some snippets of data will slip through. Most people recognize this and will be fine with it. Without explicit consent, the Data Commissioner most definitely will not be.
Forbes managed to dig out that the blanket consent Facebook is relying on to justify itself may have been added after the research was conducted. Oops, big foot in one’s mouth. Details like this matter in the law courts, but not to me. My bigger beef is that Facebook says what it did is in accordance with its data use policy, which permits the use of data in “internal operations”, including “research”. I don’t think Joe Public’s definition of “research” covers being manipulated. My definition, which I think matches Joe Public’s, is that research is limited to the data already present on Facebook, not something Facebook tries to collect on top of it, i.e., my response to a manipulated news feed. Nor do I see this as a legitimate “internal operation”. I see it as a public experiment that needs explicit informed consent unless a reputable ethics committee says otherwise.