AoIR6: Ethics Panel
Here are the notes from the Ethics panel, convened by AoIR folks who were on the Ethics working group
Charles Ess
See Peden and Flashinsky (2004) - Ethical research decisions: A content analysis.
Chris Mann (UK) 2003: biog from here
Chris Mann is a Senior Research Associate in the Faculty of Social and Political Sciences at the University of Cambridge, having completed her Ph.D. at Cambridge in 1996. Her pioneering work on methodological and ethical issues in doing Internet research underpinned her research at the OII. Her book with Fiona Stewart, Using the Internet in Qualitative Research (London: Sage, 2000), has been adopted for many courses on new research methods.
Technical
Legal
Personal
Possible consent options
See Ethics and Information Technology 7(1) – China, Japan and Thailand – forthcoming
Annette N. Markham (University of the Virgin Islands)
Methods and ethics
Method is a series of decision points:
Method of constructing the research question and laying out the general design of the study
Method of accessing the participants (data) and defining field boundaries
Method of collecting info
Method of filtering and organising info
Method of analysing data into general themes
Method of interpreting general themes
Method of representing self, other and the phenomenon in writing
CMC and internet research have exposed a lot of limitations in the Institutional Review Board (IRB) process
Mark Johns
Risks to reputation
Sexual orientation and other social stigmas outed online
Identification
Risks to the reputation of the person and the PERSONA (and eventually the business)
(“Tell her you’re a rottweiler!” – New Yorker cartoon, mark 2)
Economic risks
Job loss because of what people said in a blog, online chat, etc.
[What do we do with vulnerable populations? This doesn’t just mean kids etc. – it includes newbies who don’t understand the dimensions of the internet and its privacy issues. Data retention – e.g. blogs: a post you’ve deleted may already have been mined into data the researcher has no right to – we don’t have the right to retain data; archives]
Invasion and Intrusion risks.
Sometimes it’s ignorance, but sometimes participants’ expectations vary widely from ours. E.g., a blog may be treated as a private diary – unless invited, we shouldn’t be looking at it. As researchers, we tend to assume that anything out there is fair game for mining. The NYT wants us to read it, yet it’s password-protected; password protection doesn’t necessarily mean a closed forum. Expectations need to be taken into consideration.
Jeremy Hunsinger
Maastricht – Public Perception of AoIR Online. Check out this presentation.
Keep all personal information in a separate file from the content (see the sketch at the end of this section)
Be wary of SurveyMonkey and similar hosted survey tools, because their data encryption/protection isn’t up to the standard required by IRBs. Instead, create a protocol that keeps data within the university’s firewalls.
A system can learn how you type and write, so it’s pretty easy to find out who you are. The same goes for photography.
Password protection isn’t good enough.
Don’t copy your data to the hard drive; work from a floppy (removable media). Take data offline if you don’t need it there.
David Brin – The Transparent Society (issues of privacy) (but Ess says it’s an Amerocentric conception of privacy)
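To make the “keep personal information in a separate file from content” advice concrete, here is a minimal sketch in Python. It assumes a responses.csv with name, email and ip_address columns alongside the content; those file and column names are placeholders of mine, not anything from the panel. Identifying fields and content end up in separate files, linked only by a random pseudonym.

```python
# Minimal sketch: split identifying fields from content, linked by a pseudonym.
# File names and column names are assumptions; adapt to your own data.
import csv
import secrets

IDENTIFYING_FIELDS = ["name", "email", "ip_address"]  # assumed column names

with open("responses.csv", newline="", encoding="utf-8") as src, \
     open("identities.csv", "w", newline="", encoding="utf-8") as ids_out, \
     open("content.csv", "w", newline="", encoding="utf-8") as content_out:
    reader = csv.DictReader(src)
    content_fields = [f for f in reader.fieldnames if f not in IDENTIFYING_FIELDS]

    ids_writer = csv.DictWriter(ids_out, fieldnames=["pseudonym"] + IDENTIFYING_FIELDS)
    content_writer = csv.DictWriter(content_out, fieldnames=["pseudonym"] + content_fields)
    ids_writer.writeheader()
    content_writer.writeheader()

    for row in reader:
        # Random link key, not derived from any identifying information.
        pseudonym = secrets.token_hex(8)
        ids_writer.writerow({"pseudonym": pseudonym,
                             **{f: row[f] for f in IDENTIFYING_FIELDS}})
        content_writer.writerow({"pseudonym": pseudonym,
                                 **{f: row[f] for f in content_fields}})
```

The identities file can then be stored offline or behind the university firewall, in the spirit of the tips above, while analysis works only from the content file.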
Lori Kendall (she who wrote Hanging Out in the Virtual Pub)
Social Problems (1980) – on informed consent in field research; similar issues arise in internet research.