

Case Study #2: Data Mining and Patient Privacy

The following is an excerpt from a September 30, 2009 article on Wired.com. (For the full article, see http://www.wired.com/wiredscience/2009/09/domestic-abuse-prediction/.) Please read the excerpt and, in your small groups, discuss the questions on the reverse.

************

Data-Mining Medical Records Could Predict Domestic Violence

By Frederik Joelving

To a busy emergency physician, a split lip or a case of poisoning is just one of those things they deal with. But to a computer mining the patient’s medical history, it could be the last diagnosis needed to decipher a pattern of domestic violence.

Now, a group of researchers at Harvard University has created the first computer model to automatically detect the risk that a patient is being abused at home. The results were published Sept. 29 in the British Medical Journal.

“It’s a great concept,” said Debra Houry, an emergency physician at Emory University, who was not involved in the research. Although around one in four women experience domestic violence at some point in their lives, she says, the problem often goes unnoticed at a doctor’s visit. “It’s one of those hidden epidemics where they don’t come up to you and disclose the issue.”

In fact, patients often try to hide the abuse, says Ben Reis, a Harvard pediatrician and computer scientist who designed the new computer model. “Abused people actually go to different emergency rooms each time, so that [the abuse] is harder to track.”

To get around this problem, Reis and his colleagues tapped into a public U.S. database containing six years of medical history for around half a million people. They fed a large portion of the database into a simple computer model — known as a naïve Bayesian classifier — which then calculated the abuse risks linked to different diagnoses such as burns, sprains or mental disorders.
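The article names the technique but gives no implementation details. As a rough illustration only, a naïve Bayesian classifier over a patient's diagnosis history might look like the sketch below; all diagnosis labels, counts, and probabilities are invented for the example and bear no relation to the Harvard group's actual model or data.

```python
from math import exp, log

# Hypothetical training counts: how often each diagnosis appeared in
# histories labelled "abuse" vs "no_abuse". All numbers are invented.
counts = {
    "burn":            {"abuse": 30, "no_abuse": 70},
    "sprain":          {"abuse": 25, "no_abuse": 75},
    "poisoning":       {"abuse": 40, "no_abuse": 60},
    "mental_disorder": {"abuse": 35, "no_abuse": 65},
    "flu":             {"abuse": 5,  "no_abuse": 95},
}
totals = {"abuse": 100, "no_abuse": 900}  # invented class totals

def log_posterior(history, label):
    """Naive Bayes score: log prior plus per-diagnosis log likelihoods,
    treating each diagnosis as conditionally independent given the label."""
    score = log(totals[label] / sum(totals.values()))
    for dx in history:
        # Laplace smoothing so an unseen combination never zeroes the score
        score += log((counts[dx][label] + 1) / (totals[label] + 2))
    return score

def abuse_risk(history):
    """Posterior probability of the 'abuse' label for a visit history."""
    a = log_posterior(history, "abuse")
    n = log_posterior(history, "no_abuse")
    return exp(a) / (exp(a) + exp(n))

print(abuse_risk(["burn", "sprain", "poisoning"]))
```

The "naïve" assumption is that diagnoses are independent given the abuse label, which is what lets the model reduce an entire multi-visit history to a single combined risk number.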

At present, medical records, even electronic ones, may be hard to interpret in the limited time a physician has to see a patient. “It’s usually a big, long list,” Reis says. “We reduced the entire history to one picture.”

That picture is called a risk gel. In essence, it shows the patient’s medical history as a bunch of colored bars representing diagnoses made at various visits. A green bar means the diagnosis is not statistically linked to abuse, while a red bar means it is. When the computer determines that the combined abuse risk based on all diagnoses is high, it sounds the alarm, letting the physician know that a face-to-face meeting is called for. “We see this system as a screening support system,” Reis said.
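The colour-coding and alarm logic described above can be sketched in a few lines. The per-diagnosis association scores and the alarm threshold below are invented for illustration; the actual system's scores would come from the trained classifier.

```python
# Hypothetical per-diagnosis association scores with abuse: a positive
# score means the diagnosis is statistically linked to abuse (red bar),
# zero or negative means it is not (green bar). All values are invented.
SCORES = {"burn": 1.2, "poisoning": 1.5, "sprain": 0.8, "flu": -0.5}
ALARM_THRESHOLD = 2.0  # invented cut-off for flagging a history

def risk_gel(history):
    """One colour bar per visit diagnosis, plus an overall alarm flag
    raised when the combined score exceeds the threshold."""
    bars = [(dx, "red" if SCORES[dx] > 0 else "green") for dx in history]
    combined = sum(SCORES[dx] for dx in history)
    return bars, combined > ALARM_THRESHOLD

bars, alarm = risk_gel(["flu", "sprain", "burn", "poisoning"])
print(bars)   # one (diagnosis, colour) bar per visit
print(alarm)  # combined score 3.0 exceeds the invented threshold
```

This mirrors the screening-support role described in the article: the per-visit bars summarise the history at a glance, and only the combined score, not any single diagnosis, triggers the prompt for a face-to-face meeting.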

... researchers agree that domestic violence is severely underdetected by health care providers. But it shows up clearly in murder stats. According to the Harvard group, domestic abuse accounts for more than half of the murders of women in the United States. And without detection, there is no chance of helping the victims.

... Within four years, the group hopes to have a full-fledged system ready, including a user interface optimized for doctors. “The long-term vision is one of predictive medicine, where vast amounts of information are used to improve health care, diagnosis, screening and outcomes,” says Reis.

Health Law Case Study #2 – October 1, 2009


Yet the question remains how to translate a diagnosis into action that will help the victims of abuse. “Identifying in itself is not enough,” says Houry. “But I believe it helps.”

Questions

1. Ben Reis refers to a future “where vast amounts of information” are collected about patients to improve their care. Should informed consent of the patient be required before these “vast amounts” of information are linked and analysed to identify undisclosed issues such as domestic violence? What are arguments for and against a consent requirement? How would you address concerns about the patient’s privacy? Do other interests override the patient’s privacy in this situation? What are those competing interests?

2. If a domestic violence data analysis system is introduced in a hospital and is shown to have a high degree of accuracy in identifying abusive situations, would disclosure of such information to law enforcement officials or social workers be justified? Should such disclosure require consent of the patient? What if the patient is a pregnant woman or a woman who has recently given birth? Does it make a difference if you know that an infant is in a home that likely involves abuse?

3. Over the past several years, Canadian provinces (e.g. Alberta, Saskatchewan, Manitoba, Ontario, Nova Scotia) have enacted legislation that compels hospitals to report gunshot and/or stab wounds to police. The preamble to Ontario’s Mandatory Gunshot Wounds Reporting Act, 2005 states: “The people of Ontario recognize that gunfire poses serious risks to public safety and that mandatory reporting of gunshot wounds will enable police to take immediate steps to prevent further violence, injury or death.” The statute requires: “Every facility that treats a person for a gunshot wound shall disclose to the local municipal or regional police force or the local Ontario Provincial Police detachment the fact that a person is being treated for a gunshot wound, the person’s name, if known, and the name and location of the facility.” [see s. 2(1)] Should such mandatory reporting legislation be extended to situations where a computer model suggests to health care providers that domestic violence is occurring?

If you are interested in further reading on these topics, see:

Wayne Renke, “The Constitutionality of Mandatory Reporting of Gunshot Wounds Legislation” (2005) 14 Health Law Review 3. http://www.law.ualberta.ca/centres/hli/userfiles/1_Renke.pdf

Michael E. Rodriguez et al., “Mandatory Reporting of Domestic Violence Injuries to Police: What do Emergency Department Patients Think?” (2001) 286 Journal of the American Medical Association 580. http://jama.ama-assn.org/cgi/content/full/286/5/580
