pp.74-91 in Bioethics for the People by the People, Darryl R. J. Macer, Ph.D. Eubios Ethics Institute 1994.

Copyright 1994, Darryl R. J. Macer. All commercial rights reserved. This publication may be reproduced for limited educational or academic use; however, please enquire with the Eubios Ethics Institute.

Scientific ethics: Workshops on science and ethics in New Zealand

Darryl Macer, Director, Eubios Ethics Institute

Howard Bezar, President, New Zealand Institute of Agricultural Scientists (NZIAS)

D.Gareth Jones, Professor of Anatomy & Structural Biology, University of Otago

Alan Kirton, President, New Zealand Association of Scientists (NZAS)

Barbara Nicholas, Lecturer, Bioethics Centre, University of Otago


To draft Royal Society of New Zealand Ethics Code (developed after the series of conferences described here, but independently of the authors of this paper).
Introduction

Scientists are becoming aware of the variety of ethical issues, most of which have existed in the past but have been ignored. In August 1993 the NZIAS organised four workshops on the theme "Science and Ethics" in different cities in New Zealand. This report covers the material presented in the workshops and includes a variety of the comments and discussion points raised by the participants. The general program of these workshops followed an introductory session on ethics, a series of papers on science in the community, a session on ownership of scientific information and fraud, then topics in biotechnology and animal experiments, concluding with a look at decision-making. Throughout the day a series of case studies were considered in smaller groups. Some pertinent questions and case studies are included at the end of each section. The participants could also return their comments in the form of a questionnaire after the workshop, and some of these comments are included here. We hope that the report from these meetings will be of assistance to scientists in their daily activities and to those planning future workshops.


1. Ethics and Morality

Both words mean similar things (based on words for 'custom' in Greek and Latin respectively), but moral philosophers use them in different ways. Ethics is used to refer to the critical study of morals or morality, the latter being the specific values and behaviour of individuals or groups. Ethics itself does not promote a particular viewpoint, but is concerned with looking at the assumptions behind differing moral choices, seeking to clarify the arguments and the concepts that are used when people justify their moral views. Ethics is a branch of philosophy, the discipline that is devoted to clarity and logical coherence.

Within any professional group, there are certain concepts and theories to be mastered and intellectual skills learned that can be useful in difficult decision-making. This is particularly well-known in medicine. Doctors and other health professionals need to learn to ask why they make particular kinds of decisions and how they justify their decisions in the light of opposing views of what is right and wrong. There are also ethics that apply to science as a profession.

Another distinction is the contrast between ethics and ethos. Everybody is brought up within a particular ethos - particular attitudes and approaches to issues, ways of thinking, acceptance of the general ways in which things are done. This is clearly seen in medical education, where questioning young students quickly adapt to what they see as the accepted ways of doing things; their questioning disappears as they are converted into respectable members of the medical profession. What is happening is that they are adapting to the medical ethos. This has nothing to do with ethics, which starts from the refusal to accept that the ethos of one's group or society is always the best guide for moral choices. Ethics starts from the credo of Socrates: "The unexamined life is not worth living".

Etiquette is another non-ethical domain, one that has considerable prominence within medicine, although it has been criticised in recent years. It covers obvious ways in which one does or does not treat one's patients. Etiquette comes into any profession; for example, the NZIAS 'Code of Ethics' deals largely with etiquette. It includes professional behaviour considered appropriate in dealing with clients, the public, and colleagues.

It is also necessary to distinguish between the ethical and the legal. The two are often confused, on the assumption that what is legal is ethical, but the question of what is ethical or unethical conduct is far wider than what is legal or illegal. It is quite possible to act in perfectly legal ways, and yet treat others in unethical ways, e.g., treating them as mere means to one's own ends.

Another distinction is made between ethics and indoctrination. This distinction is often obscured by people who want to deride the whole subject of ethics. They say ethics is just your prejudice; so teaching ethics is indoctrinating students in your own point of view: "I'm happy with my prejudices, thank you, and they are as good as yours." End of conversation. Simplistic and totally erroneous, but it has proved fertile ground in many areas. It shows a lack of comprehension of the distinction between ethics and morals, and it also assumes that people with well-formulated ethical positions wish to foist their views on everyone else. The better thought-out one's position is, the less one wants to foist any particular view onto others.

Ethics is not simply putting forward one's own views; neither is it an entirely relative matter. People brought up in the hard sciences like to think in terms of hard data that lend themselves to definite interpretations, and even to right and wrong positions. We know that much of science isn't this simple, and yet there are elements of this in science. It's very easy then to dismiss ethics as though it doesn't matter; after all, my view is as good as yours, and if an ethics discussion doesn't end up with a clear right/wrong answer, why bother? This simply demonstrates the narrow vision of far too many brought up in the sciences. There are no neat answers to many things in life, but that doesn't negate the value of grappling with them, or the fact that overall perspectives can be changed, even if it may take time. There are no neat answers to many issues in the humanities, philosophy, politics, aesthetics or religion, but everything in these domains is not thereby converted into some meaningless wasteland of relativity.

This view totally fails to appreciate how one copes with conflicting stances. Some may think it is perfectly ethical to acquire part of someone else's experimental data and publish them as their own (while those who had the data or ideas may, quite naturally, not agree). One soon begins to see that the subject of ethics isn't entirely arbitrary (e.g. treatment of human remains). There are shared values within the scientific community, of which honesty is one.

How do scientists relate to each other as professionals and as people who have different personal values? How do we relate to society, to views within society, especially when those views may be at variance with those widely accepted within the scientific community? How do we relate to politicians and the perceptions of science and scientists commonly held by others?

Our ethical systems have to be able to cope with a pluralist society. We may think of this simply in terms of our own personal values that may be at variance with many views held within society. That is true, and yet science also has to cope with pluralism. However, not everyone accepts the importance of science and scientific processes as much as we may wish. Coping with these clashes of perspective is aided through an understanding of ethical principles and the ways we live with others within a common society.

Neither is ethics simply a matter of asking questions. Important as this is, ethics is devoted to attempting to answer appropriate questions. Scientific ethics encompasses aspects of both bioethics and professional ethics. Some issues of specific concern to scientists include the value of biological/human life, scientific fraud, and responsibility for the uses made of their work. Scientists are not neutral but are moral agents. Bioethics is not merely theoretical; it is intensely practical.

Bioethics is interdisciplinary. While philosophers can contribute much to ethical discussion, they most certainly do not have all wisdom in this realm. For any successful outcome in bioethics or related disciplines, there has to be input from scientists, lawyers, theologians, policy analysts, economists, administrators, and interested lay people, as well as philosophers. Together with all people, as this book is titled, we should decide how to organize our lives within society.

We often hear that we should do no further scientific work on a particular problem until we have sorted out the ethical issues. This assumes we know what the ethical principles are, quite apart from knowing the details of the science itself. It's as if there is some infallible book of ethical principles, and if we look up the right pages we shall end up with the principles of relevance even in totally new questions (e.g. doing good, not doing harm, informed consent, justice, confidentiality). However, we frequently do not know what the productive ethical questions are until some scientific possibility has been seen and probably accomplished. Therefore, scientific developments may often have to occur before we know how best to tackle the issue ethically. This emphasizes how interdisciplinary the whole venture is; it is a constant two-way dialogue, with ethicists and scientists opening up new vistas for thought, debate and discussion. Therefore scientists must be informed about ethics, and ethicists require a very good grasp of scientific details and ways of thinking.

We need to change the philosophy of science to become more ethical. Under Baconian philosophy, the long-term aim of inquiry is to contribute to human progress, but the immediate aim is to produce objective knowledge, together with explanations and understanding. The idea that the philosophy of science should be based on the pursuit of benefit rather than the pursuit of knowledge is becoming accepted by many (Rees, 1993), as discussed earlier in this book. The philosophy of knowledge claimed that science must dissociate itself from the goals and values of common social life, so that claims to objective knowledge can be subjected to rational assessment. This is inconsistent with bioethical decision-making (Macer, 1994).

People make claims that science is ethically neutral. This implies that scientists do not have responsibility for the production of knowledge. However, this belief confuses the findings of science, which are ethically neutral, with the activity of science, which is not. Some pursue the neutrality argument by claiming that the moral burden lies entirely with those who choose how the knowledge is put to use. We may not be able to predict the abuses of pure knowledge; however, scientists are still moral agents and must think in advance of the possible abuses. They may not be solely responsible, but they share responsibility with all of us. All human activity needs to be subject to ethical discretion. Technology has been the most powerful agent of change in the recent past, therefore we can clearly see the need for ethical maturity and understanding.


2. Science, fraud and ethics

2.1 Misconduct in science and publication

Thirty years ago science was held in very high esteem. One frequently heard science held up as the epitome of all virtues. It was objective and it was truthful. It was the way in which one could discover the world as it really is. This vision depended on the truthfulness of the scientists themselves. Whether they cheated or not is another matter, but the perception was that cheating was rare within science. However, scientific fraud is now being uncovered with alarming regularity, and even extremely prestigious people are implicated, either directly or indirectly (see Danish Medical Research Council, 1993; Rothenburg, 1994; Schachman, 1993).

The definition of scientific misconduct from the US Office of Scientific Integrity Review is: "Fabrication, falsification, plagiarism or other practices that seriously deviate from those that are commonly accepted within the scientific community for proposing, conducting, or reporting research". The phrase "that .... community" has caused considerable problems, and so has been dropped in some very recent definitions. For instance, the Office of Research Integrity (ORI) now uses the following definition: "Research misconduct is plagiarism; fabrication or deliberate falsification of data, research procedures, or data analysis; or other deliberate misrepresentation in proposing, conducting, reporting, or reviewing research" (ORI 1993). This does not include errors in judgement, errors in the recording, selection or analysis of data, or differences of opinion in interpreting data; such errors are part of the normal research process and are not misconduct. Fabrication is making up data or results. Falsification is changing data or results.

A.K.'s experience of the Ruakura publications committee was that an inverse law applied: the severity of criticisms was inversely proportional to the number of papers published by the reviewer. The most severe criticisms came from those who had published the least. Such reviewers were either equally critical of their own work, or did not want those competing with them for promotion to have a better scientific output. Quantity of output doesn't necessarily equate to quality of work, and there may be a need to emphasise quality instead of quantity of output (Editorial, 1989). Authorship is a topic in itself, and different schemes have been devised to assign credit (Segal, 1993).

Plagiarism is using the ideas or words of another person without giving appropriate credit: stealing or passing off another's ideas or words as one's own, the misappropriation of intellectual property (this may also occur in the peer reviewing of grant applications). For scientists, credit for research work is important. It serves as a form of currency for obtaining jobs, promotions, grants and prestige. Not surprisingly, credit for original ideas as well as for the end result of painstaking experimentation, data collection and mustering of arguments is zealously guarded. Plagiarism is the most blatant example of stealing credit. It is much more common than is usually recognized. Closely related to plagiarism is faking results, which in effect claims credit for work not done.

Other concerns that fit into this area, in addition to the fabrication and falsification of data, include failure to acknowledge the contributions of key people in research (including juniors within one's own laboratory). For example, in the USA Dr Gallo claimed to be a codiscoverer of HIV, but it has been shown that the virus his laboratory isolated came from a French cell line (Chang et al., 1992). The Gallo case was eventually dropped by the Office of Research Integrity after much debate; however, in its first 18 months up until then the ORI had concluded that there was misconduct in 22 cases, and in 16 of these the finding went unprotested. Another 23 cases brought to it were closed with no finding of misconduct. Only in six cases did scientists appeal the conclusion of misconduct (ORI, 1993).

Other examples of misconduct include: rigging experiments; deliberate misinterpretation of results; quoting only one's friends or those who are not seen as one's direct competitors; being selective in what one writes in reviews, concentrating only on the work of one's own laboratory; manipulating data and leaving out inconvenient data; use of too few data; premature disclosure of data/concepts, especially in public media; using data of others not yet published; dubious, excessive, unsubstantiated, or unwarranted claims; lack of confidentiality in reviewing; conflict of interest of various types, including reviewing and industry contacts; authorship of papers; inadequate monitoring of postgraduate students.

From a survey of misconduct in the USA (Swazey et al., 1993) these incidents appear to be rather common, a view supported by personal observations of the authors and comments by the participants in the workshops. The reasons behind all such fraud include an attempt to get on, to beat someone else to the post, to keep one's job, to get a promotion, to obtain another grant, to make a name for oneself, to protect one's own reputation, and so on. However, the major consequence of all fraud is deception; time and energy are wasted, money is spent in fruitless endeavour, people are distracted from what might be more important lines of investigation, and in the end the reputation of science is besmirched and is degraded in the eyes of society. The essence of the scientific method is threatened. Why should people trust scientists, their results and their pleas for money, if money is wasted by scientists deceiving others? The very rationale of science is placed in jeopardy.

Misconduct/fraud has to be distinguished from errors in data collection and interpretation. It is easy to condemn obviously unethical practices, but the far more difficult question is to ask what can be done about them. Cases of fraud or plagiarism are generally brought to light by 'whistle blowers'. Unfortunately, these are invariably insiders, and are very often junior staff or postdoctoral fellows. Consequently, they are in a highly vulnerable position, especially when their accusations are directed at a professor or dean, or at the laboratory chief, and occasionally at a Nobel laureate. And so it comes as no surprise when one realizes that many whistle blowers have suffered immensely - losing jobs, grants, and any future prospects in their research areas or even in science at all. They may be victimized and suffer catastrophically, even when they are ultimately proved to be in the right.

To date, universities and research bodies have proved notoriously inept at dealing with accusations of fraud. Such accusations are generally ignored or contemptuously dismissed, the accusers may be threatened, and ranks are closed to protect the prestige of the university or research institute. In acting like this, institutions are espousing the most unethical and irresponsible of responses, and simply serve to damage their reputations and create a climate in which fraud can flourish.

This is not an easy area, since fraud busters may themselves prove as much a menace as fraudsters. Nevertheless, any ethical response has to investigate seriously the possibility that fraud has been committed. To this end, universities and research institutes should establish procedures for investigating fraud allegations, and have guidelines for doing this. Standing committees need to be set up to protect research programmes from fraud, to protect the interests of whistle blowers, and to investigate thoroughly any cases brought before them. Fraud committees should occupy as prominent a place in research institutions as do human ethics and animal ethics committees at present. In other words, user-friendly systems need to be put in place for ensuring that ethical stances are encouraged, and that unethical stances can be thoroughly and openly investigated. Fraud, then, is a deeply ethical matter, in which honesty and integrity are crucial moral values.

2.2 Peer Review and Seeking Funding

Peer review is an assessment of the person, research or project of one fallible scientist by another fallible scientist or group. There have been cases of reviewers delaying publication of competitors' papers to allow their own to be published first (Maddox, 1992). Publication may even be prevented; for example, peer review of Sumner's urease discovery prevented publication for 20 years, yet he eventually received a Nobel prize for this discovery. Another recent case involved the suppression of a medical report on leprosy by one of the senior scientists involved (Jayaraman & Mervis, 1992).

Peer review is a useful process but on occasion can give the wrong answers. In the New Zealand context, because of the small size of the science population, the review might be by a funding or commercial competitor. Who has struck incorrect comments on funding bids in reviews from research funding agencies? In New Zealand the Foundation for Research, Science and Technology (FfRST) is the principal funder of "public good" science. Because the NZAS received critical comments and feedback on the FfRST funding system, a professional survey was undertaken in 1991 to obtain comments on the system. The results (Scinet Sept 1991) found the system was considered neither suitable nor working effectively. The future entry of universities into bidding would remove independent referees. How could Crown Research Institutes (CRIs) manage science with little control of funding? Other concerns were CRI staff with no tenure competing against university staff with tenure, and a very limited, incestuous pool of referees. There was also an inability to challenge referees' comments, no justification was given for rankings, and some referees' comments were either incompetent or biased. There was a great need for better communication between FfRST and scientists.

A repeat survey in 1992 (Scinet Sept 92) was described as antagonistic by FfRST. It found that a high level of dissatisfaction with the system continued. This was probably partly due to the small amount of money available in the Public Good Science Fund (PGSF), and to FfRST trying to find other reasons as to why projects were not funded. Even successful applicants felt they were underfunded. FfRST picked the worst referees' comments, and some contained errors or were made by someone who didn't understand the research. Scientists had no opportunity to correct errors. A belief exists that some negative comments were made by commercial competitors. Another concern is the exposure of good ideas to competitors. When papers are submitted for peer review, competing scientists may steal ideas and try to publish first to beat the system; at the grant application peer review stage, before the research is done, it is even easier for ideas to be stolen.

Second-rate research is another issue. Very often one can make the categorical assertion that second-rate research is unethical research. Research that wastes animals simply because far too many are used, or which puts human beings at risk or discomforts them when it has no reasonable prospect of making any significant advances, is unethical. It wastes animals, it wastes the time and energy of people, and it wastes money; all of this has elements of being unethical.

But how does one determine good, mediocre, and poor research? Should all research be peer-reviewed in some way before it commences (regardless of grant money) and beyond what is presently done via ethics committees? Does peer review help or hinder? Who does the peer reviewing - other scientists in the field or should it include representatives of the community?

Under the competitive bidding system the scientists who put together a successful bid for funding to FfRST may not "own or control" the resulting funding. The science providers, or Crown Research Institutes (CRIs) in New Zealand, may allocate the funding to another scientist not associated with the bidding to undertake the research. Similarly, one scientist whose track record contributed to a successful outcome in a FfRST bid based in a different CRI was later told that the research would be undertaken by staff in the other CRI, which would also take over the funds. FfRST seemed unconcerned about these changes, despite the ethical issues.

Ownership of information is a key issue for scientists. The workshop participants were often confused about their own rights to information generated through their own research. The new commercial environment makes it clear that commercially funded work belongs to the funder, and the same applies to research funded as public good science. Meanwhile, the New Zealand government is attempting to exert pressure to encourage more private sector R&D, since New Zealand has low private sector R&D compared to the OECD average. In these cases the results may be confidential to the client and owned by the companies.

To clarify professional values we could compare science and business ethics. The days of 'pure' science are numbered by the new context, which is increasingly dependent upon business. This raises some new issues for ethical science, and it has also brought more emphasis upon responsibilities to others: for example, who owns information, and are scientists "gagged"?

Questions to consider:

Q1: What is ethics? Is it just the attitude, "I'm just doing my job"?

Q2: Think of some moment in the last month when doing your job brought you into tension with your own values or some community value.

Q3: What might we mean by ethics in your profession or work?

Q4: Collect ideas: what is the work of ethics? Ethics is about normal situations - reflect upon how we structure our life together. What is important? What values inform our thinking? In groups, explore the pros and cons of each model. What were the issues for you in these exercises?

Case Studies

1) You are a programme leader in a Research Institute and, having done well with research contracts, have a new scientific position available on your staff. You interview a prospective employee who for the past 7 years has done some excellent work for another science organisation with which you compete. The job applicant is very specific in his criticisms of the other organisation he has spent the last seven years with. He tells you about their research strategy and reveals some staff conflicts and personality clashes. Finally he pulls out of his briefcase some papers detailing preliminary results which are very useful to you. He says, "I think you'll be interested in reading these over. Just be sure I get them back. Drop them in the mail tomorrow when you are through with them." What do you do about hiring this man? How do you justify your decision, and do you take any further action?

* Most of the participants would not hire such an applicant, as they did not like such a character, with his lack of loyalty and confidentiality. They were more divided over whether they would look at the papers offered.

2) Inside your department there are a number of scientists becoming eligible for promotion, including yourself. A rumour has started about the past conduct of one of these scientists, suggesting that he used the data of graduate research students and published it without making the graduate students coauthors. It is suggested that he may have done this quite often. This rumour is sure to lower the chances of promotion for the scientist concerned. As a colleague you have mixed feelings; what course of action (if any) would you take with regard to the rumour and the promotion? What would you do if you were the scientist and aware of the rumour about your behaviour?

* Some of the participants would try to find the truth, others suggested this was the responsibility of the institute director.

3) You have just moved to a new position at a new institute. You have some unpublished data which are largely the result of your own work, but which were generated at the former institute. You had a disagreement with the researchers at the former institute, so you are not on very good terms with them. They do not want to publish the data. Should you publish the data as a single author from your new position? What proportion of the work entitles a researcher to be a coauthor on a paper? Do you measure this in time, effort, ideas, or as a proportion of these?

* In many institutes the data will be the property of the institute. The dilemma of writing up work done at a previous institute, and of making arrangements with coauthors, was a common one.


3. Science and the community

3.1 Science in public

Science is conducted within a moral community: an internal one, the scientific community itself, and an external one, society. It is obvious that this interaction will cause problems unless the values of both communities are identical. Opinion surveys show that over some issues opinion is very similar, for example genetic screening or genetic therapy, whereas over others there is some difference, such as the degree of benefit or risk perceived from new technologies (Macer, 1992). Both communities are heterogeneous, and similar diversity may be found in each (see papers in this book on survey results). Tension within one community may occur, and tensions between the two communities are inevitable.

The way we approach ethical issues, and the questions we ask, are influenced by context. For example, Aristotle spent a lot of time on the question "What is the good life?", by which he meant how males should relate together appropriately and how society should be structured. He made some assumptions which his context did not question, e.g., women and slaves were always subservient, not quite fully human. Social attitudes continue to have an influence on the types of research carried out, and on the "use" of certain populations. For example, medical researchers have had a long history of using socially marginalised populations for research, populations such as black rural communities, psychiatric patients, orphans, and mentally handicapped persons (Rothman, 1991). Some of the cases were also broad in scope, as seen in the radiation experiments using soldiers and civilians in a variety of countries, which became controversial in the USA at the end of 1993. Research on the effects of radioactivity, and nuclear testing, were cavalier towards indigenous peoples in Australia and the Pacific, especially in the 1950s.

Ethics is concerned with ways we structure our relationships - what values are accepted/ affirmed/ enforced by society? Scientists are involved in a number of different relationships: they are participants in society, not detached or separate - therefore they have the same responsibilities as any citizen. We are a "particular" person in a pluralistic/multi-cultural/multi-valued society.

Scientists are also part of a profession. From medicine we can think of many issues - confidentiality, HIV positive patients; informed consent; when to withdraw life support; euthanasia; research guidelines. Many of these are explored in consultation with the community, but the profession should accept responsibility for making sure the conversation happens. Other issues include ownership of information / profit / credit / competition / forgery / plagiarism / types of research.

How the profession relates to the rest of the community at the local, national, and international levels is driven by the community's perception of the profession. Therefore we need to hear from the community. There is already recognition that it is not appropriate to make some decisions alone - a desire to share responsibility with the community: for example, defining what core health services and priorities are, what is appropriate research, the use of animals, and what uses research should be put to. There are social responsibilities in the choice and use of research. Some basic areas of responsibility include:
a. those related to the acquisition of scientific knowledge
b. awareness of the potential for misapplication of findings
c. making society a partner in the management of knowledge, bearing in mind the interests of present and future generations
d. the need to actively value human beings and other life

One of the basic ideals of society is to pursue progress. The most cited justification for this is the pursuit of improved medicines and health. At the same time, to cause harm is unethical, especially if insufficient safety and care was taken. A failure to attempt to do good is also a form of doing harm - the sin of omission. There is a strong ethical reason to pursue further research into ways of improving health, agriculture, and living standards.

The risks and benefits may come to both scientists and the general public. Because the public is exposed, the public needs to be involved in deciding what is an acceptable degree of risk or benefit - something which is often difficult to balance. It is therefore useful to look at others' perceptions of science, though perceptions may also differ considerably between individual scientists.

People perceive both benefit and risk from science and technology, as shown in surveys from many countries of the International Bioethics Survey (Macer, 1994; see other papers in book), and previous surveys in New Zealand (Couchman & Fink-Jensen, 1990) and Japan (Macer, 1992) for example. Technology that touches life is perceived to be as worthwhile as technology which does not directly affect living organisms, but people may perceive more risks from technology that directly affects living organisms than from those physical science developments which do not. This is similar internationally.

Public opinion surveys show that there is less concern about the genetic manipulation of plants than of microorganisms, substantially more concern about the genetic manipulation of animals, and the highest concern about that involving human cells. Scientists perceived more benefits from genetic manipulation of all organisms than the public did. High school biology teachers perceived both significantly more risks and significantly more benefits from genetic manipulation than the public. Respondents from all groups cited numerous and varied examples of their reasoning about the acceptability of genetic manipulation, its perceived benefits and risks, and their concerns about consuming foodstuffs made from GMOs. The views of the public, students, high school biology teachers, scientists and academics in general were very similar for many questions, as was the reasoning - basically you cannot educate people not to see any risks.

The situation may arise where a research establishment wishes to carry out some form of genetic manipulation that society does not approve of. If society is represented by government agencies, the most that can be done is for the scientific community to argue its case as best it can. It may or may not win. Quite clearly, though, if legislation is passed banning some form of genetic manipulation there is nothing more that can be done, except to continue to argue the case. It may be that there is nothing in law preventing some form of manipulation, but if groups within the community or the local council object to the research proposed or being conducted, good relations are desirable and ethical.


3.2 Ethics in Public Relations

Ethics is not a word frequently used by scientists or science managers. They use words such as: 'better from a long range point of view', 'sounder business policy', 'good public relations'. What they really mean is that 'it is more ethical' or more socially responsible. Good Public Relations is about social responsibility and communication - it's not about whitewashing.

In the minds of some, politicians, prostitutes, and public relations practitioners have similar reputations. Like the political process, public relations is normally an ethical and professional occupation when carried out by skilled and trained practitioners. However, like scientists, PR professionals don't have to have a licence to practise, and the profession seems to attract more than its fair share of charlatans.

The role of the Public Relations practitioner in science is to facilitate the communication process by helping scientists to explain their work to the public and by explaining public opinion to scientists and management. Thus, PR people are very much in the middle - probably the only professionals who are - journalists, sociologists, politicians etc are orientated in one direction. The important principles include:
1. PR deals with facts not fiction - problems must be encountered openly and honestly - the best PR is disclosure of an active social conscience.
2. PR serves the public not personal interest - it is unethical to serve the interests only of an individual.
3. PR practitioners must neither lie nor imply - i.e. they must maintain the integrity of the channels of communication.
4. PR cannot afford to be a guessing game - it must be based on sound research - scientific and public opinion research - guessing and intuition are not enough.
5. PR should alert and advise so people won't be surprised - explain problems before they become crises.

The functions of PR in science include publicity and promotion - paving the way for new ideas and products. They include internal motivation - the image of an institution is its staff. PR nurtures the institution's values, eliminating surprises, monitoring public and staff attitudes, and planning for crises. Other functions are identifying new opportunities - 'market intelligence' on issues which may affect the organisation; overcoming executive isolation - telling it like it is; and acting as a change agent - overcoming natural resistance to change.

A definition of public relations is that it helps people to reach decisions based on a mutual understanding of issues. The model is transactional: not manipulative (where the consumer is a victim), and not service (where the consumer is king). PR helps people to achieve their goals through effective communication. This brings us back to the first comment about management and their oblique references to ethics. There is an awareness by management that an institute has a responsibility to the community and to science. The PR practitioner has a responsibility to be a kind of corporate conscience on many occasions.

PR is not image making in the sense of creating a false front or cover-up. It is not ignoring the employees - they are the institutional image. It is also not about fabricating a good corporate image - rather, PR can undertake activities and advocate policies and statements which will make the institute worthy of its reputation.

PR is used by many groups. Charities, educational and health groups (Cancer Society, Cot Death etc.) now realise the key to success in public recognition and fund raising is good PR. Is science any different? We want people to accept science as culturally important, we want people to enjoy and appreciate science - and furthermore we want people to fund it! Promotion is only viable if there is merit in the cause. There have been some very effective PR events in New Zealand, such as red nose day (also seen in other countries), which promote causes or issues in effective ways. We need to ask ourselves whether modern public relations techniques will improve the image of science and scientists. The television programme Beyond 2000, and science fairs and science centres, have a very beneficial effect. Will other forms of communication? Do our scientists have good communication skills, and would the public have a better knowledge and understanding of science if our communication were more effective and ethical?

3.3 Animal experiments

In New Zealand, all research and teaching using animals (mainly higher animals) must be covered by an approval from an institutional Animal Ethics Committee before the work commences; if the work commences without approval those involved can be prosecuted before the courts. Animal Ethics Committees include nominees from the NZ Veterinary Association and an animal welfare organisation (e.g. RSPCA) as well as someone to represent the "public interest". Standards change over time as public acceptance of what is acceptable treatment of animals changes.

The benefits to other animals or humans resulting from any treatment must be weighed against the cost (stress, manipulation, pain, loss of life) to the animals used for research or teaching. Only the minimal number of animals needed to produce reliable results should be used, and experiments must be adequately designed to answer the question posed. Are there other ways, not using animals, of collecting the information required? Is the question being asked of sufficient importance to justify the use of animals?

The ethical questions involved are not simple. Society provides a complete range of opinions, ranging from those who believe the use of animals is acceptable to solve problems including disease and genetic defects in humans, to those who believe animals warrant the same standards we apply to people and should not be used for any research that may improve human welfare. The different factors relevant to the ethical balancing needed to decide whether a given animal experiment is ethical (Porter, 1992) include: the aim; realistic potential to achieve the objective; alternatives; species of animal; pain likely to be involved; duration of the discomfort or stress; duration of the experiment in terms of the animal's life-span; number of animals; and quality of animal care.

3.4 Relationship to environment

Introducing new organisms to the environment may involve risks. Likewise, we may never be certain of having complete control over the effects of introducing new gene sequences, and in many cases much further experimentation is required before we can ethically allow their full-scale use. Ignorance of the consequences requires caution in using new techniques, and this is the approach seen in the regulations governing the introduction of new organisms into the environment, and the basis of quarantine regulations.

In New Zealand the regulation of new organisms, including genetically modified organisms (GMOs), and of hazardous chemicals, is included under the pending legislation that establishes an Environmental Risk Management Authority. That statutory committee and system will replace the interim committee that has been operating under the auspices of the Ministry for the Environment. That committee has approved a number of trials, and has required public notification and response to public comments (Macer et al., 1991).

In the international perspective the regulations on GMOs are being relaxed, and some commercial varieties are approved for general growth in the USA and China. People are eating the products of varieties made by genetic engineering, and similar standards should be applied to all products, whether they were made by old (traditional) or new biotechnology. Perhaps the key ethical feature of these regulations is that we should be consistent in assessing the safety of products of science based on scientific knowledge, balancing alternatives, and respecting cultural values that may be associated with certain features of the products.

Human beings are the dominant species in the world, most directly when they exploit or use resources. Human beings are dependent upon this use, and we need to consider agriculture and aquaculture in particular. Nature includes agricultural land, cities, and wilder regions - all is nature. Scientists' first responsibility to the environment lies in the technologies that they develop that people will use in the environment. At a broader level, research themes may be chosen or not depending on the long-term consequences of future use, for example, dependence of agriculture on repeated application of chemicals versus biological control.

The precise outcome of interventions in nature or medicine is not always certain. This uncertainty can be called a risk of failure or a chance of success. It has taken major ecological disasters to convince people in industry or agriculture of the risks. There are, however, scientific methods that can be used to assess the safety of new organisms or chemicals, and agricultural practices. These should be used and refined.

In addition to the development of technology, scientists also have a responsibility to ensure the safety of their daily research experiments, as well as of any production facilities. The chemical accident at Bhopal involved a dangerous chemical intermediate, and there are many other examples. Dangerous intermediates are common in the chemical industry, and human error by factory operators is a factor that needs to be considered. People may misuse any technology. Another consideration is scientists' responsibility, as members of the public, not to pollute the environment with substances (chemicals, radioisotopes, viruses...) used in research.

3.5 Science policy choices

A number of issues are relevant to policy choices. The scientific community does not of necessity have all virtue; it may or may not have an understanding of moral issues. It needs to be educated. We have to ask who sets priorities; simply because scientists wish to undertake some particular work does not give them the moral base to do so. How would we respond if a sculptor wished to erect what to a majority of people were some totally objectionable statues in the main square of a town? Many people and groups would have a say, not just the sculptor. It is easy to appreciate in this instance that many different issues have to be considered, since the statues may have consequences well beyond the interests of the sculptor herself. The intrinsic artistic merit of the statues has to be taken into account, but it is undoubtedly not the only issue for consideration. For instance, social considerations may be totally opposed to artistic ones. It follows, then, that there has to be wide-ranging debate over the scientific directions to be followed, debate that will of necessity be multidisciplinary. This is not surprising, since it may be non-scientists who identify salient ethical issues that have been overlooked by the scientific community.

Much scientific work is expensive, and the money generally comes from society itself. Why should scientists expect to be funded to do what interests them, regardless of any other matters? Although UNESCO considers freedom of scientific thought to be a fundamental human right, it is a lesser right than some more basic rights. This is a very sensitive issue, and also uncovers some very important questions relating to basic and applied research. This is a matter with far more ethical overtones than we have frequently been prepared to admit.

Very often the scientific community needs help in seeing ways forward. Although technical knowledge is necessary for progress, others such as research grant-awarding committees, or company directors, now exert enormous pressure in determining what is and is not important. Government policy may also have a powerful role. This is not unreasonable; we should seek the right balance between scientific progress and community priorities.

Another issue is the international nature of science. One particular community may ban a certain type of scientific research, but the results of exactly the same research conducted in other countries flow internationally. All too often this is ignored in debates on the relationship between the community and scientists. To ban some research, but then benefit from the same research conducted elsewhere, is mere hypocrisy, and an unethical distribution of the risks of research when all will benefit. But is there any way to prevent it? International controls and standards on radiation, environmental protection, and possible future controls on human cloning and genetics, can act as minimum guides. These should be developed to avoid scientific tourism to countries lacking regulation.

A further issue is the extent to which the scientific community should censor itself. It should not do everything that is technically feasible; others outside the community may never catch up with it. If it does everything feasible, it is stating quite explicitly that scientists are nothing more than mere technicians. It is attempting to say that scientists do not have any of the moral values of other human beings; that they will do what can be done; that they will be controlled by the technological imperative; that they will stop only when others tell them to stop. The medical profession is suffering enough from exposure of unethical radiation experiments, among other practices, at present - and for good reason.

Scientists have moral values - they exercise these in every other aspect of their daily lives, and so to act differently within their profession is irresponsible. The end-result for the control of science and its activities would be bleak. There is a need for strategies and structures for response within a profession to assist; e.g., the New Zealand Medical Association issued new ethical rules in August 1992. There are a variety of different levels at which ethics can be established, including: law; codes, e.g., the International Ethical Guidelines for Biomedical Research Involving Human Subjects; regulation/agreement, e.g., the interim assessment group for the field testing or release of organisms; professional associations; and, not least, peer pressure.

Questions

Q1: What concerns do you have about biotechnology/science? What concerns can be answered by more science, and what concerns involve values that are non-scientific but may still be important to society?

Q2: Should science be regulated by the law? How much freedom for research is necessary to conduct science? How much inhibits useful science?

Q3: What are some of the issues you want to name as important to scientists working in New Zealand?

Q4: Identify issues that need to be addressed by any framework for decision-making. What are the values that it is important for science to affirm in any framework? What frameworks are already in place? What process should be used in establishing a more adequate framework? Who should decide?

Case Studies

1) A scientist at a small agricultural research company has found a novel gene that can make transgenic plants resistant to salinity. She wants to publish the results openly so that all researchers can use this technology soon, without any commercial constraints. However, the company is under financial pressure that may result in its closure, whereas if it patents the gene and sells the development rights to a big international company it will be able to continue research with a secure future. What should she do? Whose decision is it? Is the plant species important?

* Most participants considered that the small capital of the company meant agreements with large companies might be necessary, even if sometimes not desirable.

2) You are a scientist working in a research laboratory looking at the safety of a new agricultural chemical. Your standards of safety differ from those of the director of the laboratory. Although the data given to the government regulatory agency were sufficient for the government to decide the product is safe for general commercial sale and use by home gardeners and farmers, some data you have obtained since then cast a doubt in your mind on the safety of the product to the ecosystem. The laboratory director, also a scientist, considers that the data are insignificant to the case, but you do not. The production facility is already built and the product is ready for wholesale distribution. Should you inform the government of the extra data that you think should be reviewed?

* In most companies, if there is a doubt about safety the company will ask for independent review, or shelve the potential product for fear of liability claims.


4. Questionnaire responses

Following are some of the comments made in a questionnaire used to evaluate the workshops. They represent the scope of the issues raised, not the relative importance placed on them.

1. What ethical issues in science are most important to you?


*Honesty in competitive bidding.
*Science in community.
*Questions of ownership of information within the scientific community - plagiarism, multiple publication of the same data, order of authorship. Traditionally a scientist's finding belonged to him, and authorship was what he had to show at the end of the day. How does this fit into the commercial framework?
* Consumer acceptance of genetically-modified foodstuffs; Authorship wrangles; Opportunity for scientists to impact on the use to which 'their' intellectual property is put by their Institute.
* Animal ethics especially genetic engineering. Commercial interests.
*The paradoxical relationship between the perceived need to bow to commercialism and the need for unbiased and public evaluation i.e. the conflict of interests between what is good for society and what is good for personal and corporate interests.
*The need to openly discuss the benefits and risks of science and technology. The need for scientific honesty and integrity.
* Honesty and integrity in conduct of research, in extension of results, and in relationships with colleagues and clients.
* Rights to genetic resources, conflicts of interest.
*Industry and animal based ethical issues will have to be considered together in the future because of the commercialisation of our research community.
*Integrity/honesty; conflicts of interest/bias; ownership/authorship; direct harm to life v. potential benefit to people or animals (respect for life); responsibility (directly to whom/ultimately to whom).
*We need to be able to analyse the ethical basis of science, and to relate individual cases to basic principles.
* 'Opportunity cost' of government-funded science; significant investment in one area of science will reduce available funding for another area, which may be seen by others as of greater global importance. Dependence of informed ethical judgements on cultural background (the workshop seemed geared to proving the opposite).
* Attempts against animal rights, farmers' and indigenous rights.

2. What personal experiences of scientific misconduct are uppermost in your mind?


* Science in the community and responsibilities of scientists.
* Suspected conflict of interest or bias on the part of individuals or organisations.
* A colleague taking credit for my ideas and not acknowledging them, publication without co-authorship, and charging his work to my code.
* I have heard reports of "sharp dealing" by scientists/administrators to effectively cheat cooperating groups out of public good science money.
* I have no personal experiences of misconduct. I do know of questionable practices, such as the fellow scientist in Oxford whose work was published without problem while it supported orthodox theory. When his results began to challenge dogma, he suddenly found it hard to get his work past the Departmental review committee (which was chaired by authors of the dogma). Similarly there are scientists here who will insist their name appears on a paper where their involvement was minimal, but who will resist inclusion of technical staff...
* None although I have heard rumours of misconduct.
* Selective use of information to exaggerate achievements/usefulness of technology. Exaggerated claims as to potential achievements resulting from prospective funding. Unprofessional behaviour towards colleagues/subordinates in the area of science management/decision making.
*Personal experiences of ethical problems would be theft of research ideas, unnecessary or ill-founded duplication of research for cheap personal gain, and doubts about the integrity of some of the refereeing associated with FfRST bidding.

3. What values do you believe are basic to ethical science?

*Morals.
*I think the requirements can be summed up simply as honesty and integrity and the necessity to treat others and be treated fairly. With these criteria met there would be few ethical problems or dilemmas in science.
*Honesty / Respect for life and the environment. Loyalty to the funders (without compromising the first two values).
*Honesty, scrupulous integrity, curiosity.
* Science is essentially a search for the truth. All behaviour by scientists, either in their research or in their relationships with their colleagues and the wider community should reflect this. But the truth is not always as easy to arrive at as we believed in our idealistic undergraduate days. At the most immediate level we have the problem of probability, and the need to make judgements. As we move further from easily observed phenomena, truth becomes harder to define. Electron micrographs and electrophoretic gels move the scientist some steps away from his material, and we need to have faith that our techniques are giving us a fair representation of reality.
* I am a Christian and a scientist. I found the area of `fundamental principles' to be the least satisfactory part of the workshop. I would be interested to hear from professional ethicists how different religious and cultural backgrounds influence the nature of those 'fundamental principles'; I suspect even the four principles suggested would not find universal acceptance amongst scientists, let alone the world. Honesty, altruism, selflessness, humility, respect for life, respect for other's views, global sustainability, exercise of free will, responsibility for the weak/sick/poor, rejection of corruption, protection against abuse of power: these would all figure in my list of basic ethical principles; but I haven't thought too hard about that list!
* Morality, i.e. consideration of the potential impact of science. Honesty.
* A belief in the necessity to adhere to the scientific method in research, whether "public-good" or "commercial". Honesty and integrity in conduct of research, interpretation and extension of results, and in relationships with colleagues and clients.
* Honesty, transparency, understanding of the rights of others.

4. What ethical responsibilities do you believe scientists and their employing organisations have?

*In short, they are responsible and should be held accountable for everything they undertake. Scientists are held in high regard by the general public and it is everyone's responsibility to ensure this attitude does not change.
*To not cause harm knowingly or misrepresent results in the interests of personal profit. Potential benefits should outweigh potential harm, to the best of their knowledge. To respect life and quality of life when planning what research to do and how to go about it. To admit, and if possible remedy, harm resulting from research results, whether due to errors or unforeseen applications, etc...I think the public would have more faith in a profession which is seen as fallible, but with good intentions and willing to fix its mistakes, than in one that sets out to be godlike, but turns out to have clay feet.
* Scientists and their organisations have an ethical duty of service to their clients, whether public or private. They must deliver first class science at the appropriate level. "Strategic" and "commercial" science should not have different quality standards. Scientists have an ethical duty to their employer, to carry out their allotted tasks in a conscientious way, to protect the intellectual property of the company, and to obey legal and ethical directions. Scientists and their organisations have an ethical duty to promote and assist public debate in areas in which they have special knowledge. Scientists are people, and must finally be true to their own ethical beliefs. Should a scientist find that he cannot ethically comply with the policy and directives of his employing organisation, he should offer his resignation. This, however, is a lot to ask of scientists in the current social environment.
* I believe that scientists have a duty to perform within whichever constraints are stricter: those of Society or those of the scientific community. There will be times when Society couldn't care less about an issue but the informed scientific community is aware of serious principles; the reverse is also true, there will be times when scientists' familiarity with an issue allows them to become blase, whereas Society would definitely demand more cautious action.
* Consideration of the wider impacts of some research. Unbiased evaluation of new techniques, genotypes, etc., especially where there is commercial interest. Protection from commercial bias, e.g. some sort of committee to prevent commercial interests taking advantage of science, or to prevent funding being cut and/or publication being blocked in the event of commercially unfavourable results.
* To be truthful at all times. To be impartial.

5. The Royal Society is investigating the development of a code of ethics for New Zealand scientists. What broad principles or more specific guidelines of scientific conduct do you believe should be covered?

*It is necessary to do so. Some things have to be covered, for example: authorship and the order of authors on publications; principles for cooperation; science and the law; animal experiments; and so on.
*Requires formal membership and formal ethical structures for enforcing ethical disputes, and a funded base to do this.
*Both animal and human ethical problems should be included to some degree. Also the employer's responsibility in relation to the type of experimentation undertaken and the results from these studies.
*Conflicts of interest, which may be common with such a small pool from which referees and reviewers can be drawn. Ownership of information; the importance of life and quality of life for everything from fruit flies to human beings.
* It is important that broad ethical principles are defined before rules of specific etiquette for individual situations are formulated.
* I would like to see the development of a set of principles for use in NZ by scientists for assessing the ethicality of any particular action. The discussions involved in drawing up these principles would be valuable in themselves. I am more uncomfortable with movement beyond description to prescription; drawing up a set of guidelines for scientific conduct risks giving an unmerited air of legal infallibility to issues.
* Responsibility of reviewers (confidentiality). Guidelines for commercial contracts. Guidelines for authorship.
* Universal moral codes are applicable. Specifics include in particular the points mentioned above (plagiarism, predicting effects, respecting others' rights).


6. Did you discuss the workshop with your colleagues?


One respondent said no; all the others said "yes" or gave the following responses:
*Not yet, but it may influence discussion of research proposals
* Yes, at a group meeting. Suggested the issue of ethics was important to us as scientists; people were interested to know what we had discussed.
* Yes, but mostly amongst those that attended the workshop.
* Yes. It made me realise how much standards of behaviour have fallen in the last 8 or 9 years!

7. Do you think the workshop helped you to become a more ethical decision-maker?

*Yes, the workshop is important because we have ethical problems and we need to learn how to solve them.
*It has broadened my outlook on the definition of ethics.
*It certainly helped, although attendance was low and it would be useful if others gained the same benefits. Those whose conduct is less constrained by ethical considerations are those least likely to attend such a workshop. Disseminating the material more widely so that everyone is challenged by the issues raised would be a useful next step.
*I don't think it has affected the decisions I make, but it has helped me to look at the ethical component more analytically (rather than relying on feelings) and to explain the issues involved to others.
* No, although it did make me more aware of some of the ethical issues.
* Probably not, but I have had a very keen interest in ethics and science, particularly agricultural science, for quite some time, so am well aware of most of the issues raised in the workshop. Consequently, while it was good to hear others' views, I was probably in agreement on most of the issues.
* I suppose it helped in giving the feeling that you are not isolated in trying, and this is a very positive outcome.

8. Do you have any other comments?

*I was disappointed at how much we had to generalise in the workshop to cover the material. At the same time, I feel that public relations was over-emphasised and somewhat out of place with what I expected to be discussed in this workshop.
*Different people have different ethical principles. Sometimes I don't know how to judge who is right and who is wrong. I hope we establish a consultation system to help people who have questions and problems.
*By and large the workshop did not really get to grips with actual issues: it was a far more theoretical discussion of academic interest to those participants who were interested, with no real teeth to the outcome. One of the central issues we touched on, but never grappled with, is that in our new competitive funding environment there is an unresolved conflict between the scientific ethic of cooperation and freely sharing information and the ethic of the user-pays, competitive business model.
Secondly, scientific ethics is inextricably linked to the fundamental ethic of a culture: its world-view. We didn't discuss this, nor who should police ethical infringements in science: we are quite dissimilar from the medical profession in regard to the ethical structures in place. People with an individually high view of ethics and science produce good science; those that don't, cut corners.
Thirdly, scientists have not been notably different in voicing concern about ethical issues that do not directly concern them: the world population problem is the prime example. I enjoyed the workshop, found some things relatively novel, and it prompted me to think. But we have a long way to go.
* A few thoughts: Questions usually thought to relate to science ethics can be classified as follows: Internal Science Ethics, including sins against the Truth (e.g. falsifying data) and sins against the science community (e.g. plagiarism); and Social Responsibility in Science.
The application of science - what is discovered cannot be undiscovered. Do we do what we believe is best for society, or what society thinks is best? (An answer to this question is necessary before Darryl Macer's surveys of opinion can be shown to have any relevance to ethics.) When we balance benefits against risks, do we mean "real" (i.e. as perceived by us) or "perceived" (i.e. by the community) benefits and risks?
Politics of Science. How much of our desire to keep the public/decision makers informed is really altruistic, and how much is an attempt to increase public and private spending on science? Do we extol the values of science without any thought of securing our personal futures or expanding our organisation? To what extent is the science community a repository of collective wisdom, and to what extent is it just another pressure group? Is it easy to answer this from the inside?
* I really enjoyed Gareth Jones' approach: logical, clear and instructive. If I were running the workshop (which I am not qualified to do!) I would spend more time on his afternoon approach. Define some 'underlying principles'; take a case study, such as genetic modification of foodstuffs; and apply each principle, exhaustively, to see how it relates to the case in all its ramifications. We would practise identifying all the parties involved in the issue, even indirectly, and see the issue from each of their perspectives. We would identify how other ethical systems apart from our own framework might reach a different conclusion. And we would arrive at a cautious conclusion on what might be an ethical approach to dealing with the case study. I enjoyed Alan Kirton's presentation of ethical issues in NZ; he managed to stimulate some useful discussion, particularly about authorship.
* As a first attempt at what is a difficult area, you are to be congratulated on the workshop. However, I was concerned that some aspects of the perhaps more relevant areas, e.g. genetically modified organisms, biotechnology, agriculture and ecology, bioethics, etc., were treated hastily. In contrast, some topics such as perceptions of science, public relations, funding and peer review were discussed too much. The issue of animal experimentation and exploitation was glossed over, yet to many, society is changing its views quite radically, perhaps even akin to the movement to accord women greater rights over the last few centuries.

5. Conclusion

This paper does not attempt to define scientific ethics, but rather looks at the ethical issues that science raises. The first step towards discussing the ethical issues is to recognise them. After that we must examine what values are important to science; these include truth, creativity, inquisitiveness, power, love, justice, and consequences. A framework for ethical decision-making must include recognition of basic ethical ideals such as freedom of thought, justice to all, and the balancing of benefit and risk. Some universal ideals that apply to scientists are discussed in the first part of this book (Macer, 1994).

Ethical issues involve more than feelings; in many cases the balancing of expected consequences can be evaluated scientifically. Behaviour, or etiquette, has roots in ethics. Science is claimed by many to be a profession, with its own set of guidelines on professional integrity. Certain duties and standards apply in relations between scientists and other members of the community, and to the wider environment. This raises the question of accountability, and of what bodies should oversee the conduct of scientists. In the New Zealand case, the Royal Society is formulating guidelines, but should each country have an equivalent of the US Office of Research Integrity? Such an independent body may be needed in cases where institutional resolution is unsatisfactory to all involved.

Specific codes of conduct may be needed to apply these ethics to issues such as authorship of papers, funding, contracts, publishing, ownership of data, and availability and control of data. Guidelines already exist in many countries for issues that are particularly sensitive, such as animal experimentation, or that pose serious risk to humans and the environment. An independent committee, such as the US Office of Research Integrity, may be necessary to police extreme cases. However, the most effective and ideal way to ethical science is for every person to be conscious of the issues, their own responsibilities, and ways of balancing conflicts. We hope that these issues will be debated further by the participants and readers, and spread through the broader scientific community.


References
Chang, S.-Y. et al., "The origin of HIV-1 isolate HTLV-IIIB", Nature 363 (1993), 466-9.
Couchman, Paul K. & Fink-Jensen, Kenneth. Public Attitudes to Genetic Engineering in New Zealand, DSIR Crop Research Report 138. (Christchurch: DSIR, 1990).
Danish Medical Research Council, Scientific Dishonesty and Good Scientific Practice (1993).
Dingell, J.D. "Misconduct in medical research", NEJM 328 (1993), 1610-5, 1634-6.
Editorial, Endocrinology 124(1) (1989).
Jayaraman, K.S. & Mervis, J. "Leprosy researchers lament suppression of Indian paper", Nature 361 (1993), 673.
Macer D., Bezar H. & Gough J., Genetic Engineering in New Zealand: Science, Ethics and Public Policy, approx. 90 pp., ISBN 1-86931-076-4 (Christchurch: Centre for Resource Management, Paper No. 27, 1991).
Macer, D.R.J. Attitudes to Genetic Engineering: Japanese and International Comparisons (Christchurch: Eubios Ethics Institute 1992).
Macer, D.R.J. Bioethics for the People by the People (Christchurch: Eubios Ethics Institute 1994).
Maddox, J. "Conflicts of interest declared", Nature 360 (1992), 205.
MAF Policy Paper 112, Tentative Proposals for an Animal Welfare Bill (1991).
ORI, Science 261 (1993), 148; Science 262 (1993), 1202-3; Science 263 (1994), 20-2.
Porter, D.G. "Ethical scores for animal experiments", Nature 356 (1992), 101-2.
Rees, D.A. "Time for scientists to pay their dues", Nature 363 (1993), 203-4.
Rothenburg, L. "Scientific misconduct - not just someone else's problem", Trends in Biotechnology 12 (1994), 35-9.
Rothman, David Strangers at the Bedside (Basic Books 1991).
Schachman, H.K. "What is misconduct in science?", Science 261 (1993), 148-9, 183.
Segal, J.Z. "Strategies of influence in medical authorship", Social Science & Medicine 37 (1993), 521-30.
Swazey, J. et al., American Scientist 81 (1993), 542-52.
