Introduction - Uppsala Monitoring Centre


1850, the Lauder-Brunton auriscope in 1862, followed by the discovery of X-rays in 1895, expanded the diagnostic possibilities for physicians. The next year saw the Riva-Rocci mercury sphygmomanometer, the forerunner of our present machine, come onto the market. The new century started with the invention of the ECG in 1901 and the gastroscope in 1932 (Walker, 1990). Blood dyscrasias could, in theory, have been recognised after the development of the microscope by Leeuwenhoek in 1673, since he saw blood films in which the absence of white cells would have been noticeable; but it wasn't until 1888 that aplastic anaemia was recognised, and 1922 before agranulocytosis was discovered, so drug-induced dyscrasias are a fairly new phenomenon.

Most of the early 'doctors' were untrained; others had served apprenticeships of varying quality. Only a few physicians were university trained. In the UK, prior to the mid-19th century, physicians did not physically examine patients, though they did feel the pulse and examine the urine (Loudon, 1986). Only a few physicians or healers published books. The translation of symptoms and signs into their specific pathological causes evolved only as medical science advanced. It can be difficult to translate old diagnoses into modern terms: does 'cardiac depression', for example, correspond to cardiac failure? Individual variation in susceptibility to drugs was first mentioned in 49 BC, and understanding had not advanced much further by 1874. It wasn't until Pirquet recognised allergy in 1907 that real progress was made.

Only those with access to publications could learn from distant experts, so doing so required a certain level of education and wealth. It wasn't until 1862, with a thesis on drug-induced skin reactions, that the diagnosis of ADRs was discussed. Liver function tests were mentioned in 1916 in respect of salvarsan, but these were very limited, and it wasn't until the 1970s that transaminases were used. The fact that a particular advance had been published did not mean that its use immediately became widespread. Some years might pass before the medical elite accepted a new idea sufficiently to mention it in a standard textbook. It would take even longer before new advances were used routinely in general practice. In 1960 few general practices in the UK would have had an ECG machine.

4. Reporting

There would always have been reluctance for patients to report adverse events to their healer or physician, since there is an implication of blame. The side effects of a drug might have been accepted as showing that the drug was working: salivation with mercury was taken to show that the evil was being washed out, and the dose was therefore increased until the required amount of salivation occurred. People who accepted that human excrement, dried and powdered and then puffed into the eyes, was a reasonable treatment for conjunctivitis might not have been very discerning. Many of the writers' comments have concerned whether a drug was safe or could kill, rather than mentioning temporary inconveniences. When death was all about them, patients may have been more stoical and accepted
