The Cattle Passports Story
There are two generic types of farming – arable and livestock. Arable is high-tech, capital-intensive and science-driven. Drive around East Anglia and see the arable industry at work: thoroughly modern, well-organized and efficient enterprises. That is not to say the subsidies are not worked as intensively and intelligently as the soil. The Growlink story, told in the Teleputer section of the Archive, shows a system aimed at bringing IT to arable rather than livestock farming. That was in the early 1980s.
Beef farming, where the bulk of cattle are to be found, can be divided into two parts. Premium breeds were, of necessity, closely monitored, science-based and well-documented, with good quality control even in the early 1990s. The much larger general beef production sector varied from the very well organized to the barely satisfactory. In the early 1990s the small hill farmer represented a significant part of the beef industry and tended to have little technology and restricted resources. Again, subsidies were an intrinsic part of the farming enterprise, but smaller farmers struggled with the paper bureaucracy that was the backbone of the subsidies world.
From a public policy point of view, livestock farming is very different to arable farming in one important respect: livestock are prone to diseases that can decimate herds and severely impact the food chain, sometimes with disastrous effect. In the early 1990s the livestock industry needed excellent information processing and communications systems. Sadly, such systems were either scarce or non-existent.
In 1986 a mysterious brain disease was found for the first time in a cow on a West Sussex farm not far from my own. The disease was named Bovine Spongiform Encephalopathy [BSE], soon known popularly as ‘mad cow disease.’
In 1990, there were 14,407 cases and the UK government declared that British beef was ‘safe.’ A politician allowed his young daughter to eat a hamburger in a photo-op. The government introduced new record-keeping requirements for beef farmers. In 1992, there were 37,280 cases and growing public concern. This was the peak of notified BSE cases. In 1995 there were only 14,562 cases but that year 3 people died from a new disease named ‘variant Creutzfeldt-Jakob Disease’ [vCJD], a fatal brain disease. No-one knew where this disease came from. In 1996, the link between BSE and vCJD was proven. BSE in beef had jumped species to become vCJD in people. Virtually everyone in the UK who had eaten beef was at risk of an incurable fatal disease. There was public panic. The European Union immediately banned UK beef exports. UK politicians were under intense pressure to ‘do something.’ In 1996, there were 8149 cases and 10 deaths.
The UK government response was to do two things. First, it ordered all at-risk cattle to be culled. Over 5 million cattle were destroyed; huge funeral pyres dotted the British countryside as the carcases were burnt. Second, the government decided to set up a system to trace all cattle from birth to death. In June 1997, the UK government announced that it was going to establish a ‘British Cattle Movement Service’ which would launch a ‘Cattle Tracing System’ in September 1998. The new Service would be part of the Ministry of Agriculture, Fisheries and Food [MAFF]. The new system would have a public deadline and would be under development during a period of unprecedented public and political hysteria. Careers were on the line. Fear was palpable. There were 4,393 cases and 10 more deaths.
The starting point for the new system was a blank piece of paper. MAFF did not have the capability to design, develop and implement a system of such magnitude in 15 months. Even after culling 5 million cattle, there would still be 6 million plus to trace.
There was no capability within the Civil Service to deliver this project. During the 1980s there had been an exodus of IT expertise from the Civil Service to the private sector.
MAFF and its consultants went to work. They needed a system that would uniquely identify each individual animal from birth to death and trace the movements of the animal throughout its life. They needed to be able to collect the data, organise it, put it into a database and process it as required. What was the main problem? The problem was collecting the data, otherwise known as data capture.
There were two main options for data collection: electronic or paper. Electronic would have meant putting micro-chips into individual animals [as with horses], using handheld readers to read the chips and record the variable data, which would then be downloaded to a PC and then on to the internet. A database would hold all the records for the individual animal. All the technology to do this was available in 1997. The other option was to build a paper system and then computerise it.
For reasons that have never been properly explained, the second option was chosen. Some said it was because time was of the essence. Others said it was because small hill farmers had no IT competence and no equipment. One view was that the government would have been forced to give IT equipment to over 20,000 farmers and that would have caused public outrage because the farmers were widely blamed for having caused the problem in the first place by feeding ground-up carcases to ruminants. Eventually the government papers will become available under Freedom of Information and we will have the answers.
In the paper system, each animal was given a unique number and a ‘passport.’ The passport recorded birth, address and parentage. The passport carried pre-paid tear-out postcards on which changes could be recorded. The cards had to be posted to BCMS within a week of the change. Instead of micro-chips there were two ear tags for each animal. Each tag was a card and it was placed in a plastic or metal ear tag holder attached to the animal’s ears. The unique animal number was printed on the tags.
By electronically processing the postcards and ear tags it was possible to record every aspect of the animal’s life and keep this information in a database. Every animal was therefore traceable. If the system worked the UK population would feel re-assured and the EU would lift the export ban. So the project went ahead at breakneck speed.
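The traceability scheme described above amounts to a simple data model: a passport record keyed by the unique animal number, with a movement history built up from the posted cards. A minimal sketch in Python, using hypothetical record and field names rather than the actual BCMS design:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the tracing records; all names are
# illustrative, not taken from the actual BCMS system.

@dataclass
class Movement:
    date: str          # date of the move, from a posted passport card
    from_holding: str  # holding (farm or market) the animal left
    to_holding: str    # holding the animal arrived at

@dataclass
class Passport:
    animal_id: str     # unique number printed on both ear tags
    birth_date: str
    birth_holding: str
    dam_id: str        # parentage, as recorded at birth
    movements: list[Movement] = field(default_factory=list)

    def trace(self) -> list[str]:
        """Reconstruct the animal's whereabouts from birth onwards."""
        holdings = [self.birth_holding]
        for m in self.movements:
            holdings.append(m.to_holding)
        return holdings
```

Tracing an animal is then just a walk along its movement records from the birth holding onwards.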
It soon became apparent that reading and recognising high volumes of cards and tags at speed would be challenging. MAFF approached the IT industry to solve the problem. The proposition was, “We want to read these forms. How do we do it?” ROCC was given the contract for the OCR, scanning and keying. The system used was a conceptual copy of the 1978 British Rail system re-engineered with new technology. It worked very well, with only the usual set-up problems of high-speed scanners, which were reading both farmers’ handprint and machine print. The system went live on schedule in September 1998 to great acclaim and to the great credit of the people who delivered it. The populace were re-assured. The EU lifted the ban. The consultants won awards. Politicians took the credit.
The problems became apparent within a few years. They fell into two categories. The first was data capture. The first law of computing is GIGO – garbage in equals garbage out. There was a 30% error rate on data entry. Data entry works on the principle: enter, verify, validate, repair and then process. The difference between verification and validation is that verification is concerned with ensuring the accuracy of variable data entered, while validation is concerned with ensuring that static information is accurate. The animal number is static information, and there are a number of validation techniques for it. The date of a movement is a variable, and there are a number of verification techniques for variable data. There are particular problems with handprint, but they can be resolved with good design. The bottom line was that farmers were making too many mistakes and were being allowed to make too many mistakes.

The second problem was anomalies. The way the paper system was designed meant that a movement of one animal through a market needed 3 separate notifications to BCMS: one by the seller, one by the market and one by the buyer. Each party had 7 days to mail a notification. If one was lost there was an anomaly that BCMS had to rectify.
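The checks described above can be illustrated in a short sketch. The check-digit scheme here (a simple weighted mod-11) and the record shapes are hypothetical, not the actual BCMS design:

```python
from collections import Counter

def validate_animal_id(animal_id: str) -> bool:
    """Validation of static data: a self-checking final digit in the
    animal number rejects most mis-keyed IDs immediately.
    (Illustrative weighted mod-11 scheme, not the real tag format.)"""
    digits = [int(c) for c in animal_id if c.isdigit()]
    if len(digits) < 2:
        return False
    *body, check = digits
    weighted = sum(d * w for d, w in zip(body, range(len(body), 0, -1)))
    return weighted % 11 == check

def verify_by_double_keying(first_pass: str, second_pass: str) -> bool:
    """Verification of variable data: a movement date has no self-check,
    so two independent keyings are compared instead."""
    return first_pass.strip() == second_pass.strip()

def find_anomalies(notifications: list[dict]) -> list[tuple]:
    """A movement through a market should generate three notifications
    (seller, market, buyer); flag any movement with fewer so it can be
    resolved manually."""
    counts = Counter((n["animal_id"], n["move_date"]) for n in notifications)
    return [key for key, c in counts.items() if c < 3]
```

In practice, validation rejects a mis-keyed animal number at entry, verification catches mis-keyed variable data by comparison, and the anomaly check flags any market movement missing one of its three notifications.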
Data capture is a rules-based activity. Follow the rules and the results are good. If the people capturing data do not follow the rules, there is no possibility of the system functioning satisfactorily. The entire cost of the Cattle Tracing System was borne by the taxpayer; it cost the farmer nothing. On the other hand, the farmer could not access information about his animals on the BCMS database. Some sticks and carrots might have improved co-operation, except among those who might be covering up nefarious practices. The farmers were a political problem and the politicians had to take responsibility, but they didn’t.
Before long the errors began to pile up and the anomalies multiplied. As a result 66% of total staff-time at BCMS was spent correcting errors and resolving anomalies. The House of Commons Public Accounts Committee in May 2004 roasted the Civil Servants for the high cost and low efficiency of the system. It was rather like telling the firemen who had saved the city from a conflagration that they had spent too much money.
Embarking on any project in an environment of mass hysteria is not recommended. Choosing old ways of doing things rather than risking a new technology approach was understandable in the circumstances. Many of the senior people involved were stressed out of their minds and worried about their responsibilities for the success of such a high-profile project. It was a huge achievement getting the new organization and system up and running in the time. It worked very well, but it wasn’t the most cost-effective solution.
On the face of it, there were two things that could have been done better. Firstly, the senior Civil Servants should have made it clear to the politicians that the system was an expensive, compromised quick fix, and even as such very challenging to deliver in the timescale. This was an interim system. They should have demanded a commitment to fund a re-engineered system within 3-4 years. We don’t have the records, so there is no way of knowing what the advice to the politicians actually said.

Secondly, the Civil Service uses consultants extensively and allows them to design systems; Civil Servants then feel they have someone to blame when things go wrong, as they frequently do. There needs to be a culture change. The Civil Service and the politicians need to learn 20:80 – for 20% of the cost one can usually achieve 80% of the benefits, and simple innovations usually deliver a few more. Consultants are a necessity if the in-house expertise doesn’t exist, but their work needs peer review – not by other consultants but by working practitioners, who are neither consultants nor members of the IT industry, with practical experience in the methods and technologies being proposed. A peer review of the Cattle Tracing System would likely have improved it without delaying it. The lack of adequate peer review in IT projects is a systemic problem across government, where the focus is generally on end-user review and participation. Ironically, across government there is huge experience of running complex and successful systems, but the expertise is rarely in the place most needed at the time it is needed, when new projects are being proposed.
For peer review, personnel need to be recruited for ‘ad hoc’ project-specific teams. They need to be independent, objective, forthright and very experienced. They would serve on short-term assignment, recruited through the learned societies and professional bodies. The taxpayer might get a better deal and the politicians might be spared the frequent embarrassments.
There is another systemic issue. Data capture across government is a huge operation with huge costs and huge implications. It is very hard to find any statistics on actual volumes and actual error rates. When ROCC was running the keying competitions in the 1980s, the best performers were keying at astonishing speeds with astonishing accuracy, generally north of 99%. Some government departments will already be getting these rates of accuracy but others will be struggling. The government should have a data capture accuracy standard which should be monitored and published by operating unit. This would shine a light on the issue and thereby encourage best practice.
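A published accuracy standard of the kind suggested here would be simple to compute. A sketch, assuming a hypothetical per-unit log of records keyed and records found in error, with an illustrative 99% target:

```python
# Illustrative target, in line with the best 1980s keying rates cited above.
ACCURACY_TARGET = 0.99

def accuracy_report(unit_stats: dict) -> dict:
    """Per operating unit, compute the keying accuracy rate and whether
    it meets the published target. unit_stats maps a unit name to a
    (records_keyed, records_in_error) pair."""
    report = {}
    for unit, (keyed, errors) in unit_stats.items():
        rate = (keyed - errors) / keyed if keyed else 0.0
        report[unit] = {
            "accuracy": round(rate, 4),
            "meets_target": rate >= ACCURACY_TARGET,
        }
    return report
```

Publishing such a table by operating unit is what would shine the light: units below target become visible at a glance.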
The final systemic issue is information storage. All governments store vast amounts of information about their citizens. It is a necessary practice. We live in a complex world. There are some who would require government to construct and maintain huge, monolithic databases to bring together information currently held in different places. On the other hand, there are many people who are concerned with government’s persistent problems with mislaying and losing information that is essentially private. There is also the Orwellian fear of ‘Big Brother.’
Farmers are intrinsically linked to government not just as citizens but also as enterprises involved with public health, disease control, subsidies, animal movement, animal welfare, environmental protection, health and safety etc. Farmers interact with a myriad of information processing systems, and these systems are constantly changing. Monolithic databases are not a solution. Properly defined inter-operability standards would allow different systems to work together, while individual systems could still be changed relatively easily. It is also perfectly feasible to have some standards in database design that would enable stored data to be easily matched, picking up the inevitable inconsistent presentations and typos. Capturing name and address information consistently and accurately is always a challenge: apart from typos, people can legally and accurately present their names in different formats. There is no reason standards could not be implemented across government, but it would probably take 10-15 years for them to be fully effective. There are rarely quick fixes to big problems.
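The matching standard suggested above can be sketched as a canonicalisation step applied before comparison, so that legal variants and common typos compare equal where possible. The normalisation rules and title list here are illustrative only:

```python
import re

def canonical(name: str) -> str:
    """Fold case, strip titles and punctuation, collapse whitespace.
    The title list is a small illustrative sample."""
    s = name.upper()
    s = re.sub(r"\b(MR|MRS|MS|DR)\b\.?", "", s)  # drop courtesy titles
    s = re.sub(r"[^A-Z0-9 ]", "", s)             # drop punctuation
    return " ".join(s.split())                   # collapse whitespace

def same_party(a: str, b: str) -> bool:
    """Compare two name presentations via their canonical forms."""
    return canonical(a) == canonical(b)
```

A real matching standard would go further, with abbreviation tables, phonetic keys and address parsing, but the principle is the same: compare canonical forms, not raw keystrokes.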
From 1995 to 2000, 80 people died from vCJD. Thereafter the rate dropped to very low levels. No-one knows the long term prognosis for the UK population.
This article is part of the Michael Aldrich Archive that has been donated to the Aldrich Library at the University of Brighton in the section titled ‘Data Capture 1977 - 2000’.
© Michael Aldrich 2011