
INFORMATION MARKET OBSERVATORY (IMO)


THE QUALITY OF ELECTRONIC INFORMATION PRODUCTS AND SERVICES


Luxembourg, September 1995
IMO Working Paper 95/4
The views expressed in this report are those of the IMO secretariat and do not commit the European Commission
Bât. J. Monnet,
Plateau du Kirchberg, 
L-2920 Luxembourg
- Office: JMO B4-020 
Telephone: exchange (+352) 43011,
direct line +352 4301 32889.
Fax: +352 4301 33190
Telex: COMEUR LU 3423. 
Telegraphic address: EURDOC LU 2752 

Contents

Highlights
INTRODUCTION AND BACKGROUND
THE USERS' PERSPECTIVE
MEASURING AND EVALUATING QUALITY
THE INFORMATION INDUSTRY'S APPROACH
THE FUTURE
CONCLUSIONS
RECOMMENDATIONS
REFERENCES
LIST OF ABBREVIATIONS

Highlights


This paper was drafted by the Policy Studies Institute on behalf of the IMO. Sources include interviews with quality experts, the European information industry press, and various online discussion groups, fora and news sources. In addition, material was taken from the Proceedings of a Commission Workshop held in June 1994 on "Specifying and Measuring the Quality of Information Products and Services", produced by Dr Norman Swindells.


INTRODUCTION AND BACKGROUND

Aim and focus

The aim of this paper is to provide an overview of quality issues in the provision of electronic information products and services, and how they are likely to affect the European information industry. The paper covers established electronic products and services, such as online and CD-ROM, and also the Internet. The paper outlines the history and background of the issue, the major concerns of users, types of quality, quality criteria and evaluation, the industry's approach to quality, and implications for the future.

Background and history

Quality is now an issue of increasing importance in the information community. A review of the library and information science literature shows growing interest in the subject: 393 articles on quality were published in 1994, almost double the number published four years earlier. In addition, there are an increasing number of seminars on the subject, and quality figures increasingly in conference and meeting programmes.

The two main concerns regarding quality in the information sector are the quality of information products, and standardised approaches to quality management in organisations. The interest in the quality of information products and services has been led by information professionals. This movement started in the late 1980s with various groups articulating their problems and requirements. This Working Paper outlines key developments in these activities, as well as the reasons behind the increasing importance of quality in information provision, the concept of quality management, and how it can be applied to information products and services by information providers. The paper does not deal with the application of quality management in libraries and internal information services within organisations; this is mentioned only in so far as it leads to increased demand by information professionals for quality in the electronic products and services they use.

The concept of quality

Definition of quality

Quality has been defined in ISO 8402 as "the totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs". Other definitions are "fitness for purpose" or "satisfying customer expectations". There can be no absolute measure of quality, and indeed the term in everyday usage is very subjective. In manufacturing industry, and increasingly elsewhere, it has an objective meaning: there is an understanding between supplier and customer as to what is to be supplied and how much it will cost. It is this latter meaning which is used throughout this paper.

The concept of quality management began in the manufacturing sector after World War II. It is often associated with Japan and indeed several of the early quality enthusiasts found a willing audience in Japan, though they were in many cases American. From that sector came the realisation that in order to achieve customer satisfaction with products (and now with services) action has to be taken on two fronts:

Some general approaches to quality have been developed:

Total Quality Management (TQM)

The aim of TQM is to focus all of the organisation's activities on meeting the expectations of customers and getting it "right first time every time", with continuous improvement given heavy emphasis. Errors and wastage of resources are prevented, and therefore costs are reduced. At the same time, customer expectations are met, so the competitiveness and profitability of the organisation should be improved.

TQM is currently a very fashionable management philosophy, though it is not an entirely new idea. The ideas behind it were first developed by various gurus, including Americans such as Deming, Crosby and Juran; there are also a number of Japanese exponents of the concept, including Ishikawa and Shigeo Shingo. TQM is a customer-centred approach to management.

Customers are not just external, i.e. those purchasing the end product or service; there are also internal customers, and everyone in the "quality chain" from the initial supplier through internal supplier/customer interfaces to the external customer has a part to play. To succeed, TQM requires commitment throughout the organisation from the chief executive downwards and from all employees. This must be backed by the provision of systems to produce quality products and services, including the setting of clear objectives, training and appropriate instructions, material and equipment. There should also be provision of techniques to measure current performance and allow for continuous improvement. This approach is long-term and involves a continuous process of review in response to the customer's needs. All this may well require a culture change in the organisation.

The quality of management

The ISO 9000 series of standards is concerned with the setting up of a quality management system, and with gaining recognition for this by certification. The ISO standards are equivalent to the European Norm (EN) 29000 series. The ISO standards for quality identify the characteristics of a quality management system, and include requirements for management responsibility and organisation, process quality control, and quality assurance techniques. The standards outline the necessary characteristics of management systems only; organisations have to produce their own detailed policies and procedures. The resulting systems should produce products and services that consistently meet their specifications, but the standards do not themselves provide those specifications.

Although they were developed for products, these standards are being increasingly applied to services. In fact, ISO 9004-2 is specifically for the service industries. Organisations can be certified as complying with the standards and compliance can be independently assessed, either by customers or independent third parties.

The quality of products

Despite these approaches, all human action is subject to error, and to expect products or services never to have defects is unreasonable. Manufacturing industry has learned to keep levels of defects within certain limits, and it is this management of errors to control the quality of products that the information industry can learn from.

The procedure for continually checking for conformance to a specification is called statistical process control (SPC). It has four component processes:

Statistical methods are then used to determine the optimum size of the sample which needs to be inspected.(1) In the information industry, such procedures could be used, say, for the inspection of database records for errors, and are already used by some companies for this purpose. However, statistical methods cannot be applied to all aspects of the quality of information products and services, such as measuring the coverage of a database.
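To illustrate how such a procedure might work in practice, the following sketch (in Python) samples database records and estimates their error rate; the sample-size formula is the standard one for estimating a proportion with a finite-population correction, and the records, field names and error check are invented for the example.

    import math
    import random

    def sample_size(population: int, z: float = 1.96,
                    margin: float = 0.05, p: float = 0.5) -> int:
        """Approximate sample size for estimating an error proportion,
        with a finite-population correction."""
        n0 = (z ** 2) * p * (1 - p) / margin ** 2
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    def inspect_batch(records, has_error):
        """Inspect a random sample of records and estimate the error rate."""
        n = min(sample_size(len(records)), len(records))
        sample = random.sample(records, n)
        errors = sum(1 for r in sample if has_error(r))
        return n, errors / n

    # Illustrative only: an empty title field stands in for whatever error
    # check a database producer would apply to its own records.
    records = [{"title": "Quality management"}, {"title": ""}] * 500
    n, rate = inspect_batch(records, lambda r: not r["title"])
    print(f"Inspected {n} records; estimated error rate {rate:.1%}")

The control aspect of SPC would then compare successive estimates against agreed limits and trigger corrective action when they are exceeded.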

Why quality of information is becoming increasingly important

Information intermediaries have always been concerned with providing good service to their clients; evidence for this is that such a commitment is written into the various codes of conduct drawn up by their professional associations. Quality management is now beginning to be applied in the information sector by commercial information providers, academic, public and special libraries, and information services. Guides have been issued by various bodies in the United Kingdom, such as Aslib and the Library Association.(2)(3) Library and information services in the public sector have become involved in quality issues through government initiatives such as the UK government's Citizen's Charter. The Nordic countries, especially Denmark, Norway and Finland, are active in quality management programmes. The German government started a campaign in 1992 to promote interest in quality assurance, and the Deutsche Gesellschaft für Dokumentation (DGD) held a conference in May 1993 on quality management. Public libraries in the Netherlands have also become involved in the issue. The aim of these initiatives is to increase the formal accountability and the performance of services in the public sector. The expectations of the recipients of these services have risen as a result.

Organisations are becoming increasingly aware of the value of information and its management for their competitiveness. This concept was taken up by the European Commission in its White Paper on Growth, Competitiveness and Employment(4), and in the Bangemann Report, Europe and the Global Information Society.(5) There is also increased competition in the information industry, and users have a wider choice of products and services. At the same time, information suppliers are becoming more aware of their legal responsibilities, especially since more are now providing fee-based services. The information they use as input is subject to copyright and data protection laws. They are also affected by legal liability for the services they provide. The areas of law that affect legal liability vary between jurisdictions, but can include contract law, tort and strict liability. As yet, information providers have not been involved in liability cases, but this is likely to change in the future. Even where a contractual relationship is not present, providers are obliged to carry out their work using reasonable skill and care.

A Directive on the liability for products was issued in 1985 based on the concept of strict liability. Strict liability would mean that suppliers are liable to pay for physical damage that results directly from the products they provide, even if the damage is not caused by negligence on the part of the suppliers.

Some information products, such as software or CD-ROMs, may be covered by the products liability Directive. In October 1990, the Commission also proposed a Directive on the liability of suppliers of services, which introduced a system of fault-based liability. The Edinburgh European Council asked the Commission to review this proposal in the light of subsidiarity. Moreover, negative criticism was received from the Economic and Social Committee, the European Parliament and professional circles. The Commission therefore decided to withdraw the proposal and start a consultation procedure with a view to arriving at new proposals. One way to reduce the chance of a liability suit is to adhere to good professional practice. This includes being aware of the range and content of the sources used and using the most appropriate, reliable and timely sources. It is also essential to retrieve all relevant information. This has implications for the information industry: intermediaries will only use sources they feel they can rely on, and will demand better quality services.

The rapid growth of the Internet, and its implications for the information industry, are discussed in another Working Paper.(8) The Internet Society estimated that the number of Internet hosts worldwide was almost 5 million at the beginning of 1995. The number of Internet users is impossible to calculate, but has been estimated at 20 million or more. Use of the Internet has so far mostly been confined to the academic community. As is well known, the Internet offers a mixture of bulletin boards and databases covering an enormous range of information from all over the world. The quality of this information is extremely variable, and the problem is compounded by constant additions to the range of sources and by the deletion of existing sites. The Internet also offers opportunities for commercial information providers to reach a huge audience world-wide. Its anarchic nature, however, raises quality problems of its own that could inhibit this potential.

THE USERS' PERSPECTIVE

Types of quality problems

Users of electronic information services can experience quality problems at various levels. Firstly, there is the database itself which is compiled and indexed by the database producer. The second level is the system level. The same basic database can be made available in different forms. It can be loaded by different hosts, or be available in different formats, such as online or CD-ROM. Service providers are responsible for updating their version of databases, and for the system's search and retrieval features. A third level of electronic information services is administrative, and includes documentation, charging and billing procedures, and help facilities for customers. This level is also the responsibility of the service provider. The final level of the service is that of access, which is provided by telecommunications operators in the case of online services and the vendor in the case of CD-ROM products.

Database level

The most obvious quality issue for users of information services is the content of databases. Data problems include typographical errors, misspellings, inaccuracies and indexing errors. When spelling errors occur in important fields, or citations are inaccurate, retrieval can be significantly affected: relevant items are lost in the system, or the sources cited are difficult or impossible to locate. Inaccuracies in the text or numeric data itself are especially serious in financial, legal or medical information, where important decisions are being made on the basis of the information retrieved; the absence of a decimal point or the insertion of an extra digit can be crucial. Items in the database may be incorrectly or inconsistently indexed, which again may mean that not all relevant material is retrieved. As mentioned above, the use of statistical process control (SPC) techniques could help reduce these errors.

Searchers also often find that databases do not have the coverage and scope that they expect. The database may not cover the subject matter adequately, or key sources may be missing. There may be gaps in the coverage of particular journals, or journal issues may not be included cover-to-cover.

Timeliness is often an important issue for searchers. Databases vary in the frequency of their updates and in the currency of the contents of each update. Some sources are more up to date than others, both within the same database and between similar databases.

System level

The structure of databases can also present problems for searchers. There is often a lack of consistency in structure and indexing of information between files and even within files as new policies are introduced but not applied retrospectively.

System features also affect retrieval. Features that searchers want, such as proximity searching, left-hand truncation, displayable thesauri, etc., are often not available in less sophisticated services.

Different systems have their own search languages which have to be learned. This is a real problem for inexperienced end-users, and also for professional searchers handling infrequently used databases.

Searchers also experience problems with output facilities. Often output is badly or unsuitably formatted, and frequently relevance ranking is not possible. Printing and downloading can also cause problems if the formats available are inappropriate for client needs.

Administrative level

Lack of, or poor-quality, documentation, training and customer support is seen as a problem by users of all types of electronic information.

The pricing of electronic information products and services also concerns users. Online pricing mechanisms are often complex. Where pricing is by output, unusable or poor quality output caused by the system itself is frustrating and costly. CD-ROM purchasers often have to return superseded disks.

Access level

Purchasers of CD-ROMs and of online services frequently have problems getting them up and running. In addition, telecommunications can pose problems for online searchers. For many years, EUROLUG ran a survey of problems experienced by users when logging on to online systems; a worryingly high percentage of attempted online connections failed due to telecommunications failures of one sort or another. Telecommunication faults can lead to corruption of data or sudden loss of access. When searchers are being charged for connect time, slowness or sudden disconnection causes difficulties. Difficulties of access and lack of user-friendliness are seen as quality problems by users of online services.

The consequences of quality problems for users

The CIQM carried out a survey of database users and their problems, and reported some of its findings in a press release issued in February 1995. The average database user, carrying out 10 to 15 searches a week, can expect to encounter two or three quality problems per week. In 70% of cases, searchers have to sift out irrelevant or unwanted data. In 46% of cases, the time spent on searching is affected as a result, and searches may even have to be repeated. In over 30% of cases, correct items are retrieved but formatting problems make them unusable. If these findings accurately reflect reality, they are worrying.

It is important to emphasise that although users experience problems at all levels of electronic information services, they typically have one point of contact with the service - the service provider. Hosts typically decline responsibility for the contents of the databases they offer, saying that it is the responsibility of the database producers, and that customers should complain to them. Whilst this approach is realistic from the hosts' point of view, it causes some frustration to users who deal with the hosts, not the database producers. At the same time, database producers tend to use waiver clauses whereby they decline all responsibility for any losses incurred by users because of errors and omissions in their databases. Although probably unenforceable, such clauses continue to be employed, and cause users further frustration.

The Internet

The Internet presents its own problems. The main one is too much information, much of it redundant or inaccurate. There is no centralised control on the Internet, and editorial policies, such as refereeing, are not yet well established; it is the individual information providers themselves who decide what information is made available. (Compare this with online or CD-ROM, where the online host or CD-ROM publisher acts as a quality filter, deciding which databases to sell.)

Navigational tools have been developed for the Internet. These include Veronica searching of Gopher sites, WAIS (Wide Area Information Server) searching, and the World Wide Web (WWW). The WWW is a hypermedia-based system which can be accessed via user-friendly graphical browsers such as Mosaic, Netscape and MacWeb. There are directories of Web sites, including those developed by Yahoo and CERN (Centre for European Nuclear Research), and so-called webcrawlers which retrieve information specified in search queries. However, there is much duplication between sites, and sites and resources can appear, move or disappear very quickly. Web sites contain information ranging from the highly significant through to the trivial and the obscene, and because there are no quality controls or any guide to quality, it is difficult for searchers to take information retrieved from the Internet at face value. The Internet will not become a serious tool for professional searchers until these quality issues are resolved.

MEASURING AND EVALUATING QUALITY

Criteria for evaluating quality

"Measuring the quality of databases" was the subject of the Southern California Online User Group (SCOUG) Annual Retreat in 1990. This brainstorming session resulted in a checklist of criteria that could be used as a framework to develop a quantitative method of evaluating database performance. The outcome of the Retreat was reported by Basch in Database Searcher.(6)

The SCOUG criteria for what constitutes quality in databases fall into 10 broad categories: consistency; coverage/scope; timeliness; accuracy/error rate; accessibility/ease of use; integration; output; documentation; customer support and training; and value-to-cost ratio.
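As an illustration only, the sketch below shows one way the SCOUG categories could be turned into a single quantitative figure; SCOUG defined the categories themselves, so the weighting and the 0-10 scores used here are assumptions made for the example.

    # The ten SCOUG categories, as listed above.
    SCOUG_CATEGORIES = [
        "consistency", "coverage/scope", "timeliness", "accuracy/error rate",
        "accessibility/ease of use", "integration", "output", "documentation",
        "customer support and training", "value-to-cost ratio",
    ]

    def weighted_score(scores: dict, weights: dict) -> float:
        """Combine per-category scores (0-10) into one weighted figure."""
        total_weight = sum(weights.get(c, 1.0) for c in SCOUG_CATEGORIES)
        return sum(scores.get(c, 0.0) * weights.get(c, 1.0)
                   for c in SCOUG_CATEGORIES) / total_weight

    # Example: a business searcher might weight timeliness more heavily.
    scores = {c: 7.0 for c in SCOUG_CATEGORIES}
    scores["timeliness"] = 4.0
    print(round(weighted_score(scores, {"timeliness": 2.0}), 2))

Different user groups would of course choose different weights, which is precisely the point made below about criteria varying between searchers.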

A working group for the evaluation of Finnish databases was set up by the Finnish Society for Information Services. The group undertook a database evaluation project that began in 1989. The project focused on the point of view of the professional user. The project's aims were to define database quality, create criteria to measure quality, and carry out an evaluation of some Finnish databases. The project received funding from TINFO, a section of the Finnish Ministry of Education.

The Finnish database quality project also produced a set of criteria for quality assessment. These fall into six broad areas: connecting to the system and communications; search language and other technical aspects of the work; effectiveness of the search programme; content quality; practical aids to information retrieval; and costs.

There are many examples of other criteria for evaluating databases in the literature, and virtually all of them mirror the quality issues discussed above. It is important to emphasise that different criteria matter to different searchers, and even the same searcher will have different criteria in mind at different times, depending on the nature of the search. For example, academic end-users, and intermediaries searching on their behalf, often want comprehensive searches of a subject area; in business, this is less important. The importance of timeliness also varies. Intermediaries can often filter out irrelevant material and have developed sophisticated searching skills; end-users often have not, and need more "user-friendly" systems and guidance.

The professional and trade literature contains reviews of databases and services, but these are not standardised. It is possible to evaluate print sources before purchase by browsing. This cannot usually be done with electronic sources, especially with online services. Benchmarking is a technique which is used to evaluate products and services. This involves comparison between similar services using various criteria. Typical examples include assessing the time taken by the system to undertake an identical search strategy on two different databases, or the date that a particular record entered two competing databases. Benchmarking can be used by database producers to assess their own quality, or can be used by prospective purchasers wishing to evaluate two or more potential databases being considered for purchase. At present, there are no agreed benchmarking tests analogous to the well-established ones used to assess personal computers.
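A minimal sketch of such a timing benchmark is given below; the two search functions are hypothetical stand-ins for client code that submits the identical strategy to two competing services, and the sleep calls merely simulate response times.

    import time

    def benchmark(search_fn, query: str, runs: int = 3) -> float:
        """Average elapsed time (seconds) for running the same search strategy."""
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            search_fn(query)
            timings.append(time.perf_counter() - start)
        return sum(timings) / len(timings)

    # Hypothetical stand-ins for real search clients.
    def search_host_a(query): time.sleep(0.05)
    def search_host_b(query): time.sleep(0.08)

    for name, fn in [("Host A", search_host_a), ("Host B", search_host_b)]:
        print(name, round(benchmark(fn, "quality AND databases"), 3), "s")

Other benchmarks, such as the date on which a given record entered each database, would be recorded rather than timed, but the principle of applying an identical test to competing services is the same.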

Little work has been done on developing means of measuring and evaluating quality on the Internet. At the moment, it is up to the individuals putting up information to vet themselves, and up to intermediaries or users to get to know the "good" sites. Quality can currently only be measured through reputation; sites with bad reputations will be little used. The Coombs Computing Unit at the Australian National University is carrying out some work on quality issues on the Internet, although this work mainly relates to the design of WWW pages rather than their actual content. Coombs has set up a page of "quality truisms". Some of these apply to information content, and include advice on providing explanations of purpose, content and charging systems, on avoiding duplication, and on not putting up trivial or erroneous information. Warnings to information providers about copyright violation and advice on setting up mechanisms for user feedback are also provided.

Recent key developments

Eusidic

In 1989, Eusidic (The European Association of Information Services) commissioned research into the liability issues affecting the information community. The resulting report covered Europe and the United States and highlighted the fact that the information community could be affected by this issue and be held responsible for its products and services.(7) Eusidic issued a five-part guideline on the updating of electronic databases in 1994. Guideline 1 says that information suppliers should have declared policies on data coverage and modifications and corrections. That policy should be available to everyone using the data. Versions of the data should refer to the policy when accessed. The other four guidelines refer to procedures for updating databases, and methods of communicating these changes to those using the data.

National Federation of Abstracting and Information Services

In response to the SCOUG checklist, a survey was sent in 1991 to 50 database producer members of the National Federation of Abstracting and Information Services (NFAIS). The survey contained questions on current practices related to quality assurance. The SCOUG criteria were used as a basis for the questions asked in this survey of policies and opinions of the database producers.

Association des documentalistes et bibliothécaires spécialisés

In the same year the ADBS (Association des documentalistes et bibliothécaires spécialisés) in France carried out a study on the under-use of data banks. This led to the compilation of a checklist to be used by database producers before starting out on a project. The document was approved by the ADBS and by another professional association, the Groupement français pour l'industrie de l'information (GFII).

The Library Association and UKOLUG

There has also been much activity in the United Kingdom. The Library Association (LA) and the UK Online User Group (UKOLUG) have been leading the way in the area of database quality. In 1989, the LA set up a Working Party on database quality. In 1991, the LA and UKOLUG held a workshop on database evaluation. The outcome was the setting up of a Database Quality Forum, and then the LA/UKOLUG Task Force on Information Quality. The Task Force issued a statement on the benefits of quality assurance to information suppliers. The UKOLUG Annual Lectures at the International Online Information Meetings (IOLIM) of 1991, 1992 and 1993 had database quality as their theme, and the UKOLUG Annual Conference in 1992 included a section on quality assurance.

Centre for Information Quality Management

The Centre for Information Quality Management (CIQM) was set up with the support of the LA and UKOLUG in 1993. The Centre acts as a clearing-house for quality problems with commercial databases. Users from within and outside the UK are encouraged to report specific problems experienced; details are forwarded to the information provider, host or publisher concerned, and responses are routed back to the user. Statistics are collected which are to be fed back into the information industry. The Centre has received funding from the British Library Research and Development Department and from some information industry players, and is working towards the development of a methodology for measuring and evaluating the quality of databases, using the SCOUG criteria in its work. It is working on a method of labelling, analogous to that used in the food and drug industries, which would allow potential purchasers to evaluate databases in advance. Existing publicity material and fact sheets provided by vendors are seen as being designed to sell the product or service; labels would have to be factually correct and would include such information as the actual number of items included in the database, subject and geographical coverage, selection criteria, and updating policies. Work is at an early stage, and the CIQM recently held a meeting with industry players to discuss the concept. The CIQM also aims to produce database metrics, such as accuracy, consistency and indexing quotients, as a way of evaluating quality. NFAIS has been carrying out work on benchmarking, and is to collaborate with the CIQM, who will use the results of this work.
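Purely by way of illustration, the sketch below shows the kind of factual information such a database label might record; no labelling format has been agreed, and the field names and values here are assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class DatabaseLabel:
        """A hypothetical factual label for a database, in the spirit of
        food and drug labelling."""
        name: str
        record_count: int
        subject_coverage: list = field(default_factory=list)
        geographic_coverage: list = field(default_factory=list)
        selection_criteria: str = ""
        update_frequency: str = ""

    label = DatabaseLabel(
        name="Example bibliographic file",
        record_count=1_250_000,
        subject_coverage=["materials science"],
        geographic_coverage=["Europe", "North America"],
        selection_criteria="peer-reviewed journals, covered cover to cover",
        update_frequency="weekly",
    )
    print(label)

Because every field is a checkable statement of fact, a label of this kind could be verified independently, unlike promotional fact sheets.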

EQUIP

The European Quality in Information Programme (EQUIP) originated in a Eusidic Spring Workshop that was held in Rotterdam in April 1992. The programme commenced in June 1993. The objective of the programme is to examine and promote the application of Quality Management to the information sector. The project partners are Eusidic itself, GAVEL, a consortium of European information consultants, and the European Online User Group (EUROLUG).

The project "Quality Management in the Information Sector" complements other work being carried out by EQUIP; its aim is to establish means of measuring customer satisfaction with electronic databases. (The opening session of the International Online Information Meeting in 1993 was also on the subject of quality.) The project emphasises the importance of the information chain, and that quality can only be improved with effective communication at all the interfaces. This involves formulating a vocabulary of performance to enable a common understanding between all parties in the information chain.

EUROLUG carried out a questionnaire survey to elicit users' responses to the SCOUG quality criteria; the results were reported at the 1993 IOLIM conference. A report of a project involving a set of case studies of organisations that have applied QM principles, undertaken to elicit their perceptions of total quality, was published in 1994 by Eusidic.(9) It was found that an objective form of measurement is needed. GAVEL is working with an evaluation model developed by Zeithaml called SERVQUAL, together with a hierarchical framework of criteria derived from the SCOUG and Finnish work and discussed with two database producers and one industrial research information service. The aim is to evaluate whether SERVQUAL can be applied to information products and services. The model involves the analysis of "gaps" between expectations and perceptions at the various interfaces in the information chain. The project concentrates on the gap between management perceptions of customer expectations and actual customer expectations, and the gap between customer expectations and customer perceptions of the service received. Two questionnaires were compiled: one was sent to the database managers of a database producer, asking about their perceptions of their customers' views; the other was sent to the customers of another database producer, asking about their perceptions of a theoretically "excellent" database and of an actual database. The results of the project were seen as encouraging, but a need for further work was identified.
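The calculation at the heart of this gap analysis is simple: for each criterion, the gap is the customer's perception of the service received minus the customer's expectation, with the most negative gaps marking the largest shortfalls. The sketch below illustrates it with invented criteria and figures.

    def gap_scores(expectations: dict, perceptions: dict) -> dict:
        """SERVQUAL-style gap: perception minus expectation per criterion."""
        return {c: perceptions[c] - expectations[c] for c in expectations}

    # Illustrative ratings on a 1-7 scale; the criteria are examples only.
    expectations = {"timeliness": 6.5, "accuracy": 6.8, "documentation": 5.9}
    perceptions  = {"timeliness": 5.1, "accuracy": 6.2, "documentation": 4.0}

    for criterion, gap in sorted(gap_scores(expectations, perceptions).items(),
                                 key=lambda kv: kv[1]):
        print(f"{criterion}: {gap:+.1f}")

The same arithmetic applies to the other gap studied, between management perceptions of customer expectations and the expectations customers actually report.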

Fédération internationale de l'information et documentation

In October 1992, the Fédération internationale de l'information et documentation (FID) decided to set up a Special Interest Group on Quality Issues. The group was launched in March 1993 at a seminar in London, with delegates from 14 countries present, each giving an account of the situation in their own country.

DG XIII

DG XIII of the Commission of the European Communities funded work on the European Materials Databanks Demonstrator Programme. A set of tests was developed in order to evaluate whether a materials database service could deliver what its producers said it could. In March 1994, a report was prepared for DG XIII/E on the application of quality assurance to information products and services. This report outlined the concept of quality and a way of achieving quality assurance. It included an outline of specifications for information services and products. In June 1994, the Commission held a workshop as part of the IMPACT 2 Programme entitled: Specifying and Measuring the Quality of Information Products and Services. The workshop was an awareness-raising exercise, and the results of studies carried out for the Commission were presented. The workshop provided a discussion forum for some of the major players in the European information industry. At the workshop, proposals for further actions by the Commission and the industry were formulated. The Commission plans further activities under its proposed INFO2000 programme.

Info-filter

The Info-filter Project has recently been set up by librarians to provide timely and accurate reviews of Internet resources. Project participants have identified objective criteria for Internet resource review, and create and revise reviews. These reviews are available on the WWW.

THE INFORMATION INDUSTRY'S APPROACH

Attitudes to quality issues

The trade literature seems to indicate that there is general acknowledgement by information providers that quality in electronic information services is an important issue. Some information providers, for example those involved in providing business information, see quality as essential for survival. The surveys carried out by NFAIS and GAVEL have tentatively concluded that information providers do understand customer requirements and agree with their criteria. In practice, however, the opinion of a particular information provider often differs from that of its customers: providers may consider their databases to be of high quality when their customers clearly disagree. Discussions at meetings of providers and users, such as those at various International Online Information Meetings in London, have left users feeling that hosts and database producers will not take responsibility for the databases they offer, and that they sometimes do not understand the needs of their customers.

Quality from the industry's perspective

The search engines of many of today's online systems were developed in the 1970s and are not capable of many of the sophisticated features desired by searchers. The costs of revamping such obsolescent search software can be very high. Nonetheless, increasingly the major hosts are moving from a centralised mainframe approach to a client/server approach. CD-ROM publishers employ more up to date and user-friendly search software, but their products are relatively infrequently updated compared to online services. However, both types of information provider share one view: they see interfaces between the user and the system as a source of competitive edge and have no incentive to standardise them. It is therefore probably pointless for users to place totally standardised search software high on their list of wishes for improved quality.

Levels of technological sophistication of the operations of the various links in the information chain differ. Information is not always available in electronic form; therefore it has to be rekeyed or scanned rather than being input directly into databases. Both rekeying and scanning can introduce errors. Different publishers and database producers have different policies on how data fields are tagged. Even if new standardised policies are introduced for a database, it is likely to be an extremely costly exercise to apply these to the entire backfile of the database.

There is also variation in the sources used by database producers. For example, there is no central register of companies in Germany, and therefore it is hard to track down German company information. In the United Kingdom, small companies are not obliged to file the same level of detailed information as larger companies, or as publicly quoted companies. Adoption of Freedom of Information policies varies across Europe and is not universal. In addition, national governments have varied in their commitment to public/private sector co-operation. As a result of these two factors, the amount of government information available to commercial information providers varies greatly amongst member states. This is symptomatic of the more general problem; information providers do not always have control of their sources in the same way that industrial companies do with their raw materials.

Even though databases may be updated frequently, the content of the updates may not be timely. There are various factors affecting the time it takes for sources to get from the publisher into the database. Database producers may prioritise material, so that some sources are more up-to-date than others. It may take more time for foreign material to get into databases. Different financial year-ends may mean that information on one company is more up-to-date than another.

There are various legal aspects involved in providing electronic products and services. These include license agreements, copyright, and royalties. Various agreements have to be negotiated between all the parties in the information chain. The situation is complicated by differences in national laws, and will be further complicated as multimedia products become more widespread. License agreements between database producers and hosts or CD-ROM publishers invariably include clauses that state that the host or CD-ROM company accepts no liability for the content of the databases. The complexity of the legal issues involved in electronic information provision can cause delays in the addition of material to databases, or the development of new products or services.

The various factors mentioned above have implications for the level of quality that database producers and vendors can achieve.

Actions taken

There is already much collaboration between the information industry and its customers. A number of database producers, for example, have active user groups. NFAIS in the USA, as well as initiating benchmarking actions, has agreed to work with the CIQM on an informal basis. Elsevier Science, the producer of EMBASE, will also be working with the CIQM and has pledged financial support during 1995. The FID Special Interest Group hopes to pull together work which has been done in this area.

The information industry is carrying out many improvements in response to its customers. INSPEC have carried out a complete overhaul of their entire database, and now produce occasional correction tapes which contain replacement records for items with known errors. Derwent Information employs an Online Quality Manager, and has carried out a major clean-up of its patents database; the new version was loaded onto STN at the end of 1993. Datastream and Infocheck offer money to customers for errors they report. Investex is to offer a money-back guarantee if users of its MarkIntel market research service are dissatisfied with the reports they retrieve; reaction to this has been mixed, both among users and other vendors. NewsNet introduced its FIXIT command, which allows users to send details of problems electronically. Some services will point out errors to users when they log in to the system.

Although discussions between the industry and users, such as those at the Online meetings in London and at meetings organised by the UK-based City Information Group, have been productive, they would seem to suggest that the industry is largely unaware of, or unwilling to implement, ISO 9000 and TQM. However, some companies, such as those involved in the EQUIP case studies, have introduced these approaches to quality or intend to do so. The Physical Property Data Service has been awarded ISO 9001 quality certification, and has introduced a Quality Assurance System for all stages in the compilation of its data bank.

The electronic information industry has not, in the past, devoted much effort to training staff to be quality conscious. One encouraging development is the launch, with funding under the IMPACT programme, of the TRAIN-ISS course. The course leads to a qualification in Electronic Information Management, and is primarily aimed at students from the Less Favoured Regions who wish to enter the information industry. It is hoped that guidelines can be created to enable similar Europe-wide courses to be established. The course covers all aspects of the electronic information industry, and includes quality issues in its curriculum.

To sum up the information industry's view of quality: it recognises that 100% perfect databases are impossible to achieve, and the key question is how to achieve a reasonable level of quality without excessive expense. Many companies have introduced computerised validation of entries, and some are starting to use expert systems for automated indexing or validation, and SPC methods for reducing errors. Each database producer will assess its own priorities for accuracy, timeliness and so on in the light of the competition and of user demands, and will make its own judgement regarding the level of investment needed to achieve these priorities.
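As an illustration of what computerised validation of entries might look like, the sketch below applies simple field-level rules to records before they are loaded; the fields and rules are assumptions, since each producer would define checks appropriate to its own data.

    import re

    # Illustrative validation rules keyed by field name.
    RULES = {
        "year": lambda v: bool(re.fullmatch(r"(19|20)\d{2}", v)),
        "issn": lambda v: bool(re.fullmatch(r"\d{4}-\d{3}[\dXx]", v)),
        "title": lambda v: len(v.strip()) > 0,
    }

    def validate(record: dict) -> list:
        """Return the (field, value) pairs that fail their rule."""
        return [(f, record.get(f, "")) for f, rule in RULES.items()
                if not rule(record.get(f, ""))]

    print(validate({"year": "1995", "issn": "0024-2527", "title": "Quality issues"}))
    print(validate({"year": "19955", "issn": "00242527", "title": " "}))

Checks of this kind catch format errors cheaply at input time; errors of content, such as a plausible but wrong figure, still require the sampling and inspection methods discussed earlier.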

THE FUTURE

Users who are seeking certification under ISO 9000 or are introducing TQM are obliged to investigate their suppliers' quality standards. Users are sometimes able to go elsewhere if they are not happy with a service, and the reputation of the offending company may suffer as a result of informal word of mouth comments. A general lack of trust among customers is not good for the industry and could inhibit its growth. The potentially huge end-user market is an opportunity for commercial information providers, but in order to achieve this mass market they must listen to what potential customers want, and provide the sort of service they expect.

A common complaint from the industry about ensuring quality is its cost. A paper given at the EC-sponsored workshop on quality by Herget claims that this notion is untrue and that, in fact, lack of quality incurs additional costs.(10) Quality involves fulfilling customer expectations. It costs more to win a new customer than to retain an existing one, and satisfied existing customers attract new ones. In the long-run, it is claimed, the costs incurred by taking preventative measures are less than those for repairing defects. On the basis of two case studies, some empirical evidence of cost savings is given.

The liability issues are pertinent to information providers. The validity of the total exclusion clauses used by most electronic information providers in their contracts with customers is doubtful. In other words, such exclusion clauses would be unlikely to help a data provider who was sued for damage caused by inaccurate information if it was shown that the data provider had been reckless or had inadequate quality controls in place. If strict liability were to come into effect in the future, such exclusion clauses would be clearly completely invalid. Therefore, it is in the interests of information providers to develop systems to ensure the reasonable accuracy, completeness and currency of the information they provide.

There is a clear implication, therefore, of a need for the industry to establish a much closer communication with users in order to establish their priorities, and for a set of guidelines on the main features of quality to be agreed amongst major players in the information industry. There is also a need for users to co-operate more amongst themselves to establish guidelines for assessing quality.

There has been much discussion in the professional literature of so-called "intelligent agents" or "knowbots". These are software tools, combining information retrieval algorithms with expert systems, that actively seek out information of relevance to users and make quality judgements on that information before presenting it. The first such intelligent agents are already available commercially; examples include Desktop Data's Newsedge and Individual Inc's First! product. These services scan incoming news wires and other databases for items of relevance to their patrons. In the future, knowbots will not simply wait passively for updated data feeds, but will pro-actively seek out relevant data from online databases and Internet sources and deliver it to their patrons. They will learn from their users' relevance judgements and assess the validity of information before it is presented. In that way, many of the quality judgements will already have been taken by the knowbot, and the user will be less troubled by quality issues. However, this will not reduce the quality problems themselves: errors are by nature unpredictable and random, and a rule-based system such as a knowbot cannot eliminate them, only filter some of them out. In addition, truly intelligent knowbots are many years off, and in the interim database producers and online hosts need to address these issues for themselves.
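The filtering idea behind such agents can be illustrated in a few lines: score incoming items against a term profile and adjust the profile from the user's relevance judgements. The sketch below is a toy model only, and is not a description of how Newsedge, First! or any other product actually works.

    from collections import Counter

    class SimpleAgent:
        """A toy relevance filter that learns from user feedback."""

        def __init__(self, profile: dict):
            self.profile = Counter(profile)          # term -> weight

        def score(self, text: str) -> float:
            """Sum the profile weights of the words in an incoming item."""
            return sum(self.profile[w] for w in text.lower().split())

        def feedback(self, text: str, relevant: bool):
            """Strengthen or weaken the terms of an item the user has judged."""
            for w in set(text.lower().split()):
                self.profile[w] += 1 if relevant else -1

    agent = SimpleAgent({"quality": 2, "database": 2})
    item = "New benchmark compares database quality across hosts"
    print(agent.score(item))             # initial relevance estimate
    agent.feedback(item, relevant=True)  # user marks the item as relevant
    print(agent.score(item))             # related items now score higher

As the paragraph above notes, a filter of this kind can only rank and select; it has no way of detecting that an individual record is itself erroneous.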

As awareness of quality issues amongst information professionals grows, their demands for high quality databases and their reluctance to accept all-embracing exclusion clauses will increase. There is a clear need for the industry to respond to these requirements; this also represents a genuine market opportunity. If the European information industry can establish itself as the market leader in high quality electronic information, it can develop world-wide in a manner similar to the Japanese car manufacturers.

CONCLUSIONS

There is some indication that users feel that quality is improving in the information industry, but there is enough evidence of dissatisfaction and misunderstanding between the information industry and its customers to warrant increased communication between them.

If information providers feel that customer expectations are too high, then they need to explain to customers the problems they face. Indeed, as stated earlier, management of customer expectations is one of the key ingredients of a quality strategy.

Users have to communicate their requirements and concerns to the industry. Collaboration is needed to develop specifications for commercial services that can be measured and evaluated, and that are acceptable both to information providers and to their various customer groups.

More empirical evidence is needed of the financial benefits of applying quality procedures in the information industry.

There also needs to be collaboration within the information industry to increase the level of harmonisation of the input into information products and services and to ensure quality at all the interfaces in the information chain.

If the Internet is to play a significant role in the provision of commercial information services, improved navigation tools and widely agreed methods of evaluating the quality of Internet sites are essential.

The issue of quality is a challenge for the information industry which represents both a threat and a market opportunity. Failure to address some of the current concerns of the marketplace could lead to a decline in market position for even well established players. On the other hand, establishing high quality standards could potentially open up major market segments outside Europe. In short, quality could be an issue that makes a significant contribution to the growth and development of the European electronic information industry.

RECOMMENDATIONS

Application of standards for the European information industry

The EC has a role to play in quality issues within the European information sector. It can act as a catalyst, and has already done so through awareness-raising exercises such as its quality workshop. A priority under INFO2000 will be the encouragement of the use of standards. European and world-wide standards already exist for the management of quality in organisations. At the EC's workshop on quality issues, it was suggested that CEN should be involved in the formulation of standards for the European information industry. One possibility would be the development of a widely understood system of service levels similar to that used to define the quality of restaurants and hotels. Whilst it is clearly much easier to define standards for a hotel than for a database, labelling standards for databases are by no means impossible to achieve.

Increased cooperation between industry players and users

There is work being carried out on specifications for information products and services, but this is uncoordinated and at an early stage. Further work is needed on the key criteria for evaluating quality. The industry could work with bodies such as the Centre for Information Quality Management to investigate further the feasibility of a database labelling system. Pan-European professional and industry groups could cooperate more. The European Information Industry Association (EIIA) represents the European information industry, and Eusidic bridges both users and the industry. The EC has already funded some projects and could also act to bring these groups together.

Further harmonisation of laws affecting the European information community

The EC has a further role to play by continuing its programme of harmonisation of laws which affect the information community. Work has been carried out on copyright and data protection. Further action is needed to harmonise the reporting and availability of company information, and on a Directive for liability for service provision.

Better navigational tools for the Internet and role of information professionals

More research could be carried out on the development of better navigational tools for the Internet. Information professionals are already taking a role in filtering information resources on the Internet; this role could be further developed.

Positioning of market players with regard to quality

Quality is an issue that is not going to go away, and indeed, its importance is increasing. Industry players could position themselves more explicitly in the market with regard to quality and use it as a marketing tool. The willingness of customers to pay more for increased quality of information in certain areas could be investigated. Although users in the main recognise that the odd error might creep into a database, they expect that the services they pay for should be largely error-free and would probably be unwilling to pay extra for a totally error free service. They may, however, be willing to pay more for timeliness, for example.

REFERENCES

1. Swindells, N. Managing the quality of information products. Managing Information, Oct. 1995 (in press).
2. Ellis, D. and Norton, R. Implementing BS 5750; ISO 9000 in libraries. Aslib, 1993.
3. Quality and libraries. Library Association, 1994.
4. European Commission. Growth, competitiveness, employment: the challenges and ways forward into the 21st century. White Paper. Luxembourg: Office for Official Publications of the European Communities, 1994.
5. Europe and the global information society: recommendations to the European Council. Brussels, 26 May 1994.
6. Basch, R. "Measuring the quality of data". Report of the Fourth Annual SCOUG Retreat. Database Searcher, 6(8), 1990, pp. 18-23.
7. Denis, S. et al. Liability in the provision of information services. Eusidic Research Project 1989. Eusidic, 1990.
8. The Internet and the European information industry. IMO Working Paper 94/3. Luxembourg, September 1994.
9. Lester, D.E. The impact of quality management on the information sector: a study of case histories. Eusidic, 1994.
10. Herget, J. The cost of (non)-quality - why it matters for information providers, in, Proceedings of the workshop: Specifying and measuring the quality of information products and services. Luxembourg, 1994.

NOTE:                                                                              

The present document and all the previous IMO reports and working papers are now available in HTML format on the I'M EUROPE World Wide Web server (URL: http://www.echo.lu/). The documents are located on the IMPACT home page under the IMO section.

Enquiries should be e-mailed to webmaster@echo.lu.


LIST OF ABBREVIATIONS

ADBS Association des documentalistes et bibliothécaires spécialisés
CDROM Compact Disc Read Only Memory.
CEN Comité européen de normalisation.
CERN Centre for European Nuclear Research
CIQM Centre for Information Quality Management
DG XIII Directorate General for Telecommunications, Information Industries and Innovation
DGD Deutsche Gesellschaft für Dokumentation
EIIA European Information Industry Association
EQUIP European Quality in Information Programme
EUROLUG European Online User Group
Eusidic The European Association of Information Services
FID Fédération internationale de l'information et documentation
GFII Groupement français pour l'industrie de l'information
IMPACT Information Market Policy ACTion Project
INSPEC Information Service for the Physics & Engineering Communities
ISO International Standards Organization
LA Library Association
NFAIS National Federation of Abstracting and Information Services
SCOUG Southern California Online User Group
STN Scientific and Technical Network
TQM Total Quality Management
UKOLUG UK Online User Group
WAIS Wide Area Information Server
WWW World Wide Web


