Reading: Assessing Information Needs

 

This reading is based on, and at times uses the exact text of, Gustafson DH, Cats-Baril WL, Alemi F. Systems to Support Health Policy Analysis: Theory, Models, and Uses, Health Administration Press: Ann Arbor, Michigan, 1992.

Introduction

This section addresses how information systems could be tailored to serve strategic decisions.  Organizations face three types of decisions:

  • Operational decisions. These decisions are made by workers and their supervisors and are concerned with daily production.
  • Managerial decisions. These decisions are made by mid-level managers and are concerned with topics such as hiring and motivating employees.
  • Strategic decisions. These decisions are made by organizational leaders and are concerned with the mission and the re-organization of the firm. They differ from operational and managerial decisions: they tend to be more unstructured, involve more searching of the environment, and occur less frequently.

Strategic information systems help policymakers, executives, and planners decide on the organization's mission and strategies. Organizations make many different strategic decisions. Sometimes they make these decisions consciously, as when the firm engages in strategic planning. Other times decisions are made without awareness of their strategic importance, as when the environment forces the organization to act in certain ways. It is difficult to know which issues will be strategically important for the organization. One way to understand an organization's strategic issues is to look at the data needed in previous planning efforts. Unfortunately, history is often not a good guide to future strategic issues. Because information systems help in articulating strategic issues, an information system that focuses on the wrong issue can radically mislead the organization. How can we anticipate the information needs of organizational leaders? The answer to this question is the focus of this section.

Though most readers think of information systems as information inside computers, this need not be the case. In thinking about strategic information systems, we must also consider non-computerized sources of information, e.g., reports and commissioned studies, experts' advice, and informal communication networks. One should also think through the timing of information: should the analyst collect and analyze information in advance, hoping that it will be used, or wait for organizational leaders to articulate the need and then collect it? In addressing the question of timing, the following two tradeoffs should be considered:

  • Relevance versus timely availability. Data collected and analyzed after the need arises are more likely to be relevant to the decision-maker’s task, but because of delays in collection the data may not be available when needed.
  • Periodic versus continuous data gathering. Periodic collection - collecting data only when the need arises - allows analysts and decision-makers to define precisely what must be collected and to assemble the most appropriate data, but it may not provide a basis for comparing trends. Regular, continuous collection supports trend comparisons but often leads to data categories that are too narrow or too broad.

By its very nature, building a strategic information system requires us to think through not only what is needed but also when it is needed and how the information can be organized. The analyst must specify the information needs, the data collection strategies, and the analyses needed. At the same time, the temptation to collect data on every conceivable contingency must be tempered by the recognition that collection is expensive. In practically every case, the minimum necessary collection should be done.

What do you need? What do you want?

It is not an easy challenge to design a strategic information system that meets the information needs and expectations of organizational leaders. Various obstacles prevent simple determination of information requirements. Dubois et al. (1982) suggested three categories of difficulty in determining information requirements:

  • A well-defined set of requirements does not exist or is unstable.
  • The organizational leaders are unable to specify requirements.
  • The analysts are unable to elicit those requirements and/or evaluate them for correctness and completeness.

The problems that users have in specifying their own needs can be traced to general cognitive limitations shared by all human beings. For example, people, even experts, are not aware of how they make decisions and what information they use in those decisions. People think, but often do not know how they think; the process of thinking is for the most part unconscious. Thus, when you ask someone how they made a decision, they may list a number of pieces of information that they claim they paid attention to, when in reality they may have made up their mind based on a much smaller set of information. As a consequence of this cognitive limitation, clients often will not differentiate true information needs from wishes (Taggart and Tharp 1977). When asked what they need, they present a long wish list, including items they would not use even if the information were available.

Another reason why organizational leaders may not be able to articulate their information needs is that they may have forgotten occasions on which they needed more information. When clients are interviewed about their needs, they have to remember how key information was missing in past decisions. This is unpleasant. No one is in the habit of recalling his or her failures or episodes in which their needs went unmet. Unpleasant events are often forgotten, especially when you, yourself, were in charge. Interviewing organizational leaders about their organization's failures to supply critical information is akin to asking you when you made bad judgments: an unpleasant task that you would rather forget. For some people it is so unpleasant that they may unconsciously distort the facts so as not to remember the events. If managers and leaders forget their failures, if they prefer not to remember how they failed to gather necessary information, interviews will not be very useful. Only if the manager has a good memory for these instances will the analyst have a good basis for specifying the information requirements.

Various other severe and systematic cognitive biases also impair the ability to define information needs (IBM Corporation 1981). But it is not just cognitive limitations that create havoc in assessing information needs. Clients’ lack of expertise and knowledge is also a common obstacle to formulating and stating true needs (Ellis 1982; Hogarth 1981). Many decision makers do not realize how their own needs change over time and how these needs are affected by external events. Many are not aware of new technological possibilities and keep projecting future needs based on their expectations of current information systems. In short, many do not know what they want.

Finally, there is the problem of distinguishing between what clients want and what they need. Sometimes organizational leaders get what they want but not necessarily what is good for the organization. Information systems cannot serve the needs of individuals and ignore the needs of the organization. Sometimes individuals' cognitive styles prevent them from examining information presented in different formats; clearly, information systems should help individuals see beyond their own limitations. Sometimes differences in time horizons, personal career objectives, and internal organizational politics prevent some organizational leaders from seeing the changes emerging around them. Information systems cannot rubber-stamp the status quo. They need to engage leaders in thinking about the emerging future and assist them in making it a reality. Information systems are intended not only to serve clients but also to enhance and improve their decision making. By focusing solely on what clients want, information systems may do them a disservice.

This is not to suggest that analysts can identify users’ needs better than clients themselves. Information system analysts are often ignorant of the subject matter, and analysts who do not involve clients risk planning systems that are never used. Planning without engaging the client may lead to wasted effort.

We suggest that information systems should play both roles: give clients the information they are asking for but add to it information they may need but not have asked for. To do so, analysts and clients must collaborate to determine information needs. How could this be done?

Previous Approaches

Taggart and Tharp (1977), Yadav (1983), and others have suggested different approaches to improve our ability to understand information needs:

  • Analyze organizational tasks and see how information is used in these tasks (Hira and Mori 1982; Mintzberg 1975).
  • Ask decision-makers about their needs (Huysmans 1970; Ross and Schoman 1977).
  • Derive requirements from the existing information system (Valusek 1985).
  • Look at strategic goals and concerns (Checkland 1981).
  • Do input-process-output analysis (Lundeberg 1979).

Are some of these approaches similar to the ideas presented earlier? All of these approaches have appealing features, but each suffers from one or more of the following weaknesses:

  • A focus on past and current but not future information needs. This may be acceptable in a stable environment, but not in a dynamic one.
  • A focus on a single issue, task, or decision. The resulting information system could be useful in a highly repetitive operational setting but would be too narrow to deal with the diverse set of issues faced by most organizational leaders.
  • A focus on information as opposed to decisions. This approach may specify information that is not used in any specific decision.
  • A focus on the decision-makers’ personal goals, which may be inconsistent with the organizational goals.
  • A focus on observing users’ behavior without seeking their insight or helping them examine their expertise to assess needs more creatively. The more clients participate in the design of information systems, the more likely they are to see the project through to the end and use the resulting information system.

Until now, information requirement techniques have emphasized structuring needs assessments into formats compatible with computers rather than finding mechanisms to formulate needs from the users’ perspective (Gustafson et al. 1982). While a few user-oriented information requirement tools are described in the literature (King 1978; Mitroff 1974), the overwhelming majority of tools developed in the past decade concentrate on structuring and representing the needs expressed by users rather than on helping users formulate their needs more effectively.

Recommended Methodology

This section presents an alternative method of determining information requirements, which Gustafson, Cats-Baril, and Alemi (1992) developed to overcome the above weaknesses. The methodology has the following steps:

    1. Identify future issues. Invite a panel of internal leaders and external experts that are familiar with the organization as well as with the environment of the organization. Ask the panel to identify strategic issues that the organization will face within the next 2-5 years.
    2. Identify information needed for selected issues.  Ask the panel to specify the information needed for the most likely and most important issues.  Create a non-redundant list of the information needed across all issues examined.
    3. Rate the importance of the information for different issues.  Ask the panel to set priority for information needed to address different issues.
    4. Allocate resources to govern the collection and analysis of information on the basis of the new priorities. Identify information that needs to be collected routinely, information that needs to be computerized, and information that may be collected when the need arises in the future.

The following table shows who completes the various tasks.

Step | Objective | Performed by
1 | Identify issues | Panel of experts and organizational leaders
2 | List information needs of top issues | Panel of experts and organizational leaders
3 | Create a taxonomy | Systems analysis team
4 | Prioritize information items | Panel of organizational leaders
5 | Develop resource allocation plan for data collection and analysis strategies | Systems analysis team

There are many advantages to the proposed approach. 

  • By forcing clients to think about future issues, the planning effort reduces the danger of focusing on current issues and information needs.
  • This type of "issue driven" methodology also minimizes the potential for collecting data that may not be used in any strategic decision. Directly asking clients what information they need may lead to items that seem important but are not crucial for any major upcoming decisions.
  • By relying on a group rather than an individual, the methodology minimizes the cognitive and behavioral limitations of having a single person define information needs.
  • By adding external experts to the group, the methodology balances what people in the organization want with what people outside think they may need.

Each step is discussed below and illustrated by a case study showing how information needs were determined for a mental health commission charged with setting manpower policy.

Step 1. Identify future issues

The first step is to identify and prioritize issues likely to confront the decision-makers served by the information system. A number of mechanisms can be used for anticipating strategic issues. Techniques like the Delphi, Nominal Group or Integrative Group Process have been used to capture the opinions of groups of experts on the likelihood that issues will occur. Systems analyses (to identify weaknesses in system operation), literature reviews, and observing the environment can also be helpful. The identification process should result in a list of strategic issues along with priorities that reflect both the intrinsic importance (i.e., magnitude of the consequences) of the issue and the likelihood it will become salient and thus require a response.

In the case study reported here, 48 experts in mental health policy from several states participated in a three-month-long Delphi study to identify and set priorities on key issues likely to face the mental health field in the next five years. The experts rated the issues in terms of probability of occurrence and intrinsic importance. The importance and probability estimates were multiplied to yield an expected value score for each issue (a small scoring sketch follows the list below). Later, the 37 members of a commission responsible for setting mental health manpower policy in one state (the clients of the future information system) selected from that list six issues best reflecting the goals of their commission. For illustrative purposes, three of the six issues selected were the following:

  1. How to reimburse the growing number of paraprofessionals who treat mental health clients?
  2. What educational programs or policies should the state adopt to upgrade knowledge of mental illness among primary care physicians who deliver many services to rural, mentally ill clients?
  3. Many chronically mentally ill patients live in nursing homes that cannot provide adequate care. What policies should the state adopt to improve care for these patients?
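To make the expected-value scoring concrete, here is a minimal Python sketch of the calculation described above. All ratings, scales, and issue names are invented for illustration; they are not the study's actual data.

```python
# Illustrative sketch (not the original study's data): computing an expected-value
# score for candidate strategic issues from panelists' Delphi ratings.
# Each issue receives importance ratings and probability-of-occurrence estimates
# from the panel; the two averages are multiplied to yield the score.

from statistics import mean

# Hypothetical ratings: importance on a 1-10 scale, probability between 0 and 1.
ratings = {
    "Reimbursement of paraprofessionals":       {"importance": [8, 9, 7], "probability": [0.8, 0.7, 0.9]},
    "Training primary care physicians":         {"importance": [7, 6, 8], "probability": [0.6, 0.5, 0.7]},
    "Care of chronically ill in nursing homes": {"importance": [9, 9, 8], "probability": [0.9, 0.8, 0.8]},
}

def expected_value(issue: dict) -> float:
    """Expected value = mean importance x mean probability of occurrence."""
    return mean(issue["importance"]) * mean(issue["probability"])

# Rank issues so the panel (or the commission) can select the top ones.
for name, score in sorted(((n, expected_value(r)) for n, r in ratings.items()),
                          key=lambda pair: pair[1], reverse=True):
    print(f"{score:5.2f}  {name}")
```

A commission can then select the top-ranked issues, as the mental health commission did when it chose six issues from the experts' list.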

Step 2. Develop a taxonomy

When one examines the information needed across different issues, the same pieces of information show up time after time. For example, the cost of operations may be important in several different upcoming issues. In this step, the analysis team works with the panel to identify categories of information. The panel lists, for each issue, the information that will substantially reduce uncertainty and assist in decision making. The analysis team then categorizes these information items into a distinct, non-redundant list.

For example, in specifying the information requirements for the mental health commission, we conducted a second three-month-long Delphi study. We asked the panel to identify information items that would substantially reduce uncertainty about how to deal with key issues. After the panel came to a consensus about the needed information items, the analysis team eliminated redundancies and combined the responses into a single list. Many information items appeared in all of the issues examined; at the same time, each issue had one or more information items that were unique to its nature. The analysis team was able to consolidate the responses into a list of 69 generic information items (a consolidation sketch follows the table below). Here are two of the 69 items:

Item number | Item description
1 | The size and character of the patient population affected by the issue
5 | How expensive is it to provide services that fully meet clients’ needs?
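The consolidation work in this step is largely editorial, but the bookkeeping can be sketched in code. The following Python fragment, using hypothetical Delphi responses, shows one way an analysis team might merge issue-specific lists into a non-redundant set while recording how many issues cite each item.

```python
# A minimal sketch (assumed workflow, not the commission's actual procedure) of
# consolidating issue-specific information items into a single non-redundant
# taxonomy, keeping track of which issues cite each item.

from collections import defaultdict

# Hypothetical Delphi responses: issue -> information items named by the panel.
responses = {
    "Paraprofessional reimbursement": ["Size of patient population", "Cost of fully meeting client needs"],
    "Primary care training":          ["size of patient population ", "Existing certification systems"],
    "Nursing home care":              ["Cost of fully meeting client needs", "Size of patient population"],
}

def normalize(item: str) -> str:
    """Crude normalization; in practice the team reconciles wording by hand."""
    return " ".join(item.lower().split())

taxonomy = defaultdict(set)          # normalized item -> set of issues that need it
for issue, items in responses.items():
    for item in items:
        taxonomy[normalize(item)].add(issue)

for number, (item, issues) in enumerate(sorted(taxonomy.items()), start=1):
    print(f"{number}. {item}  (needed by {len(issues)} of {len(responses)} issues)")
```

In practice the merging relies on human judgment about which wordings describe the same underlying information; the code only illustrates the record keeping.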

Step 3. Rate the importance of information requirements

No information system can collect and analyze data to address every possible information requirement. Even if one could guarantee that the format in which the data were collected was appropriate to a particular strategic issue, the cost would be prohibitive. It is imperative to establish priorities on what information is to be collected and analyzed.

In this step, the organization's leaders rate the importance of each information item for specific strategic issues. Panelists may be asked a question such as the following:

For the issue "xxx," assign a score between 1 and 3 to each information item reflecting the need for the information. A score of 1 means the information has low priority, a score of 2 means that the information is nice to have but not essential and a score of 3 means that the item is essential for addressing the issue.

When finished, the information items rated highest across all issues are considered most responsive to the organization’s needs.

For example, each mental health commissioner was asked to choose the three issues on which they were most informed. Each commissioner then rated the relevance of all 69 information items for those issues. A priority score was calculated by averaging the scores given to each item by all panelists across all issues. For example, the item "How do clients differ from each other?" had an average score of 1.8 (out of a possible 3) across all issues and all raters.
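As a rough illustration of this calculation, the following Python sketch averages hypothetical 1-to-3 ratings across panelists and issues to produce a priority score for each item; the panelists, issues, and scores are all invented.

```python
# Illustrative sketch of the priority-score calculation described above: each
# panelist rates each item 1-3 for the issues they know best, and the priority
# score is the average of all ratings an item receives. Data are hypothetical.

from statistics import mean

# (panelist, issue, item, rating) tuples on the 1-3 scale.
ratings = [
    ("A", "Issue 1", "How clients differ from each other", 2),
    ("B", "Issue 1", "How clients differ from each other", 1),
    ("A", "Issue 2", "How clients differ from each other", 2),
    ("B", "Issue 3", "Cost of fully meeting client needs", 3),
    ("C", "Issue 2", "Cost of fully meeting client needs", 3),
]

def priority_scores(ratings):
    """Average every rating an item received, across panelists and issues."""
    by_item = {}
    for _, _, item, score in ratings:
        by_item.setdefault(item, []).append(score)
    return {item: mean(scores) for item, scores in by_item.items()}

for item, score in sorted(priority_scores(ratings).items(), key=lambda p: p[1], reverse=True):
    print(f"{score:.1f}  {item}")
```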

Step 4. Prepare a data collection / analysis plan

Listing the information needed does not complete the specification of information requirements. Further steps must be taken to specify how and when the information will be collected. A plan should be devised for data collection. Not all information items should be collected immediately, nor should all collected information be computerized. In specifying data collection plans, we believe it is important to avoid the following common pitfalls:

  • The all or nothing style of including or excluding information items, which omits intermediate options. It is important to realize that the importance of information may change over time and that data collection plans must reflect this changing nature of information.
  • The separation of data collection from analysis - as if the value of data is independent of your analysis. It is important to plan for both the data collection and the analysis of the data.
  • The assumption that only empirically observable events are data, and that experts’ opinions are irrelevant to policy decisions. We should plan the collection of both qualitative and quantitative, numerical and textual, and concrete and vague data.

In planning for data collection, it is useful to divide information items into four categories:

  1. Essential information. This set includes items important to all issues.
  2. Rapid collection items. This set includes items important to most issues. For these items one plans a capacity to collect the information rapidly if the need arises.
  3. Periodic collection items. This set includes items important on some issues.
  4. Low priority items. This set is ignored.

The decision on how to classify an information item is made by examining the average rating and the range (the difference between the lowest and highest ratings) of its ratings. A small range suggests consensus on the importance of the information item across different issues; a wide range suggests that the item is important for some issues but not for others. The following table shows how the ratings are used to classify the data collection effort:

 | Small range of ratings across issues | Wide range of ratings across issues
High average rating of importance | Essential information. Collect now. | Rapid data collection set. Plan data collection now.
Low average rating of importance | Low priority items. | Periodic data set. Collect when needed.
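The table's classification rule can be expressed as a short function. The sketch below is an illustration built on assumptions: the thresholds for "high average" and "wide range" are invented, since in practice they depend on the rating scale and on the resources available for data collection.

```python
# A sketch of the classification rule in the table above. The cutoffs for
# "high average" and "wide range" are assumptions for illustration; in practice
# they depend on the rating scale and on available resources.

from statistics import mean

HIGH_AVERAGE = 2.0     # assumed threshold on the 1-3 importance scale
WIDE_RANGE   = 1.0     # assumed threshold on the spread of per-issue averages

def classify(per_issue_averages: list[float]) -> str:
    """Map an item's per-issue average ratings to one of the four sets."""
    avg = mean(per_issue_averages)
    spread = max(per_issue_averages) - min(per_issue_averages)
    if avg >= HIGH_AVERAGE:
        return "essential (collect now)" if spread < WIDE_RANGE else "rapid collection (plan now)"
    return "periodic (collect when needed)" if spread >= WIDE_RANGE else "low priority (ignore)"

# Hypothetical items with their average rating on each of three issues.
items = {
    "Cost of fully meeting client needs": [2.8, 2.6, 2.7],
    "Options to the existing system":     [2.9, 1.4, 2.8],
    "Role of the family in care":         [1.2, 1.3, 1.1],
    "Who gets priority for care":         [1.1, 2.7, 1.0],
}

for name, scores in items.items():
    print(f"{classify(scores):32}  {name}")
```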

The essential information set. Items in this set have high average importance and a small range (denoting uniform importance across issues). These items should be collected and analyzed regularly so policymakers can access them immediately when needed. In this case study, the essential information set included 14 items, one of which was the following:

Item number | Item description
5 | How expensive is it to provide services that fully meet clients’ needs?

Routinely providing the essential information set to organizational leaders has interesting resource and information system implications. For example, some items require systems and decision analyses not typically found in computerized information systems. Item 5 above, for example, requires modeling and simulation, not simply reporting the data. Hence, the character of the information system needed here is likely to be quite different from typical systems that provide only data. The information system must collect, analyze, and report data items in the essential information set routinely; provisions should be made for analysis, and not just collection, of the data.
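As a hypothetical illustration of the kind of modeling item 5 calls for, the following sketch estimates the cost of fully meeting client needs by simulating an uncertain caseload, service intensity, and unit cost. All distributions and figures are invented; a real analysis would draw them from the organization's own data and experts.

```python
# Hypothetical sketch of why item 5 calls for modeling rather than mere data
# reporting: estimating the annual cost of fully meeting client needs under
# uncertainty. All distributions and figures below are invented for illustration.

import random

random.seed(42)

def simulate_annual_cost(n_draws: int = 10_000) -> list[float]:
    costs = []
    for _ in range(n_draws):
        clients = random.gauss(mu=5_000, sigma=500)            # uncertain caseload
        visits_per_client = random.uniform(6, 12)              # uncertain service intensity
        cost_per_visit = random.gauss(mu=150.0, sigma=20.0)    # uncertain unit cost
        costs.append(max(clients, 0) * visits_per_client * cost_per_visit)
    return costs

costs = sorted(simulate_annual_cost())
print(f"median annual cost: ${costs[len(costs) // 2]:,.0f}")
print(f"90th percentile:    ${costs[int(len(costs) * 0.9)]:,.0f}")
```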

The rapid collection and analysis set. Items in this set are, on average, only slightly less important than those in the essential information set, but they have a larger range, indicating that their relevance varies substantially across issues. These information items are likely to be needed on short notice for some, but not most, issues. Rather than collecting (and analyzing) data that may never be used, the best use of resources here may be to construct a data collection, modeling, and analysis system that can be quickly activated. The plan would identify who will collect the data; where, from whom, and within what period it will be collected; and what instruments will be used to collect it. The models and files for analyzing the data should be specified, and personnel qualified to implement the analysis should be identified. A budget should be set for completing the work, though no work should start. In the mental health example presented earlier, we selected this rapid collection data set:

Item number | Description of the item
1 | The size and character of the patient population
13 | The number, type, and distribution of services provided
42 | The organization of the existing system
45 | The monitoring and evaluation methods used with the system
47 | Options to the existing system
53 | The costs of implementing alternatives to the existing system
61 | Legislative and regulatory changes needed to alter the system

This set includes two types of information. Items 1, 13, 42, and 45 require a more elaborate description of the existing system. Items 47, 53, and 61 deal with alternatives to the existing delivery system. The organization should plan the data collection and analysis for these information items but should wait to see whether the need for the information arises, as sketched below.
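One way to keep such a plan ready for activation is to record its elements in a structured form. The following sketch assumes a simple record layout of my own devising; the fields mirror the questions listed above (who collects, from whom, within what period, with what instrument, how analyzed, and at what budget), and the sample values are hypothetical.

```python
# A sketch (assumed structure, not prescribed by the text) of what a
# "rapid collection" plan might record so it can be activated on short notice.

from dataclasses import dataclass

@dataclass
class RapidCollectionPlan:
    item_number: int
    description: str
    collector: str            # who will collect the data
    source: str               # where / from whom it will be collected
    days_to_complete: int     # within what period
    instrument: str           # survey, registry extract, expert panel, etc.
    analysis_model: str       # model or files used to analyze the data
    budget_usd: float         # funds reserved, though no work starts yet

plan = RapidCollectionPlan(
    item_number=47,
    description="Options to the existing system",
    collector="Contracted policy analyst (hypothetical)",
    source="Expert panel and comparable states",
    days_to_complete=30,
    instrument="Structured expert elicitation",
    analysis_model="Multi-attribute value model",
    budget_usd=15_000.0,
)
print(plan)
```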

The periodic information set. These information items are unimportant for most policy issues but vital to a few. They cannot justify regular data collection or analysis and might not even justify detailed planning. The best strategy may be to allocate resources for special studies (perhaps conducted by vendors). In the case study, the periodic information set consisted of the following items:

Item number | Description of the information item
7 | Who gets what priority for different types of care
26 | How different providers serve underserved areas
28 | Constraints specifically limiting substitutability of providers
29 | The existing provider certification and education systems
31 | Training that might be needed by families and the community
35 | Possible funding sources and methods of obtaining funds
56 | Legislative/regulatory constraints on the existing/optional systems
57 | How legislation defines responsibility for the client population
66 | How and whether to mobilize power groups
67 | How the community is affected by the care delivery system

 

Many of these items (28, 31, 35, 56, 66, and perhaps 7 and 29) seem relevant primarily if a change in the system is planned, so the nature of the information needed would depend on the options under consideration. The required information is not likely to be present in a typical information system. For this set, the information system should reserve resources to conduct quick-response studies that draw on the judgments of experts rather than on empirical data.

The low-priority set. Information items in this set have low average importance and a moderate to small range; they would only rarely be used in policy analyses, and even then would not typically be crucial to the analysis. We suggest that no planning or preparation be directed toward these items. The following items are examples of the low-priority set:

Item number | Description of the information item
4 | Episodes of illness: frequency and responsiveness to treatment
9 | The role of the family in caring for the client
46 | What knowledge is needed to correct the problem
58 | How existing laws and regulations duplicate each other
64 | The position, attitude, and agenda of affected interest groups

Items 4, 9, 46, and 58 require a level of detail that may not be needed in the selection and design of the issues considered by the panel, although they may be important during implementation. It surprised us that item 64 received such a uniformly low score, because interest groups typically have a great effect on strategic issues. Possibly the policymakers felt they already knew the positions of these groups and therefore needed no further information. Periodic re-examination of information requirements may suggest changes in the status of information items: information considered low priority may become higher priority as the issues or the environment change.

The boundaries between the sets should be drawn according to the availability of resources to collect and analyze the data, and by comparing the cost of collecting data in advance with the cost of collecting it on an as-needed basis (an illustrative comparison follows).
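The cost comparison can be made explicit. The following sketch, with invented figures, compares the expected cost of routine advance collection against the expected cost of waiting and commissioning a special study only when the need actually arises.

```python
# An illustrative calculation (assumptions, not from the source) of the tradeoff
# used to draw the set boundaries: collect an item in advance only if doing so
# costs less, in expectation, than waiting and collecting it when needed.

def collect_in_advance_is_cheaper(
    advance_cost: float,        # cost of routine collection and analysis
    on_demand_cost: float,      # cost of a special study done when needed
    p_needed: float,            # probability the item is needed for an issue
    delay_penalty: float,       # cost of not having the item in time
) -> bool:
    expected_advance = advance_cost
    expected_on_demand = p_needed * (on_demand_cost + delay_penalty)
    return expected_advance < expected_on_demand

# Example: a $5,000 routine collection vs. an $8,000 special study carrying a
# $10,000 delay penalty, for an item needed in 40% of issues.
print(collect_in_advance_is_cheaper(5_000, 8_000, 0.4, 10_000))  # True: 5,000 < 7,200
```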

Discussion

The methodology we have discussed provides a means with which to identify a core of information items that will support the analysis of a wide range of issues. It also provides a means of determining which items to collect and analyze and what priority to assign them. While some data should be routinely collected and analyzed, in other cases resources should be invested not in doing but in planning the collection and analysis. Finally, this analysis suggests that much of the information can come from the minds of experts rather than from observing and documenting events or facts.

The methodology requires that the analysis team identify and gain access to a group of experts who will generate and update a list of future issues or potential problems in the field. The selection of experts is crucial since their collective vision will drive the design of the information system. It is also important that the panel be available on a regular basis (the frequency depends on the degree of environmental turbulence) to determine which issues are emerging and whether they require additional items of information. It is our experience, however, that the generic taxonomy tends to be highly robust and that, while issues come and go, the taxonomy of types of information needed seldom changes.

The average importance and range of items, on the other hand, may change, so they should be reassessed periodically. For example, in this era of tight budgets, information items that deal with costs, resources, and funding are usually important. And because the boundaries of the four information sets depend on resources, changes in the resources for collecting and maintaining the information system will alter the boundaries. A loss of resources calls for a stricter definition of importance, so the low-priority set would grow at the expense of the other three.

Certain refinements could improve the accuracy of the ratings. For instance, it might make sense to use an importance rating scale with more benchmarks (e.g., 1 to 7 instead of 1 to 3) and to use the variance rather than the range to classify the items (see the brief comparison below). Also, a different subset of issues and a different set of panel members could be used to test the stability of the ratings.
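As a small illustration of why variance may be preferable to the range, the snippet below compares two hypothetical items rated on a 1-to-7 scale: both have the same range, but their variances differ, revealing which item's importance genuinely fluctuates across issues rather than being skewed by a single outlying rating.

```python
# Comparing range and variance as measures of dispersion for item ratings.
from statistics import pvariance

# Two hypothetical items rated on a 1-to-7 importance scale across five issues.
ratings_a = [2, 2, 2, 2, 7]   # uniform except for one outlying issue
ratings_b = [2, 7, 2, 7, 2]   # importance genuinely swings between issues

for name, r in [("A", ratings_a), ("B", ratings_b)]:
    print(f"item {name}: range = {max(r) - min(r)}, variance = {pvariance(r):.2f}")

# Both items have a range of 5, but item B's variance (6.00) exceeds item A's
# (4.00), so variance better separates genuine dispersion from a single outlier.
```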

The principal idea behind this methodology is to create in advance a flexible information system that includes a core of critical data, an ability to rapidly access sources of other data as new issues arise, and the resources to "hire out" special studies in rare cases.

References

  • Gustafson, D. H., W. L. Cats-Baril, and F. Alemi. 1992. Systems to Support Health Policy Analysis: Theory, Models, and Uses. Ann Arbor, MI: Health Administration Press.

  • Checkland, P. 1981. Systems Thinking, Systems Practice. New York: John Wiley.

  • Dubois, E., J. P. Finance, and A. Van Lamsweerde. 1982. "Towards a Deductive Approach to Information System Specification and Design." In Requirements Engineering Environments, edited by Y. Ohno. Tokyo: Ohmsha.

  • Ellis, H. C. 1982. "A Refined Model for Definition of System Requirements." Database Journal 12 (3): 2-9

  • Gustafson, D. H., and A. Thesen. 1981. "Are Traditional Information Systems Adequate for Policymakers?" Health Care Management Review (Winter): 51-63

  • Hira, H., and K. Mori. 1982. "Customer- Needs Analysis Procedures: CNAP." In Requirements Engineering Environments, edited by Y. Ohno. Tokyo: Ohmsha.

  • Hogarth, R. 1981. Judgement and Choice. New York: John Wiley.

  • Huysmans, J. H. B. M. 1970. "The Effectiveness of the Cognitive Constraint in Implementing Operations Research Proposals." Management Science 17 (1): 92-104.

  • IBM Corporation. 1981. Business Systems Planning: Information Systems Planning Guide, Application Manual, GE20-0527-3.

  • King, W. R. 1978. "Strategic Planning for Management Information Systems." MIS Quarterly 2 (1): 27-37.

  • Lundeberg, M. 1979. "An Approach for Involving Users in the Specifications of Information Systems." In Formal Models and Practical Tools for Information Systems Design, edited by H. J. Schneider. Amsterdam: North Holland.

  • Mason, R.O., and I. I. Mitroff. 1973. "A Program for Research on Management Information Systems." Management Science 19 (5): 475-87.

  • Mintzberg, H. 1975. "The Manager’s Job: Folklore and Fact." Harvard Business Review 53 (4): 49-61.

  • Mitroff, I., and T. Featheringham. 1974. "On Systematic Problem Solving and Error of the Third Kind." Behavioral Science 19 (3): 383-93.

  • Munro, M. C., and G. B. Davis. 1977. "Determining Management Information Needs: A Comparison of Methods." MIS Quarterly 1 (2): 55-67.

  • Rockart, J. F. 1979. "Critical Success Factors." Harvard Business Review 57 (2): 81-91.

  • Ross, D. T., and K. E. Schoman. 1977. "Structured Analysis for Requirements Definition." IEEE Transactions on Software Engineering SE-3 (1): 6-15.

  • Taggart, W.M., and M. O. Tharp. 1977. "A Survey of Information Requirements Analysis Techniques." Computing Surveys 9 (4): 273-90.

  • Valusek, J. R. 1985. "Information Requirements Determination: An Empirical Investigation of Obstacles Within an Individual." Unpublished doctoral dissertation, University of Wisconsin-Madison.

  • Wack, P. 1985. "Scenarios: Uncharted Waters Ahead." Harvard Business Review 63 (5): 72-89.

  • Yadav, S. B. 1983. "Determining an Organization’s Information Requirements: A State-of-the-Art Survey." Data Base 14 (3): 3-20.

