

The Anti-Virus Strategy System


Sarah Gordon
[email protected]

© 1995 Virus Bulletin. This document may not be reproduced in whole or in part, stored on any electronic information system, or otherwise be made available without prior express written consent of Virus Bulletin.

Abstract

Anti-virus protection is, or should be, an integral part of any Information Systems operation, be it personal or professional. However, our observations show that the design of the actual anti-virus system, as well as its implementation and maintenance, can range from haphazard and sketchy to almost totally nonfunctional.

While systems theory in the sociological disciplines has come under much attack, it has much to offer the management of integrating technological applications into daily operations. We will examine the 'anti-virus' strategy (Policy, Procedure, Software [selection, implementation, maintenance]), focusing on areas where the 'system' can fail. We will address this interaction from a business, rather than a personal computing, point of view.

The Anti-Virus Strategy System will examine anti-virus strategies from a Holistic General Systems Theory perspective. By this, we mean that we will concern ourselves with the individual parts of the system, their functionality, and their interaction. We will draw from various IT models specifically designed to provide a holistic, forward-thinking approach to the problem, and show that for our strategy to flourish, we must concern ourselves with the system as a whole, not merely with its individual components.


Introduction

Computer virus. System failure. These words bring to mind a computer system brought to its knees - data corrupted and time wasted. Is this an accurate picture? We hear arguments against investing in virus protection: 'Viruses are mythical. Your chances of getting hit by one are pretty rare.' Others tell us anti-virus software is a necessity: 'Viruses can cost your company a lot of money. Better safe than sorry.' What are we to believe?

Let's assume that you don't have any anti-virus software. If you are 'hit' by a virus, the cost will be proportional to the value of your data and the value of your time. Independent studies [1] have shown that this cost can be quite high, depending on these factors as well as environmental factors such as how many computers you have (Note: If your data is of little or no value, and if your time is worthless, then you can well afford not to have an anti-virus strategy).

We will assume here that your data is worth something to your company, and that your time also has a significant value. In this case, you will want to protect your computer system from viruses. We will concede for the purists among us that not all viruses are intentionally harmful, but stipulate that intentional harm is not requisite for actual harm. For our purposes, the allocation of disk space and CPU time, and/or the modification of files, without knowledge and consent (implied or otherwise) constitutes damage, as do deliberate or unintentional disruption of work, corruption of data, and the lost time mentioned earlier. Basically, we are saying viruses are bad and we want to protect against them (there may be some wonderful new virus out there in development that can help us, but that is beyond the scope of this paper).

Fortunately, we are in luck. The very thing we need already exists: software which will detect 100 percent of the viruses listed by the Wildlist [2] as known to be in the wild. In tests run against a library matched with the Wildlist, several programs were capable of detecting all such viruses. The necessity of detecting 'lab' viruses is another matter, and will not be covered at this time, although it is addressed in [3].

Since we have such software, we should have no problems. However, there are problems. Something is wrong. Before examining the sources of the problem, a few comments on definitions we will be using are in order.


Definitions

The definitions used here are fairly generic, and are adapted for use in an interdisciplinary approach to the problems addressed. Some among us would argue that the systems movement was born out of science's failures [4], but in this paper, we take the view that General Systems theory is a child of successful science and, like most children, sees things through optimistic eyes. We have specifically avoided in-depth discussion of categorical schemes, generalizations, and other commonly used 'tools' of General Systems thought, and have focused instead on the simplest of the simple. The ideas in this paper are drawn heavily from very basic works in systems theory. They are not new ideas, but it is our hope that their application to the management of security and computer viruses will help us identify some of the problems we may be overlooking.


General Systems Theory

A system is a set, or group, of related elements existing in an environment and forming a whole. Systems can be made up of objects (computers), subjects (your employees) and concepts (language and communication); they can be made up of any one or more of these elements. There are 'real systems' (those which exist independently of an observer) and 'conceptual systems' (those which are symbolic constructs). Our system, 'the anti-virus strategy system', is not so different from many others, in that it is composed of all three elements: computers (objects), people (subjects) and concepts (policies and ideas). Each of these systems has its own subsystems. For example, your system of networked computers consists of individual computers. These computers are composed of yet more subsystems: microprocessors, resistors, disk drives, etc. Our system consists of both real and conceptual subsystems. A system can also be said to be a way of looking at the world, or a point of view [5].

Concepts, laws, and models often appear in widely different fields [6], based upon totally different facts. This appears to be at least in part due to problems of organization, phenomena which cannot be resolved into local events, and dynamic interactions manifested in the difference in behaviour of parts when isolated and when in higher configurations. The result is, of course, a system which cannot be understood by investigating its parts in isolation. One reason identical principles have been discovered in entirely different fields is that people are often unaware of what those in other disciplines are doing. General Systems theory attempts to avoid this duplication of research effort.

There are two main methodologies of General Systems research: the empirico-intuitive and the deductive. The first is empirical, drawing upon regularities observed across sets of systems. It can be illustrated fairly easily, but lacks mathematical precision and can appear to the 'scientist' to be naïve. However, the main principles which have been offered by this method include differentiation, competition, closed and open systems, and wholeness - hardly naïve or worthless principles. The second method can basically be described as 'the machine with input', defined by a set 'S' of internal states, a set 'I' of inputs and a mapping 'f' of the product I x S into S (organisation is defined by specifying states and conditions). Self-organising systems (those progressing from lower to higher states of complexity, as in many social organisations) are not well suited to this approach, as their change comes from an outside agent. Our anti-virus strategy system is such a system, and for this reason we will use the empirico-intuitive methodology.
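To make the deductive formalism concrete, the sketch below restates 'the machine with input' in a few lines of code. It is a minimal illustration only: the particular states and inputs (clean, suspect, scan_alert, and so on) are hypothetical names chosen to suit this paper's subject matter, not part of the cited theory.

    # A minimal sketch of 'the machine with input': a set S of internal
    # states, a set I of inputs, and a mapping f of the product I x S
    # into S. The states and inputs are hypothetical, for illustration.
    S = {"clean", "suspect", "infected"}
    I = {"scan_ok", "scan_alert", "disinfect"}

    # f : I x S -> S, written out as a lookup table. Specifying these
    # states and conditions is what 'organisation' means in this model.
    f = {
        ("scan_ok", "clean"): "clean",
        ("scan_ok", "suspect"): "clean",
        ("scan_alert", "clean"): "suspect",
        ("scan_alert", "suspect"): "infected",
        ("disinfect", "infected"): "clean",
    }

    def step(state, inp):
        """Apply one input; pairs not listed in f leave the state alone."""
        return f.get((inp, state), state)

    state = "clean"
    for inp in ("scan_alert", "scan_alert", "disinfect"):
        state = step(state, inp)
    print(state)  # -> clean

Note that the mapping f is fixed in advance; a self-organising system would have to rewrite its own table, which is exactly why this formalism fits it poorly and why we fall back on the empirico-intuitive method.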

Classical system theory uses classical mathematics to define principles which apply to systems in general or to subclasses. General Systems theory can be called the doctrine of principles applying to defined classes of systems. It is our hope that we can stimulate thought on how already-known principles can help us in managing our anti-virus protection by examining the system as a whole.


Holism

Our definition of holism, drawing where appropriate from the medical profession, is health-oriented, and focuses on maintaining and improving the existing health of the system. It does not focus on disease and illness. It is interesting to note that, while we have many terms that relate to compromised and infected systems, we do not seem to have many terms relating to 'well' computers. Holism operates under the assumption that the open system possesses an innate organising principle, with the interdependence of the parts having an effect on the total system health. Holism views symptoms of distress as signalling disharmonic conditions, from which we can learn how to adjust the system (feedback); it is open to a variety of approaches for attaining balance. The focus of holism is heavily slanted toward the correction of causal factors, not symptomatic relief. Thus, the role of the holistic practitioner is to facilitate the potential for healing [7].


Anti-Virus Strategy Systems

Where do our anti-virus strategy systems fit in this picture? We hope to explore some answers to that question by first examining the components of our model system. Keep in mind, however, that the goal of this paper is not to provide you with answers, but rather to stimulate new ways of thinking about the problems we face daily.


Components

Each of the components in Figure 1 contributes to the overall health of the system. Conversely, each can contribute to the illness of the system. For instance, our computer can contribute to the health of the system by functioning properly. If the hard drive crashes, a disharmonic condition is introduced. Our managers contribute to the overall well-being of the system, as long as they perform correctly. However, if one of them intentionally or unintentionally infects a computer with a virus, he or she contributes to the illness of the system. Our software contributes to the wellness by keeping employees reassured, and by keeping viruses out. If it is disabled by an employee desirous of more speed upon boot, or if it does not do its job in virus detection, it contributes to the illness or chaos in the system. There are other factors not shown, as the anti-virus strategy system model does not stop at the boundary of the company. The model includes your Internet service provider, virus writers, makers of electronic mail front-ends, anti-virus product tech support people and more. For the purposes of this paper, we must draw an artificial boundary. We mention the rest to give you food for thought, and to illustrate that boundaries are not static.

Figure 1. Anti-virus Strategy System - The Environment


Programs, Policy and Procedures

(Selection, Implementation and Maintenance)

Where do we begin in examining the interaction of our chosen system elements? Let's start with software selection. Anti-virus software is selected based on a wide number of criteria [8]. While some of these criteria are beneficial, several are counterproductive at best [9]. We need to be aware of exactly how our company's software is being chosen, and not leave this vital aspect of software selection to people who do not have the experience or expertise to make a selection that will maximise the organisation's protection against viruses.

Does your anti-virus software detect all of the viruses which are a real threat to your organisation? Before you glibly answer yes, you should recognise that not all products are created equal, and that even the best products will not achieve this goal if not properly maintained. Consider the following:

When asked what happens to two blocks of copper initially at different temperatures left alone together in an insulated container, students will reply that the blocks will come to the same temperature. Of course, if asked how they know, they usually say 'Because it is a law of nature'... the opposite is true... it is a law of nature because it happens. [10]

Apply this to your anti-virus software. Does it catch viruses because it is anti-virus software? If so, you can depend on it, as its name defines what it is. But, if you even loosely apply this concept, you will see that it is anti-virus software because it catches viruses - and if it does not, then what does that make it?

Remember the following quote:

'If you call a tail a leg, how many legs has a dog?'
'Five?'
'No, four. Calling a tail a leg doesn't make it a leg.' [11]

Maintenance of your software is another critical issue. Maintenance refers not to the upgrade, but to the maintaining of the software on a daily basis. What does it require to run? Are you supplying what it needs to live? Or is it merely surviving? Does it have adequate memory, power and disk space to run optimally and lessen the chance your employees will disable it? Is it in an environment free from other programs which may hinder its performance? If you cannot answer yes to these questions, you are not providing an environment which will allow this element of your strategy system to remain viable. It will not survive. Like living systems, the anti-virus strategy system requires a favourable environment, or else the system will adapt. Unfortunately, in the case of this system, adaptation can mean software being disabled by the user component of the system, or overridden by a competing software component. All this, and we have not even added viruses, which by design introduce instability into the system.
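To make the point less abstract, the daily questions above can even be asked by a program. The sketch below is a hypothetical illustration, not any vendor's tool: the paths and thresholds are invented, and it checks just two of the conditions raised in the paragraph - whether the scanner's signature data is going stale, and whether its volume has adequate free disk space.

    import os
    import shutil
    import time

    # Hypothetical locations and thresholds, for illustration only.
    SIGNATURE_FILE = "/opt/scanner/signatures.dat"
    MAX_SIGNATURE_AGE_DAYS = 30        # stale definitions miss new viruses
    MIN_FREE_BYTES = 50 * 1024 * 1024  # room for logs and updates

    def environment_report():
        """Report conditions that would leave the scanner merely surviving."""
        problems = []
        if os.path.exists(SIGNATURE_FILE):
            age = (time.time() - os.path.getmtime(SIGNATURE_FILE)) / 86400
            if age > MAX_SIGNATURE_AGE_DAYS:
                problems.append("signatures are %.0f days old" % age)
        else:
            problems.append("signature file is missing")
        scanner_dir = os.path.dirname(SIGNATURE_FILE)
        if not os.path.isdir(scanner_dir):
            scanner_dir = os.path.abspath(os.sep)
        if shutil.disk_usage(scanner_dir).free < MIN_FREE_BYTES:
            problems.append("scanner volume is low on disk space")
        return problems or ["environment looks healthy"]

    for finding in environment_report():
        print(finding)

A check like this is itself feedback in the sense used later in this paper: information flowing from one element of the system (the software's environment) to another (the people maintaining it).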

Even if you have the best anti-virus software, and are running it optimally, there can still be problems. Software is just one part of the strategy system. Policies and procedures play an important role in the overall strategy. Even the viruses we mentioned earlier play a part in this system. Then there are the least predictable aspects of the system, the human beings. How complex is this system? How much should we expect the people involved to understand?

Ackoff defines an abstract system as one in which all of the elements are concepts, whereas a concrete system is one in which at least two of the elements are objects [12]. As you can see, our system is concrete. It is also by design an open system, one into which new components may be introduced. Some of these components are by nature 'unknown' (i.e. actions of people, how software may react, viruses which may appear).

When these components are introduced, we have to consider first how they behave on their own. Next, we have to consider how they would behave in combination with any and/or all of the other elements. Finally, we have to consider how 'things' in general will be if neither of the objects is present. In its most simple form, a two-part system would require four equations, but of course, you can see that as the number of elements increases, the number of interactive equations grows by leaps and bounds [Table 1].


Equation type         | Linear: One Equation | Linear: Several Equations | Linear: Many Equations | Nonlinear: One Equation | Nonlinear: Several Equations | Nonlinear: Many Equations
Algebraic             | Trivial              | Easy                      | Essentially Impossible | Very Difficult          | Very Difficult               | Impossible
Ordinary differential | Easy                 | Difficult                 | Essentially Impossible | Very Difficult          | Impossible                   | Impossible
Partial differential  | Difficult            | Essentially Impossible    | Impossible             | Impossible              | Impossible                   | Impossible

Table 1. [From [5]] - Introduction of Elements

One of the systems theory approaches we can draw from here to help illustrate the problem comes from what is sometimes called the Square Law of Computation. This means basically that unless you can introduce some simplifications, the amount of computation involved in figuring something out will increase at least as fast as the square of the number of equations. Consider all of the interactions between humans, computers, and software, and you will see why it is impossible to precisely calculate what the results of all of those interactions will be. We cannot even measure them. In other words, you cannot possibly anticipate all of the problems you will encounter in trying to keep your company's data safe from viruses, because you cannot possibly calculate the interactions which will occur once you begin trying to formulate a strategy. Needless to say, these interactions create 'problems'.
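The arithmetic is easy to verify. In the hedged sketch below, we follow the paragraph above and allow one equation per combination of elements present or absent, so a system of n elements needs 2^n equations (the two-part system's four equations are the n = 2 case); the Square Law then prices the computation at, at least, the square of that count.

    # How the number of equations, and the computation they imply,
    # grows with the number of system elements. One equation per
    # presence/absence combination gives 2**n equations for n elements;
    # the Square Law of Computation says solving them costs at least
    # the square of the number of equations.
    def system_cost(n):
        equations = 2 ** n
        computation = equations ** 2  # Square Law lower bound
        return equations, computation

    for n in (2, 5, 10, 20):
        eqs, cost = system_cost(n)
        print("%2d elements: %9d equations, relative cost >= %d"
              % (n, eqs, cost))

Two elements give four equations; twenty give over a million, with a relative cost in the trillions. That is the point of the paragraph: no realistic strategy can enumerate the interactions, so simplification is forced upon us.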

If we examine our anti-virus strategy in various ways, we may be able to see things more clearly. Another helpful way in which we can view our system is as an expression, such as the terms of a set. For instance, the notation:

Let x stand for marriage
Let y stand for carriage
Let z stand for bicycle

The set [x,y,z] is simple enough for anyone to understand. Using names in sets takes us to the more complex: [The look on your face when you saw your first child, a proof that Vesselin Bontchev is not the Dark Avenger, an atom of plutonium], wherein the first no longer exists (or possibly never did), the second has not yet existed, and the third is out of reach of the common man.

If you were to be asked for the meaning of the ... in the set [Alan, Dmitry, Fridrik...] would you say the ... represented men's names? Names of programmers? Names of programmers who make anti-virus software? Names of people not from the United States? What is the rule for determining the meaning of what is unstated? Is there some unwritten heuristic of which your employees are not aware? What is the meaning of the three dots in our set?

This has a particular application to policy. Users can easily understand, 'Do not turn the computer off if you find a virus'. Can they as easily understand, 'Do not reset the computer if you find a virus'? Can they understand, 'In the event of a suspected virus, call the administrator or take appropriate action'? What is a suspected virus? Is it any time the computer system seems to act strangely? Is it only when the letters fall off? After all, that's what viruses do, right? What is appropriate action? [Turn off the computer, Call your supervisor, Reboot the computer, ...] What is the meaning of the ... in this set?


Variations on a theme

How well are our strategies doing? As pointed out early on, not very well. Why not? To help answer that question, we will next examine the problems of our strategy using the concept of variation. We recognise the duality of variables as they relate to information processing: the significant values which variables acquire at the two extremes of their respective spectra. Specifically, in order for a system to continue to thrive, information must be processed. Disorder, uncertainty, variety - all must shift from high to low [Table 2].


Disorder, Uncertainty and Variety:
Entropy and the Amount of Information Processed
High    Disorder                    Low
High    Uncertainty                 Low
High    Variety                     Low
Large   Number of Alternatives      Small
Small   Probability of an Event     Large
Low     Regulation and Control      High

Table 2 - Predictable Output

The probability of particular events increases from small to large. The amount of regulation and control increases from low to high. We become increasingly sure of the output of our systems [13]. However, viruses introduce a form of disorder with which the human components of our systems are not intimately familiar. While the probability of infection can be calculated mathematically [14], we are unable to calculate the probability of other events related to viral infections [15]. In what ways does this introduced unfamiliarity manifest itself? One manifestation is the appearance of problems.
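As an aside, the kind of calculation meant by [14] can be sketched in a few lines. Kephart and White treat viral spread with birth-and-death epidemiological models; the toy simulation below uses a logistic approximation in that spirit, dI/dt = beta*I*(1-I) - delta*I, where I is the infected fraction of machines. The rate values are invented for illustration, and this is a sketch of the cited idea, not a reproduction of the paper's directed-graph models.

    # Toy birth-and-death sketch in the spirit of [14]: I is the fraction
    # of machines infected, beta the infection rate, delta the rate at
    # which infections are found and removed. Parameter values are
    # hypothetical.
    def simulate(beta=0.5, delta=0.2, infected=0.01, steps=1000, dt=0.1):
        for _ in range(steps):
            growth = beta * infected * (1.0 - infected)  # new infections
            cures = delta * infected                     # detection/removal
            infected += dt * (growth - cures)
        return infected

    # When delta < beta the infection persists near 1 - delta/beta;
    # when delta >= beta it dies out.
    print("delta=0.2: %.2f" % simulate())           # settles near 0.60
    print("delta=0.6: %.4f" % simulate(delta=0.6))  # decays toward 0.0000

What the model cannot supply is precisely what the paragraph notes: the probabilities of the surrounding human events, which have no beta or delta to measure.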

We typically try to solve most of these problems deductively, to determine the reason for a variation between design and operation, or design and implementation. This approach is doomed to failure because it places the blame on the subsystems. We attempt to 'restore to normal' instead of redesigning our system. We formulate plans based on incorrect, incomplete or obsolete assumptions. We neglect to factor in the spillover effect, that is, the unwanted effect which actions in one system can have on another. Improving an isolated system may seem the epitome of system integrity. You can have your pure, clean computer. Of course, it is virtually useless, unconnected to the rest of the world. Or perhaps that is the solution: isolated, perfect machines. This would probably create a dissatisfied workforce, however, which would ultimately impact the business negatively. In the case of anti-virus strategy, 'spillover' takes on many new dimensions - as many as the human beings with whom our machines interface. Can you control all of the aspects of this system? You cannot.

Another factor to consider is the size and extent of our system. Further insight may be gained by considering what is sometimes referred to as the generalised thermodynamic law, which states that the more probable state is more likely to be observed than the less probable. While this may provoke the physicists among us, it has two parts which correspond to the first and second laws of thermodynamics. The first is hardly worth mentioning here (its reasons are purely physical), but the second is of interest to us. We should be concerned with the limited power of observers when viewing large systems. In other words, we cannot expect our managers to be in every place at once, knowing what is going on with every system and every employee. The concept of boundaries can be used to help solve this problem, but their definition is beyond the scope of this paper [16].


System Failure and Measurement

We say the system is failing for three reasons: it is not performing as intended; it is producing results other than those expected; and it is not meeting its goal. The objective is NO VIRUSES. However, in addition to often neglecting to define what 'no viruses' actually means, we are frequently unaware of how 'no viruses' can mean different things to different people. Not performing as intended could mean it finds some viruses but not all, or it finds all but only removes some. Unexpected results could mean it crashes 1 out of every 6000 machines, or produces system degradation you did not anticipate (if this is the case, does the fault really lie with the product for producing the degradation, or with you for not anticipating it?). Not meeting its goal most likely means failing to keep out viruses. However, to some people, this is a different goal from 'no viruses'.

How is this possible? Isn't 'no viruses' a simple concept? In a word, no. When there is a malfunction, i.e. a virus is found, the natural tendency is to look for the cause within the system. We tend to blame the problem on the variation of the system from its 'desired' behaviour. It could be the fault of the program, the employee, the policy. We tend to blame the program as it is the part of the system most closely identified with the failure as immediately perceived. However, consider for a moment that, to your employee, 'no viruses' means simply that. No viruses are found. Following that line of thought, finding 'no viruses' would be a system success - that is, until it brought your operation to a halt. You see, to some people, 'no viruses' means that none are seen or observed, and not that none are actually operational in the system. We plan grandiose policies and procedures around finding a virus and make no space for 'no viruses' as a possible failed variation. If you find 'no virus', you need to be very sure it is not due to your employees disabling your software, or your software not finding the virus.

Many system 'improvements' are possible which in reality doom the system. Faulty assumptions and goals are often at the root of this problem. For instance, it is obvious that all of your computer workers must, under dire penalty, refrain from bringing disks from home into your office. You implement this policy. You assume they will comply. Your goal is compliance, not 'no viruses'. If the goal was 'no viruses', you would be forced to be more realistic. Consider the following two statements:

We have clean, working computers and by not bringing in software, we can keep them that way. It will save us all a lot of time, and effort!

If you bring in disks, you will probably infect our office computers. It will cost us all a lot of money.

In the first instance, the focus is on the well machine. Everyone wants well machines. People like to be part of winning teams, and participate in things that are nice.

In the second, the focus is on the sick machine. None of your people would have viruses on their home computers. So, this must not apply to them. And if they do break the rule, you have already set them up to be afraid to tell you. After all, they don't want to cost you a lot of money and they certainly don't want to be known as the culprit for infecting the office computers.

How do we measure the performance of our anti-virus strategy system? Not very well. If we find some viruses, we say it's working. If we don't find any viruses, we say it's working. In some cases, you can apply 'we say it's not working' to these same sentences. There is no standard way in which we measure the success of the entire system. Only in the act of being out of control can the system detect the deviation and bring itself back under control.


Conclusion

The systems approach proposed here is a 'whole system' optimisation. Think of it as the configuration of a system which will facilitate optimal performance. There exists, of course, a dilemma, in that at some time suboptimisation may be necessary, or even the only possible approach. An approximation which is used may be a great deal better than an exact solution which is not [17]. Nevertheless, our model will attempt to show ways to optimise system performance. Models are how we express things we want to understand and possibly change, designed in terms of something we think we already understand. Models sometimes present problems when you try to translate them into real-world activities. With this in mind, I would like to suggest a simple model which may help us begin to find a workable solution to the problem of designing an anti-virus strategy.

'Models should not so much explain and predict as to polarize thinking and pose sharp questions.' [18]

Using a holistically modelled approach, we would strive to maintain the existing health of the system. This assumes we have a healthy system to begin with. It requires that you not simply depend on your belief that your software is correctly installed and operational, that your employees know how to use it and are using it, that your equipment is functional, and that your policies are correct and being followed... It requires that you actually take it upon yourselves to designate people to ensure that your system is optimal to begin with. If you are not willing to do this, you cannot expect to restore the system to health. The focus should shift from 'blame' to 'responsibility'. This may require investment on your part. You may need to update equipment. You may need to train employees. You may need to purchase software. You may need to subscribe to publications which can keep your employees up to date on trends in virus and security matters.

You will need to monitor feedback between various aspects of your anti-virus strategy system. We have not discussed feedback at any great length in this paper, due to the number of elements of the system and the complexity of the feedback. However, using the empirico-intuitive General Systems theoretical approach defined earlier in this paper, you should be able to determine the sorts of feedback which are required to keep your system functioning optimally. If there is NO feedback, you can rest assured your system will fail. Lack of feedback produces entropy. In simple terms, entropy can be called the steady degradation or disorganization of a society or a system. This is not what you want for your system. You want to move the system into organisation and order, high rates of probability and certainty. As we discussed earlier, this happens when information is processed. The information can be communication of any type between any elements of the system.
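Shannon's information measure gives this use of 'entropy' a concrete form, and a hedged sketch may help. The probability distributions below are invented: the first describes a system with no feedback, where every outcome is equally likely; the second a regulated system in which feedback has concentrated probability on one expected outcome. The lower figure corresponds to the order and certainty described above.

    import math

    def entropy(probabilities):
        """Shannon entropy H = -sum(p * log2 p), in bits."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # No feedback: four outcomes, all equally likely -> maximum disorder.
    unregulated = [0.25, 0.25, 0.25, 0.25]
    # Feedback and regulation concentrate probability -> order, certainty.
    regulated = [0.85, 0.05, 0.05, 0.05]

    print("unregulated: %.2f bits" % entropy(unregulated))  # 2.00
    print("regulated:   %.2f bits" % entropy(regulated))    # about 0.85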

Our current focus seems to be on the existing illnesses in our systems. If open systems indeed, as suggested, possess an innate organising principle, perhaps we should be paying more attention to what the elements of our systems are telling us. We could learn the sorts of information required to maintain organised reliability. We could learn the amount and types of feedback required to process information optimally, keeping the system desirably adaptive while preventing it from adapting negatively. We must examine our systems as a whole, including all of the parts, as best we can, to determine what the elements and the system are telling us. In the case of our anti-virus strategy systems, we have yet to determine what that message is. Many of us have not even defined the elements of the system, the system boundaries, or the goal of the system.

It is clear that there are disharmonic conditions in the 'Anti-virus strategy systems' of most companies; if there were not, no one would be attending this conference or reading this paper. It is also clear that the way we traditionally approach these problems is not working. We have been using these approaches for a long time, and the problems are not going away. Drawing from the holism model, one thing we can do is examine causal factors, instead of focusing on symptomatic relief. We need to examine more closely the interdependence of the parts of our system, and as security professionals, should facilitate the potential for healing our systems. It is hoped that some of the ideas mentioned in this paper can provide a starting point for this.

The author would like to thank Louise Yngstrom, University of Stockholm, for late night chats on System Theory, above and beyond the call of even academic duty.


Bibliography

  1. 'Virus Encounters, 1995: Cost to the World Population', Tippett, Peter. Testimony, House Subcommittee on Telecommunications and Finance, June 1993.
  2. 'The Wildlist', maintained by Joe Wells.
  3. 'Real World Anti-Virus Product Reviews and Evaluation', Gordon, Sarah and Ford, Richard. Proceedings of Security on the I-Way, NCSA, 1995.
  4. 'An Introduction to General Systems Thinking', p.3, Weinberg, Gerald. John Wiley and Sons, 1975.
  5. 'An Introduction to General Systems Thinking', p.51, Weinberg, Gerald. John Wiley and Sons, 1975.
  6. 'General Systems Theory: Foundations, Development, Applications', pp.xix-xx, Revised Edition, von Bertalanffy, Ludwig. George Braziller, Inc, 1980.
  7. 'Health Promotion Throughout the Lifespan', Edelman, Carole and Mandle, Carole. Mosby, 1994.
  8. 'Guide to the Selection of Anti-Virus Tools and Techniques', Polk, T. and Bassham, L. NIST Special Publication 800-5, NIST, December 1992.
  9. 'Real World Anti-Virus Product Reviews and Evaluation', Gordon, Sarah and Ford, Richard. Proceedings of Security on the I-Way, NCSA, 1995.
  10. 'Semantics, Operationalism and the Molecular-Statistical Model in Thermodynamics', Dixon, John and Emery, Alden. American Scientist, 53, 1965.
  11. Quote attributed to Abraham Lincoln.
  12. 'Applied General Systems Theory', p.39, Van Gigch, John P. Harper and Row, 1974.
  13. 'Applied General Systems Theory', Figure 2.2, Van Gigch, John P. Harper and Row, 1974.
  14. 'Directed Graph Epidemiological Models of Computer Viruses', Kephart, Jeffrey O. and White, Steve R. Proceedings of the IEEE Computer Society Symposium on Research in Security and Privacy, 1991.
  15. 'The Viability and Cost Effectiveness of an "In the Wild" Virus Scanner in a Corporate Environment', Gordon, Sarah, 1995.
  16. 'Applied General Systems Theory', p.25, Van Gigch, John P. Harper and Row, 1974.
  17. 'The Development of Operations Research as a Science', Ackoff, Russell, pp.59-60, in Scientific Decision Making in Business, as cited in [4].
  18. 'Some Mathematical Models in Science', Kac, Mark. Science, Vol. 166, No. 3906, p.695, 1969.


About the Author

Sarah Gordon's work in various areas of IT Security has been profiled in publications including the New York Times, Computer Security Journal and Virus Bulletin. She is a frequent speaker at such diverse conferences as those sponsored by NSA/NIST/NCSC and DEFCON. Recently appointed to the Wildlist Board of Directors, she is actively involved in the development of anti-virus software test criteria and methods. She may be reached as [email protected]
