How Safe Are You? Information Assurance and You

A Computer Security Expert Looks at Broadcast Security Architecture and the Consequences of Convergence

Broadcasters have converged on a new digital ecology.

One issue at the leading edge of network convergence is information assurance, or IA, the new term for information security. As a troubleshooter in this area, with an eye for the present vulnerabilities of mission-critical systems, let me start by describing how your trust in your own systems has been breached.

Foundations

For the broadcaster, the digital ecosystem is the result of a convergence of computer systems, networks and broadcast technologies. Broadcast facilities have evolved from stand-alone analog, to networked analog, to integrated digital, to networked digital, and now to inter-networked digital workplaces.

Modern facilities must maintain symbiotic connections to the World Wide Web, electronic mail and Internet broadcast streaming.

The table in Fig. 1 shows these foundation phases along with the typical security architecture used. Of the five technical phases shown, traditional “hardware” engineers control only the first four. In the fifth, trust and security exist only through software.

Once your system attaches to the non-physical environment of the Internet, there is no longer any way to assure who has access to and control of your broadcast equipment. Even at the fourth level, it becomes impractical to monitor who is doing what.

Up until now, the best method to solve this identification and authentication problem has been to “share a secret” between some part of the equipment and those authorized to use it. This software part has come to be known as the “Trusted Computing Base,” or TCB. The TCB is supported by, and often supplied by, the operating system software.
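
To make the “shared secret” idea concrete, here is a minimal sketch in Python of challenge-response authentication, one common pattern a TCB can build on. The function names and the secret value are illustrative assumptions, not taken from any particular product.

import hashlib
import hmac
import os

SHARED_SECRET = b"known-only-to-the-equipment-and-its-operators"  # illustrative

def issue_challenge() -> bytes:
    # The equipment sends a fresh random nonce so each log-in attempt is unique.
    return os.urandom(16)

def respond(challenge: bytes, secret: bytes) -> bytes:
    # The authorized party proves knowledge of the secret without revealing it.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, secret: bytes) -> bool:
    # The TCB recomputes the expected response and compares in constant time.
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
assert verify(challenge, respond(challenge, SHARED_SECRET), SHARED_SECRET)

The point of the exchange is that the secret itself never crosses the wire; only proof of its knowledge does.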

“Hardware” engineers largely are helpless in the face of software attacks against the operating system TCB or the sub-components that trust it. The Internet’s arrival has resulted in an environment for your broadcast facilities and the information you purvey that suddenly is very insecure.

The digital ecosystem

In an alternate view, the broadcast digital ecology begins with the physical connections of the first four layers of the seven-layer International Organization for Standardization/Open Systems Interconnection (ISO/OSI) model illustrated in Fig. 2.

Information warfare begins at the log-in prompt, represented here at the end of the transport layer and the beginning of the session layer. This, coincidentally, is the point where you change from hardware engineering to software engineering.
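
The seven layers and the boundary just described can be summarized schematically in a few lines of Python; the hardware/software annotations simply restate the text above.

# The seven ISO/OSI layers, bottom to top, annotated with the
# hardware/software boundary described above.
OSI_LAYERS = [
    (1, "Physical",     "hardware"),
    (2, "Data link",    "hardware"),
    (3, "Network",      "hardware"),
    (4, "Transport",    "hardware"),  # the log-in prompt sits at the top of this layer...
    (5, "Session",      "software"),  # ...and at the bottom of this one
    (6, "Presentation", "software"),
    (7, "Application",  "software"),
]

for number, name, domain in OSI_LAYERS:
    print(f"Layer {number}: {name:<12} ({domain} engineering)")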

It is at this point in the security architecture that the struggle commences. Imagine a castle wall topped with a magic parapet. Friends and foes alike need only the spell to breach the parapet and enter the castle.

Although this is your castle, you cannot see the parapet nor the struggle taking place on your behalf between abstract forces of good and bad.

Allow me to interrupt this fantasy, my dear broadcaster, to mention that once the bad prevails, your production facility is as out of your control as if it had been hit by lightning. This is the essence of information warfare and is the “downside” of what you gained when you converged your system with the Internet.

Current digital technology has made possible signals of awesome quality, with easily manipulated sound and video streams, coupled with magnificent distribution capabilities. Audience acceptance of this evolution has been complete and irrevocable. With this technology came an accessory digital ecology and astronomical risks.

To converged broadcasters, digital audience growth is the important drawing card. Supporting these expanding audiences are the “roadies” who make the digital show go; the notional diagram in Fig. 3 illustrates the specific roles necessary in the life cycle of software projects, such as operating systems.

Developers can be individuals or corporate entities. These are the most powerful system players. They control the source code, and through this control pretty much everything else.

Installers apply the code to meet digital and real-world needs and requirements. They sometimes, but not always, act as agents of the developer. They often initially control security objects, but rarely become involved in source code.

Users interact with the software of the digital ecology for entertainment or gain. They generally are not permitted access to either the source code or system security beyond their own identity.

Troubleshooters get the user out of trouble. They diagnose, secure, perform break-fix tasks and often encounter miscoding (bugs) of either innocent or malicious nature that need to be fed back to the developer to allow the system to evolve. Typically, troubleshooters have supervisory access to the security features of the system. Currently, troubleshooters seldom have access to the source code.

Broadcast information systems rely on these roles throughout their operating cycle. Movement between roles is possible in some systems and constrained in others. Beyond the needs of digital broadcast media, other members of the digital ecology fall into the same categories in this cycle.

All roles simultaneously constitute parts of the audience – the body of the digital ecosystem. As with most human endeavors, this new converged broadcast digital ecology has plenty of conflict built in.
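
This division of labor maps naturally onto role-based access control. A minimal sketch in Python follows, with role names taken from the text; the permission names and their groupings are illustrative assumptions rather than a description of any real system.

# Roles from the software life cycle, mapped to the access each is
# described as having. Permission names are illustrative.
ROLE_PERMISSIONS = {
    "developer":      {"source_code", "security_config", "use_system"},
    "installer":      {"security_config", "use_system"},  # initial control of security objects
    "user":           {"use_system"},                     # own identity only
    "troubleshooter": {"security_config", "use_system"},  # supervisory access, but no source
}

def may(role: str, permission: str) -> bool:
    # A closed-source system keeps these compartments rigid; an open-source
    # installation can redraw them to fit local circumstances.
    return permission in ROLE_PERMISSIONS.get(role, set())

assert may("developer", "source_code")
assert not may("troubleshooter", "source_code")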

Conflict

This new form of conflict in information technology exists at the security architecture level of operating systems.

For openers, accept that the operating system you are using has design or management goals “engineered in,” as would any technical system. These goals include:

* consumption of resources (your budget);

* reproduction and growth (resources and platforms); and

* elimination of competing systems (“compatibility”).

Whether these goals are part of the actual operating system code or the corporate/organizational goals of the product developer is a moot point. To function properly, operating systems must have dominance in your business functions.

In most cases, this domination is symbiotic; the operating system enhances and generates value, at least for a time. But operating systems can evolve from symbiotic to parasitic, and the unsuccessful are preyed upon and subsumed.

Note well that all operating systems are predatory; they require resources.

Trust

Can operating systems ever be trusted?

Along with the rest of our information society, we find the operating system is well-entrenched in broadcast technology. The operating system has become the principal component that assigns roles and receives human trust.

Originally, operating systems were referred to as disk operating systems. Early developers had many names for them – CP/M, CTOS, UNIX, VMS, PC-MOS – but the one with which we all grew familiar was named simply DOS.

DOS was the logical glue that simplified the connection of stored information to the user’s processor. While some operating systems were concerned with security from their beginnings, later network evolution compelled operating systems to emphasize security and constrain the information available to particular operating system roles. Many operating systems initially were, and some remain, quite fragile and easily breached, particularly in the face of physical or network compromise.

The present security controversy among operating systems involves the original human-readable format of the source instructions that are processed into machine-readable format for use by computing systems. Like the floor plan of the castle, this “source code” must be analyzed and reviewed to find security weaknesses.

With this criterion in mind, it is possible to divide operating systems into two broad categories, open-source and closed-source.

The developer keeps the source instructions of closed-source systems, and various barriers exist to your knowing the details of how these operate. These systems are “intellectually cheap” to install and easy to use; but they are costly to enhance (unless you are the developer) and to troubleshoot.

The information necessary to operate closed systems is strictly compartmentalized by role. The delineation of roles within closed-source systems makes it difficult to change roles and get from one compartment to another when circumstances require it.

Open-source systems are defined by full disclosure. Users of all types are encouraged to discover how the system operates, including the security mechanisms. It has been suggested that the open-source approach began with developer frustration with the exclusionary tendencies of closed-source.

Open-source operating systems are “intellectually expensive” to install and often are tricky to use; but they are easy to troubleshoot, enhance and maintain. The information necessary to operate these systems need not be role-based. Roles within open-source systems are elastic but can be clearly defined as part of the installation.

Systems designed as closed often become open (disclosed) due to circumstances such as standards formulation, orphaned system disclosure, covert action and reverse engineering. Systems can change from closed to open but rarely go the other way. The vast majority of broadcast systems reflect the overall state of technical development, and are based upon one particular closed system architecture.

Open or closed, the trust you place in these operating systems is the risk your facility takes.

If you fear that your facility is or will be off the air because the production network has crashed due to a raging virus, vicious employee or vandalizing script kiddies, you need to protect, detect and react to secure your operating systems immediately. If you are lucky (or smart) enough to still be on the air, you need to do a risk analysis of your information system.

The new dismal science

Risk analysis reveals which risks are acceptable. Risk is the opposite of assurance; it is nothing new. Information risk, the opposite of information assurance, is a new branch of an old tree.

To assess risk, we first must define it. Engineers revel in simple algebraic ways to relate fundamental matters. This equation attempts to do the same, if only by identifying the variables involved:

Risk = function of (Threats, Vulnerabilities)

Threats are forces committed to disruption of your service; vulnerabilities are the opportunities available to disrupt it.
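
One way to read the equation: risk accumulates over every threat that can reach a matching vulnerability. Below is a minimal sketch under that reading, using an illustrative likelihood-times-impact model; the numbers and the multiplicative form are assumptions for demonstration, not a standard formula.

# Illustrative risk model: each (threat, vulnerability) pairing contributes
# likelihood * impact to the total. All figures here are invented.
threats = {
    "email_virus":   0.6,  # estimated likelihood of occurrence
    "script_kiddie": 0.3,
    "insider":       0.1,
}
vulnerabilities = {
    "unpatched_os":  8.0,  # estimated impact if exploited
    "open_port":     5.0,
}

def total_risk(threats: dict, vulnerabilities: dict) -> float:
    # Risk = f(Threats, Vulnerabilities): here, a simple sum of products.
    return sum(likelihood * impact
               for likelihood in threats.values()
               for impact in vulnerabilities.values())

print(f"Aggregate risk score: {total_risk(threats, vulnerabilities):.1f}")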

The product that a broadcast facility delivers is information. Yours is the business of assuring your audiences and clients of good information.

Threats and vulnerabilities are the result of system design and implementation choices. Without information assurance (good information for your audience), your reason to exist as a broadcaster falters. Assessing and minimizing digital environmental risk is a new task in an old chore.

Unlike other members of the digital ecology, U.S. broadcasters have serious social and legal responsibilities for the information they maintain and purvey. U.S. law protects a broadcaster’s sources and methods.

This information protection requirement also is in effect in other countries and can place a broadcaster in an adversarial relationship with institutions that maintain excellent information warfare and intelligence capabilities.

When good systems go bad

Digital broadcast facilities operating in this converged digital ecology now are under continuous attack.

These attacks can be passive, such as the transmission of an e-mail virus, benign port scans or server-to-server worm attacks. They can be active, such as targeted attacks, denial-of-service attacks or a disgruntled employee modifying or destroying information.
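
Detecting the port-scan variety often starts with something as simple as counting the distinct ports touched by each source. A minimal sketch in Python, assuming a log of (timestamp, source, port) tuples; the threshold is an arbitrary illustration.

from collections import defaultdict

def find_port_scanners(events, threshold=10):
    # Flag sources that probe many distinct ports: the classic
    # signature of a port scan. The threshold is arbitrary.
    ports_by_source = defaultdict(set)
    for timestamp, source, port in events:
        ports_by_source[source].add(port)
    return [src for src, ports in ports_by_source.items()
            if len(ports) >= threshold]

# Synthetic scan from a documentation-range address.
events = [(0, "203.0.113.9", port) for port in range(20, 35)]
print(find_port_scanners(events))  # -> ['203.0.113.9']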

These acts, malicious or not, can destroy your operation and may already have destroyed the trust your facility enjoyed in the past.

Because operating system security is still evolving, simple breaches that allow humans to gain access to information beyond their assigned roles often are overlooked. These breaches, called exploits, come in thousands of flavors with hundreds of thousands of variations.

Note that only a small subset of these exploits is designed to extract value or any other benefit a human might enjoy. Most security exploits appear to be related to the predatory nature of operating systems.

One wonders how the digital ecology can so completely mimic the natural ecology in that it seems to spontaneously generate agents designed to “thin the herd” and create an environment where those with the strongest survival traits succeed.

Social ostracism or legislation to criminalize this activity is, in my opinion, similar to attempting to pass laws outlawing the common cold. Is the development of exploits the application of a Darwinian methodology? Do the exploits actually represent the change agents of the digital ecology? Is the struggle for information security our exposure to the process of security architecture evolution?

Closed vs. open systems

In my view, open vs. closed source security architecture becomes the biggest evolutionary element in the digital ecology. Proper risk, assurance and security analysis demands answers to the following tactical, strategic, offensive and defensive questions:

* Would open-source be a good choice as a defensive system?

* When facing an adversary who uses open-source software, does this adversary hold any advantage?

* Can opponents using open-source-based tools and techniques dominate closed-source opponents in information warfare?

* Is open-source strategically or tactically superior to closed-source systems?

* Will closed-source systems always predate open-source systems?

* For gaining control or dominance over information, is open-source particularly suited as an offensive weapon?

* Can sources and methods be made anonymous?

* Is open-source code harder to trace?

* Is a closed system more secure than an open one?

* Does the compartmentalization that closed-source code instills help security?

* Is empirical comparative evidence or casework available?

* Is abstract or mathematically rigorous comparative proof of superiority possible?

* Would open-source systems and closed-source systems in threat environments perform in similar or different ways?

Convergence has taken audience, personnel, resources, technology and innovation from traditional broadcast outlets and placed broadcasting in the position where broadcasters must converge or fade into obsolescence. Simultaneously we enter a digital environment as broadcasters, and this digital evolution morphs us into something new.

Will this digital domain allow us the trust we need to be effective media?

Call to action

You are studying this periodical to involve yourself in this industry. This discussion of the consequences of convergence poses many more questions (17, in fact) than normally would be polite.

The proper closed-source approach would be to ignore these questions. The open-source way shares the answers to these questions as well as those that would follow.

The one thing that we can count on is that if we don’t find the answers to these questions, someone else will.

RW welcomes other points of view.
