Copyright Michael B. Scher This document may be freely distributed by electronic media solely for non-commercial purposes. Reproduction in any form must be of the entire work including this copyright notice. Print copies (except a personal use copy) with permission of the author only. All other rights reserved.

strange(at)cultural.com


General content for an annual talk given at John Marshall Law School from 1999 to 2002 on infrastructure security and technical means of protecting it, for David Loundy's class, IT 848: Computer Crime, Information Warfare, and Economic Espionage.
------------------------------------------------------------------
This used to be a topic I didn't have much to say about:

	- Since the NIPC was formed, the focus has unfortunately been on
	the hot, publicity-generating topic of HACKERS and of cyber attack
	on the nation's information systems.  Compare the focus of the 1998
	white paper on all kinds of physical and information
	infrastructure with the subsequent obsession with cyber-whatever.

	- I used to have little more to say than to point out
	the near-sightedness of that approach: critical systems, including
	the information systems that support them, are ALL attackable in much
	more mundane ways, especially where social disruption is the goal.

	- There are significant non-redundancies in fiber, telephone, and
	power networks, if one knows where to look for them.  We have only
	to look at the downtown Chicago power outages over the last several
	years to see issues that could affect information systems supporting
	other critical infrastructure.  Those networks are designed mostly
	to protect against accident and natural disaster, not deliberate
	attack.

	- Obviously, some of this focus has changed; now, with solid
	attention on physical protection, it is not unreasonable to ask
	about electronic protections.


In his statement to the Joint Economic Committee, Mr. Gershwin says
that, according to industry leaders, critical infrastructure info systems
are:
	a) designed to be less accessible than other info systems, and
	b) written in "unique, proprietary or archaic" programming languages.



But this industry assertion is not entirely true, and even where it is
true, it should not give the slightest sense of protection.  Quite the
contrary.

1)	Deregulation exchange points MUST be standards-based;
	increasingly, that means website-style systems over standard TCP/IP
	on so-called "private" networks to which all the players attach
	from their Internet-connected main networks.

	Even the auto industry parts network is less foolish -- they
	pushed for the creation of IPSec, an encrypted version of the
	standard Internet protocols, before rolling out their private
	exchange network.  And even there, encryption is only a small part
	of securing the network.

2)	Even where the technology is proprietary or obscure, the operators
	have almost no established skillset oriented around anything other
	than keeping it up and running -- i.e., no one understands how to
	lock it down or audit it.  And a determined attacker may have much
	more time and patience to figure the systems out.

3)	It turns out that most of these systems are interconnected, use
	standards-based (even common) software and systems at their
	increasingly commodity-based cores, and are ever harder to audit.

SO:
How do information systems supporting critical infrastructure wind up vulnerable?

Example 1:
	The gas and electric company.

In late 1997, my colleagues and I were retained to take an intruder's-eye
view of a gas and electric company's security.  We had limits: only
entries over the Internet and via telco (modems, etc.) were permitted.
No physical access.  We had the name of the company, some of their phone
numbers (from which their large block was deducible), and little more.
With just that and ordinary lookup tools, directories, and so on, we
easily mapped out their borders.  Then we tried getting in.  What we
found was, to us at the time, incredible.

	GasElCo is a mid-sized natural gas and electric company serving
	a large userbase from a central location.  The area is several 
	hundred square miles with over a million customers.  

	Their power grid systems automatically report usage and other data
	into their accounting and monitoring systems, some live, some in
	batches.  Grid control also connects to those systems.

	These "proprietary" systems were in fact UNIX servers five or so
	years out of date, with known vulnerabilities widely published by
	the operating system manufacturer.  However, the maker of the
	"turnkey" system supplies the grid control software and servers as
	a bundle, and never updates the OS independently of the whole
	package.  Even so, some of the OS versions were so old they had
	obviously never been updated at all.  Most of their systems sent
	passwords over their network unencrypted.
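
	To make the cleartext-password point concrete: on a shared network
	segment, capturing such logins takes only a few lines.  A minimal
	sketch using the third-party scapy package (the interface name and
	the choice of FTP/telnet ports are assumptions for illustration):

	# Illustrative sketch only: why unencrypted logins on a LAN are
	# trivially capturable.  Requires scapy and root privileges;
	# "eth0" is a hypothetical interface name.
	from scapy.all import IP, Raw, sniff

	def show_cleartext(pkt):
	    # FTP and telnet send usernames and passwords as plain bytes;
	    # any host on the same segment can read them off the wire.
	    if pkt.haslayer(IP) and pkt.haslayer(Raw):
	        data = pkt[Raw].load
	        if b"USER" in data or b"PASS" in data:
	            print(pkt.sprintf("%IP.src% -> %IP.dst%:"), data)

	sniff(iface="eth0", filter="tcp port 21 or tcp port 23",
	      prn=show_cleartext, store=False)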

	Their systems were very mutually trusting, and almost
	accidentally internetworked.  The skillset to lock down those
	servers was not present; the skillset to design a network that
	could cut the servers off from each other, except for the traffic
	that was NECESSARY, was irrelevant: the ways they needed to
	communicate were archaic, proprietary, and insecure.
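
	The segmentation they lacked is conceptually simple: enumerate the
	few flows the servers genuinely need and deny everything else.  A
	sketch of that audit logic (hosts, ports, and flows hypothetical):

	# "Allow only what is NECESSARY": model the required flows, then
	# flag anything observed outside that list.
	ALLOWED = {
	    ("grid-ctl", "billing-db", 1521),   # nightly usage batches
	    ("grid-ctl", "monitor", 2404),      # live grid telemetry
	}

	def audit(observed):
	    for src, dst, port in observed:
	        if (src, dst, port) not in ALLOWED:
	            print(f"unexpected flow: {src} -> {dst}:{port}")

	audit([("grid-ctl", "billing-db", 1521), ("training", "grid-ctl", 23)])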

	Their remote power monitoring and control stations were reached by
	ordinary modem (automated), and the same username and password 
	let one into each of them.
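
	Shared credentials turn one leaked password into access everywhere,
	and it is exactly the sort of thing a trivial audit catches.  A
	sketch over a hypothetical station inventory:

	# Flag credentials reused across stations; one leak opens them all.
	# Station names and credentials here are hypothetical.
	from collections import Counter

	stations = {
	    "substation-01": ("fieldsvc", "dialup1"),
	    "substation-02": ("fieldsvc", "dialup1"),
	    "substation-03": ("fieldsvc", "dialup1"),
	}

	for cred, n in Counter(stations.values()).items():
	    if n > 1:
	        print(f"user {cred[0]!r} shares one password across {n} stations")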

	The corporate HQ alarm system had modem access, and a test or
	training account -- with full access to respond to (or shut off)
	alarms and to watch security's response to them -- was left active.
	Worse, the username was taken from the "friendly power guy" in
	their ads, and the password was the same as the username.  It was
	an old, archaic system that was also in need of updating.
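
	A password equal to the username is the classic "joe account," and
	it is among the first guesses any intruder tries.  Even a tiny
	audit would have caught it (account data hypothetical):

	# Flag "joe accounts": password identical to the username.
	accounts = {"powerguy": "powerguy", "operator2": "x93!kQ"}

	for user, password in accounts.items():
	    if password.lower() == user.lower():
	        print(f"joe account: {user!r} -- password equals username")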

	Their training systems could be broken into from the Internet,
	from which limited access existed for remote support by the
	manufacturer, which helped conduct some training remotely.  Their
	training systems could log into the live grid control systems, so
	trainees could look around.  That is, their live grid control
	was readily reached from the Internet.
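
	The lesson is transitive: if the Internet reaches the training
	systems and the training systems reach grid control, the Internet
	reaches grid control.  That is ordinary graph reachability; a
	minimal sketch over a hypothetical trust graph mirroring the setup:

	# Two-hop exposure as plain graph reachability.
	from collections import deque

	edges = {
	    "internet":     ["training"],       # vendor remote-support path
	    "training":     ["grid-control"],   # trainee logins to live systems
	    "grid-control": [],
	}

	def reachable(start):
	    seen, queue = {start}, deque([start])
	    while queue:
	        for nxt in edges.get(queue.popleft(), []):
	            if nxt not in seen:
	                seen.add(nxt)
	                queue.append(nxt)
	    return seen

	print(reachable("internet"))  # grid-control appears two hops out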

	Their automated accounting systems had username/password-protected
	modems on them, leading straight into the core of the trusted network.

	The corporate dial-in system for users to log in and do regular  
	work, however, was state of the art, and frankly, we couldn't 
	break into it.


Example 2:
	The new digital cellphone network.

In mid-1998, I again participated in a semi-blind penetration test of a
utility.  This time, the target was a regional telco's new digital
cellphone network, then rolling out.

They assured us the phone switches were in no way on the Internet, but
on a private network of their own.  Now, that usually means a virtual
private network on a shared medium.  However, we figured they would
somehow connect their normal mail and internal networks to the telco
gear networks -- otherwise the engineer in a cube would need two systems.

There was limited access through a firewall for a critical partner to help
troubleshoot critical applications.  Dialup was also ill-protected.  The
network reached through the firewall was connected, for accounting, HR,
cost, and other reasons, to all parts of the company.  But there wasn't a
gateway onto the switching network.  What we found instead was that every
engineer's desk computer (an archaic, outdated UNIX server the switch
maker had supplied with the system, and never updated) had a connection
into each network, making the security of that entire second network
depend on every one of those desktops.  Needless to say, bringing up
screens to control a phone switch across the Internet has a way of
shaking your faith in the security of things.
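
Dual-homed desktops like those are findable mechanically: any host with
addresses on more than one of the networks in question is a bridge.  A
sketch using the third-party psutil package (both network prefixes below
are hypothetical):

	# Flag a host that straddles the corporate LAN and the switching
	# network.  Requires psutil; prefixes are hypothetical.
	import ipaddress
	import socket
	import psutil

	NETS = {
	    "corporate": ipaddress.ip_network("10.0.0.0/16"),
	    "switching": ipaddress.ip_network("192.168.100.0/24"),
	}

	homes = set()
	for iface, addrs in psutil.net_if_addrs().items():
	    for addr in addrs:
	        if addr.family == socket.AF_INET:
	            ip = ipaddress.ip_address(addr.address)
	            homes.update(name for name, net in NETS.items() if ip in net)

	if len(homes) > 1:
	    print("dual-homed host bridges:", ", ".join(sorted(homes)))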


This situation is becoming worse, not better.

I recently responded to a cleanup of the Nimda worm at a Fortune 500
financial.  All in all, they handled it fairly well, and took very
little actual financial hit compared to less-prepared corporations.
In the wrapup, however, we were told one division's phone system was down.

It turned out to be based on Windows NT and the vulnerable version of the
web server that Nimda breaks into.  The manufacturer was out of business,
and having a third party rebuild the systems would run over $90,000.  We
wound up cleaning them ourselves, a nasty task, because Nimda, given time,
copies itself into the place of tons of critical programs.
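
Triage for a cleanup like that is mostly filesystem-walking.  A sketch
that flags filenames widely reported as Nimda droppings -- a starting
point only, since the worm also overwrites legitimate executables, which
a name check won't catch:

	# Walk a tree and flag filenames widely reported as Nimda
	# artifacts (readme.eml, admin.dll, a replaced riched20.dll).
	import os

	INDICATORS = {"readme.eml", "admin.dll", "riched20.dll"}

	def scan(root):
	    for dirpath, _dirs, files in os.walk(root):
	        for name in files:
	            if name.lower() in INDICATORS:
	                print("suspect file:", os.path.join(dirpath, name))

	scan("C:\\")  # hypothetical scan root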

Printers, cablemodems, routers, FAX MACHINES, networked monitoring 
devices, and more crashed from Nimda -- they were subject to similar 
problems.

And now we find that the latest in voice-over-IP systems from a large
communications manufacturer -- a system designed for large corporate
settings -- has Windows 2000 and the newer version of the same web server
at its core, in a package the end user is entirely unable to update
piecemeal.

- Increasingly, traditionally self-contained turnkey systems are networked.
- They are increasingly built on standard operating systems.
- By the time they hit the market, those operating systems have numerous
  published, easily exploited holes.
- By the time the manufacturer passes around updates that fix the holes,
  well over half a year more has gone by.
- Most end-user companies won't update a working system unless the
  manufacturer says the problem is critical.
- Most manufacturers will almost never admit to a critical problem.
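
Stack those delays end to end and the exposure window is striking.  A
back-of-envelope sketch, with hypothetical dates:

	# Rough exposure arithmetic for a turnkey system; dates hypothetical.
	from datetime import date

	hole_published   = date(2001, 3, 1)   # OS vendor publishes hole and patch
	vendor_bundle    = date(2001, 10, 1)  # turnkey maker ships bundled update
	customer_applies = date(2002, 3, 1)   # site finally installs it, if ever

	print("days exposed:", (customer_applies - hole_published).days)  # 365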

Heck, there are FIREWALLS out there built, turnkey-style, on standard
operating systems.  One of those operating systems had a hole discovered
a good year ago and promptly patched, yet it remains UNPATCHED in the
firewall product -- a full YEAR in the public eye.



Does this term fill you with a sense of peace and safety:

	Microsoft Air Traffic Control 2004

?

Then why should something like:

	Herculon ATC 4.9

... inspire any more confidence, when it's built 100% out of out-of-date,
unpatched Microsoft components?


When the Air Force CIO recently told Microsoft to tighten up security or 
say goodbye to Air Force contracts, much more than desktop operating 
systems, e-mail, and webservers were at stake.

We know of another company that the USAF is requiring to put a medical
information system through a standards-based security assessment, plus
some extra lockdown requirements from the Air Force itself.

With luck, we'll see a trickle into the commercial sector over time.


I wouldn't have thought computer security was rocket science, but like
much of our other technology, it seems likely to trickle out of military
research and requirements.

---------------------------------------------------------------------