RISKS-LIST: Risks-Forum Digest  Tuesday 7 November 2000  Volume 21 : Issue 10

FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS (comp.risks)
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

***** See last item for further information, disclaimers, caveats, etc. *****
This issue is archived by anonymous ftp at ftp.sri.com, cd risks.

Contents: [Election Day special edition]
  Pennsylvania county wins $1M for faulty computer voting machines (David Banisar)
  Thoughts on computers in voting (Douglas W. Jones)
  Security of electronic voting in public elections (Avi Rubin)
  Saturn made a bad assumption in my engine (William Colburn)
  I crashed because my phone was ringing (Scott Gregory)
  Unplanned roll in NASA's X-38 (James H. Paul)
  *Lack* of barcode causes train to trap passengers (Jeff Stieglitz)
  No security in Internet-connectable laboratory instrument controller (Stephen D. Holland)
  Risk of using 'meaningful' file names (Charles Bryant)
  Re: Typo+"strange glitch"=private files world-readable (Steve Summit)
  REVIEW: "Virus Proof", Phil Schmauder (Rob Slade)
  Abridged info on RISKS (comp.risks)

----------------------------------------------------------------------

Date: Thu, 2 Nov 2000 09:24:10 -0500
From: David Banisar
Subject: Pennsylvania county wins $1M for faulty computer voting machines

A federal jury awarded Montgomery County, Penn., more than $1 million Wednesday in a suit against an Indiana company that sold the county 900 computer voting machines that repeatedly broke down. The jury found MicroVote Corp. breached its implied warranties, but rejected all of the county's other claims, including fraud and breach of contract. The county had sought $4.3 million from the company. [Source: *The Legal Intelligencer*]

------------------------------

Date: Tue, 7 Nov 00 16:43:41 CST
From: "Douglas W. Jones"
Subject: Thoughts on computers in voting

It's Election Day, and as chair of the Iowa State Board of Examiners for Voting Machines and Electronic Voting Systems, I think this is a fair time to pause and consider the state of the art.

Over the past several years, an important trend has been evident in the voting machines that have come before our board for approval in Iowa: the replacement of custom-built software with off-the-shelf commodity software, usually some variant of Windows and largely dependent on Microsoft Office.

Computers in voting machines are old technology at this point, whether they're used for central-count systems based on punched cards or mark-sense readers, or for precinct-count systems based on mark-sense readers or direct-recording electronic voting machines. There are still lever machines in use, of course, but those haven't changed in years, and therefore we don't see them coming up for examination.

Under the current Federal Election Commission guidelines for electronic voting systems, all custom-built software is subject to examination by an independent third party. "Industry standard components," on the other hand, are acceptable as-is. The FEC has no enforcement power, but the FEC guidelines have been enacted into the voting law of numerous states.

The reason this concerns me is that we see a larger and larger fraction of the software inside the voting system becoming the proprietary product of a third party, exempt from the requirement that it be available for source code inspection.
Furthermore, commercial operating systems are immense, so an effective inspection is very hard to imagine!

What threat does this present? If I wanted to fix an election, not this year but four years from now, what I might do is quit my job at the University of Iowa and go to work for Microsoft, seeking to insinuate myself into the group that maintains the central elements of the window manager. It sounds like it might be fun, even if the job I'd need would largely involve maintenance of code that's been stable for years.

My goal: I want to modify the code that instantiates a "radio button widget" in a window on the screen. The specific function I want to add is: if the date is the first Tuesday after the first Monday in a year divisible by 4, and if the window contains text containing the string "straight party", and if the radio buttons contain, at least, the strings "democrat" and "republican", then one time in 10, at random, switch the button label containing the substring "democrat" with any of the other labels, at random. Of course, I would make every effort to obfuscate my code. Obfuscated coding is a highly developed art!

Having done so, what I'd have accomplished is a version of Windows that would swing 10 percent of the straight-party votes from the Democratic party to the other parties, selected at random. This would be very hard to detect in the election results, it would be unlikely to be detected during testing, and yet it could swing many elections!
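In pseudocode, the malicious hook might look something like the sketch below. This is purely illustrative: the names are invented, the real payload would be buried in C deep inside the window manager, and it would be obfuscated rather than spelled out this plainly.

  import random
  from datetime import date

  def is_presidential_election_day(d):
      """First Tuesday after the first Monday of November, in a year divisible by 4."""
      if d.year % 4 != 0 or d.month != 11:
          return False
      first_monday = next(day for day in range(1, 8)
                          if date(d.year, 11, day).weekday() == 0)
      return d.day == first_monday + 1

  def label_radio_buttons(window_text, labels):
      """Hypothetical hook in widget-creation code: on election day, for one
      "straight party" screen in ten, swap the label containing "democrat"
      with another label chosen at random."""
      lowered = [s.lower() for s in labels]
      if (is_presidential_election_day(date.today())
              and "straight party" in window_text.lower()
              and any("democrat" in s for s in lowered)
              and any("republican" in s for s in lowered)
              and random.randrange(10) == 0):
          i = next(k for k, s in enumerate(lowered) if "democrat" in s)
          j = random.choice([k for k in range(len(labels)) if k != i])
          labels = labels[:]
          labels[i], labels[j] = labels[j], labels[i]
      return labels

Even this toy version shows why such a payload would be so hard to catch: on any day other than a presidential election day, and on nine screens out of ten even then, it is a no-op.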
This is just one example attack! There may be similar vulnerabilities, for example, in the off-the-shelf database packages being used for ballot storage and counting. I don't mean this example to reflect any ill feelings toward Microsoft, but it is true that their software is used in the vast majority of new voting systems I've seen.

This threat does not require any cooperation from the vendor of the window manager or other third-party component exempt from source code inspection. All it requires is a mole working their way into the vendor and producing code that is not detected by the company's internal testing and inspection. Obfuscation is easy, and the art of the "easter egg" in commercial software makes it very clear that huge numbers of unofficial features are routinely included in commercially released software without the cooperation of the software vendors. (OK, I know that some easter eggs are officially approved.)

Having said this, it is worth noting that Microsoft has indicated a preference about the outcome of today's presidential election, and there are excellent reasons to treat proprietary software produced by a partisan agency with great suspicion when it is included in a voting system!

My conclusion? The time has come for computer professionals to press for a change to the guidelines for voting machines, asking that all software included in such machines be either open source, available for public inspection, or at least open to inspection by an independent third-party testing authority. There are no technical obstacles to this! Linux, FreeBSD, and several other fully functional operating systems are available and will run on the hardware currently being incorporated into modern voting machines!

But this is not the end of the problem! How do you prove, after the fact, that the software in the voting machine is the software that was approved by the board of examiners and tested by the independent testing authority? No modern machine I'm aware of makes any real effort to allow this proof, although several vendors do promise to put a copy of their source code in the hands of an escrow agency in case a question arises.

Doug Jones  http://www.cs.uiowa.edu/~jones/voting/

[Note: Doug, Rebecca Mercuri is just putting the finishing touches on her PhD thesis on the subject of electronic voting, at the University of Pennsylvania. I highly recommend you contact her for a copy, which should be available very soon. For everyone else, we will announce it here when the thesis is ready. Also, my book *Computer-Related Risks* has lots of background on risks in electronic elections and what to do about them. Rebecca has carried the analysis much further than I did. Her thesis will be a very valuable contribution that significantly raises the bar as to what should be demanded, not just hoped for, plus an analysis of the residual risks that would still remain.  PGN]

------------------------------

Date: Fri, 20 Oct 2000 15:46:02 GMT
From: rubin@research.att.com (Avi Rubin)
Subject: Security of electronic voting in public elections

I recently participated in an interesting workshop, sponsored by the NSF at the request of President Clinton, on the feasibility of e-voting in public elections. The workshop web page is http://www.netvoting.org/

From the Web site: "On October 11 & 12, 2000, the Internet Policy Institute (IPI) will conduct a workshop to examine the issues associated with conducting public elections via computer networks. Sponsored by the National Science Foundation (NSF) and chaired by C.D. Mote, Jr., president of the University of Maryland, this workshop is part of a request by the White House to study the feasibility of Internet voting."

It was a collection of technical experts, social scientists, election officials, the Department of Justice, and the NSF. The workshop was fascinating. The technical participants were:

  Erich Bloch, Washington Advisory Group
  Lorrie Cranor, AT&T Labs - Research
  Michael Fischer, Yale University
  Dan Geer, @Stake, Inc.
  Lance Hoffman, George Washington University
  David Jefferson, Compaq Systems Research Center
  Carl Landwehr, Mitretek Systems, Inc.
  Raymond Miller, University of Maryland
  Adam C. Powell, III, The Freedom Forum
  Ron Rivest, Massachusetts Institute of Technology
  Avi Rubin, AT&T Labs - Research
  Barbara Simons, Association for Computing Machinery

I spoke about the security challenges and risks associated with remote electronic voting, particularly host security and Internet availability. I was asked to write up my comments. The paper is available at http://avirubin.com/e-voting.security.html

Avi Rubin  http://avirubin.com/

[Added at PGN's request] The workshop (http://www.netvoting.org/), held in October in DC, was sponsored by the NSF at the directive of President Clinton to study the feasibility of electronic voting in public elections. Subject-matter experts were invited from the social science community, the technical and security community, election officials, and representatives from the Department of Justice. The meeting was chaired by Dan Mote, the president of the University of Maryland. Panels discussed issues such as what e-voting means, whether e-voting would improve accessibility, whether it would widen the digital divide, and whether more people would vote. On the technical side, there were panels about security requirements and the current state of desktop security as it relates to voting.
The mandate was to cover the following issues:

 * How to ensure the security and reliability of the voting process;
 * How to protect the privacy of voters;
 * How to authenticate voter identity;
 * How to achieve broad and equitable access to online voting systems;
 * How to assess the impact of online voting on representative democracy and community; and
 * How to ensure that online voting systems are convenient, flexible, and cost-effective.

The group is going to produce a report that will be submitted to the White House, to Congress, and to election officials all over the country. My participation was as a panelist on security. I wrote up my comments; they are available at http://avirubin.com/e-voting.security.html

Avi

------------------------------

Date: Tue, 7 Nov 2000 13:33:56 -0700
From: "Schlake ( William Colburn )"
Subject: Saturn made a bad assumption in my engine

I have a one-year-old Saturn. As a safety feature, my Saturn will prevent me from going faster than is safe for my suspension or tires. When I first got the car, I had to try this feature out, so I found a long, straight road and floored it. When I got to 105 mph the engine lost power and I slowed down. Experimentation revealed that I couldn't regain power until I dropped below 100 mph, after which I could accelerate again.

A couple of days ago I drove through a fairly steep chasm, with the road running straight down one side and up the other. I figured I needed as much momentum as possible, so I pushed the clutch in and coasted down. Somewhere along the way I hit 105 mph. Just as I was starting up the opposite side I noticed that virtually all of my warning lights were on, and the engine was at 0 rpm. A stopped engine means no power steering and no power brakes. I'm quite glad there weren't any turns or traffic that might have forced me to turn or brake.

The problem was the assumption that I had reached an excessive speed by using the engine to accelerate. The default action works great when the clutch is engaged. In my case, I ended up with a car that suddenly became very hard to control when I was already doing something unsafe.
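The difference between what the limiter apparently assumes and what it ought to check is easy to sketch. This is a guess at the logic, not Saturn's actual code, and the names and thresholds are invented for illustration:

  SPEED_CUT_MPH = 105     # invented thresholds
  SPEED_RESUME_MPH = 100

  def limiter_naive(speed_mph, power_on):
      """Assumes excessive speed can only come from the engine, so cutting
      engine power is always a safe response."""
      if speed_mph > SPEED_CUT_MPH:
          return False            # cut power: stalls a declutched, coasting engine
      if speed_mph < SPEED_RESUME_MPH:
          return True
      return power_on             # hysteresis between the two thresholds

  def limiter_safer(speed_mph, power_on, clutch_engaged, in_gear):
      """Cuts power only when the engine is actually driving the wheels, so a
      coasting car keeps its idle (and its power steering and brake assist)."""
      if speed_mph > SPEED_CUT_MPH and clutch_engaged and in_gear:
          return False
      if speed_mph < SPEED_RESUME_MPH:
          return True
      return power_on

In the coasting scenario, the naive version cuts power at 105 mph even though the clutch is in and the engine is merely idling, which is exactly the stall described above; the safer version would have left the idle alone.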
------------------------------

Date: Mon, 23 Oct 2000 13:44:00 -0400
From: "Gregory, Scott"
Subject: I crashed because my phone was ringing

Yahoo News today carried this item from the Reuters feed: "Smart Tires to Warn Drivers Via Mobile Messages".
http://dailynews.yahoo.com/h/nm/20001023/tc/tires_phones_dc_1.html

It describes impending tire developments from Finland that would put Bluetooth-enabled chips in the product. The tire will phone the driver if the pressure drops too low. Future developments include detection of wear, as well as of hydroplaning (the tire losing contact with the road due to water).

There are the standard security-type risks: which phone gets the message? Even better, there are timing and distraction risks. Like the fighter pilot with so much information that they cannot cope, the modern driver may soon have a phone ringing to tell them that they are losing control of the car. "Really, officer, it was an important call from my car that I had to answer. And see, I did hit the tree it told me I was going to."

Reading my analog speedometer, sdg

------------------------------

Date: Mon, 6 Nov 1972 16:46:10 -0500 (EST)
From: "James H. Paul"
Subject: Unplanned roll in NASA's X-38

*Aviation Week & Space Technology*, 6 Nov 2000, p. 24:

"NASA's X-38 Vehicle 131R did a slow, 360-deg. roll after release from its B-52 carrier aircraft on Nov. 2. It was the first free flight of the vehicle, which automatically stabilized under the preprogrammed deployment of a drogue chute and made a successful landing under parafoil on a dry lakebed runway, as scheduled, at Edwards AFB, Calif. The vehicle sustained no damage in the test. Project officials said they would have to do some trouble-shooting to figure out why the Crew Return Vehicle (CRV) prototype rolled at an estimated average rate of about 20 deg. per sec. during its 24 sec. of scheduled free flight. A software problem in the vehicle's flight control system was suspected, although project officials were also looking at whether aerodynamic disturbances immediately after separation might have played a role. Actual separation from the B-52 was clean, and the flight control system maintained angle of attack throughout the 18-sec. roll. The vehicle is an 80%-scale version of the CRV designed to provide emergency escape for International Space Station crews."

There's an F-16 test pilot somewhere thinking, "Been there, done that."

James Paul, Annandale, Virginia

------------------------------

Date: Fri, 20 Oct 2000 15:09:22 -0700
From: Jeff_Stieglitz@toyota.com
Subject: *Lack* of barcode causes train to trap passengers

Dozens of travelers were stuck on an underground passenger train at Denver International Airport on 19 Oct 2000 after a computer problem sent the train shooting past the main terminal. It took workers about 20 minutes to move the train to a station, where passengers got off. No one was injured. A circuit board on the automated train lost its memory and failed to read a bar code that signals the train to stop. The train overshot the station and safety mechanisms kicked in. [Source: AP item, 20 Oct 2000, PGN-ed]
http://www.cnn.com/2000/WORLD/europe/france/10/20/france.trial/index.html

The RISKS archives are full of engineering designs without fail-safe features. One would think that the train would have a hardware interlock to stop it if something disturbed the computer. A fail-safe design also suggests that sensor events should be required to enable and maintain motion rather than to stop it; the lack of sensor input or of a keep-alive should result in a graceful shutdown, as in the sketch below. I'd guess that this train requires the computer to unlock the doors, rather than a fail-safe design that would require the computer to keep them locked.
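Roughly, the fail-safe framing looks like this. It is purely illustrative, not the Denver train's actual control logic; the names and the timeout are invented:

  import time

  HEARTBEAT_TIMEOUT_S = 0.5   # invented: longest tolerated silence from the stop-marker reader

  class TrainMotionPermit:
      """Motion is a permission that sensors must keep renewing, not a default
      that a stop signal must interrupt."""

      def __init__(self):
          self.last_reader_report = 0.0
          self.doors_locked = False

      def reader_report(self):
          # Called whenever the bar-code/stop-marker reader checks in,
          # whether or not a marker is currently in view.
          self.last_reader_report = time.monotonic()

      def motion_permitted(self):
          reader_alive = (time.monotonic() - self.last_reader_report) < HEARTBEAT_TIMEOUT_S
          return reader_alive and self.doors_locked

      def control_step(self, apply_brakes):
          # Run periodically: losing the permission commands a graceful stop
          # rather than letting the train coast past the platform.
          if not self.motion_permitted():
              apply_brakes()

In this framing, the bar code's presence is not what stops the train; continued permission to move is what the sensors must keep granting, and a confused computer defaults to stopping.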
------------------------------

Date: Fri, 20 Oct 2000 15:36:25 -0400
From: "Stephen D. Holland"
Subject: No security in Internet-connectable laboratory instrument controller

National Instruments (http://www.ni.com) sells a device for gatewaying between Ethernet (TCP/IP) and the IEEE-488 (GPIB) bus commonly used for controlling laboratory instruments such as oscilloscopes, voltmeters, motion controllers, etc. I was somewhat astounded, upon purchase of this device, to find that it had no security whatsoever. That is, if you properly configure it and attach it to your instruments, anyone in the world with the proper software can control your lab equipment.

Worse, there is no mention in any of the documentation or marketing materials that security is an issue. The manual even suggests "If you are directly linked to the Internet... you can contact the... support department to update your firmware," without any consideration of the risks of the device being connected. Marketing materials also promote Internet-connected use without discussion of the risks involved. Securing this device requires putting it on its own ungatewayed Ethernet segment. This is not mentioned in the manual, and it reduces the utility of the device (existing wiring cannot be shared).

The RISKS:

 - End users could unknowingly assemble systems that are open to attack or accidental disruption by intruders. Most laboratory scientists are not particularly well versed in the minutiae of network security. As GPIB is commonly used for mechanical control, there is a real danger of physical damage.

 - Security issues are still sufficiently esoteric that a respected and generally competent company such as National Instruments can develop (and market for several years) a device for the Internet that has no security whatsoever.

I am hoping that National Instruments can develop a firmware update that adds at least a minimal passcode for access control.

Steve Holland, Dept. of Theoretical and Applied Mechanics, Cornell University  sdh4@tam.cornell.edu

------------------------------

Date: 5 Nov 2000 23:36:26 -0000
From: Charles Bryant
Subject: Risk of using 'meaningful' file names

A Milton Keynes Council worker sent a reply to one or more people who were commenting on proposals for a travellers' halting site. The letter had an embarrassing addition to the intended text: in small letters at the bottom was the file name, "H:\Gypsy letter to whingers.doc". Of course, this risk is equally the risk of not proof-reading a printed copy.

Charles Bryant - ch@chch.demon.co.uk

------------------------------

Date: Sat, 4 Nov 2000 06:30:37 -0800 (PST)
From: scs@eskimo.com (Steve Summit)
Subject: Re: Typo+"strange glitch"=private files world-readable (RISKS 21.09)

I'm not sure why "no one was sure exactly what triggered the glitch". It's reasonably obvious what happened: a sensitive file was accidentally left in a directory the web server could get to, the local search engine dutifully indexed it, and from then on it was sitting there just waiting for someone's search to unearth it.

Steve Summit

------------------------------

Date: Mon, 23 Oct 2000 08:18:27 -0800
From: "Rob Slade, doting grandpa of Ryan and Trevor"
Subject: REVIEW: "Virus Proof", Phil Schmauder

BKVRSPRF.RVW   20000711

"Virus Proof", Phil Schmauder, 2000, 0-7615-2747-8, U$34.99/C$48.95/UK#32.49
%A   Phil Schmauder
%C   3875 Atherton Road, Rocklin, CA  95765-3716
%D   2000
%G   0-7615-2747-8
%I   Prima Publishing/Jamsa Press
%O   U$34.99/C$48.95/UK#32.49 800-632-8676 www.primapublishing.com
%P   273 p. + CD-ROM
%T   "Virus Proof: The Ultimate Guide to Protecting Your PC"

On the very first page of this book we are told that viruses are written to steal or destroy "information that resides on your disk." (Viruses are written to reproduce.) The text then contradicts itself by saying that viruses may just print a message. Then we are told that you should never run programs downloaded from the Internet (downloading infected program files has always been a relatively minor vector). Along the way we are given such vital information as that viruses must get into your computer's RAM in order to do damage (*everything* has to get into your computer's RAM in order to do anything) and that viruses are exchanged on disks or in transferred files (that pretty much covers the field of data transport, wouldn't you say?).

Welcome to "Virus Proof," a collection of mistaken, valid, useless, and repetitive information. Sharp-eyed readers will have noted the inclusion of "valid" in that list. Unfortunately, you will have to be much more acute to pick out the true facts from the volume under discussion.
As the old saying goes, if you can tell good advice from bad advice, you don't need any advice.

Some of the errors in the book simply show that the author has not done his homework. (There is no evidence to suggest that the Michelangelo virus was written to "commemorate" the birth of Michelangelo the artist. The researcher who first reported the existence of the virus learned that the target date of March 6 was Michelangelo's birthday, and so used that name as a convenient label.) Some of the errors in the book are more seriously misleading. (The Michelangelo virus did not "occur" on March 6, 1992. It was, fortunately, discovered long before, possibly existed before March of 1991, and still results in regular computer erasures every March 6th to this date.)

The author does keep telling the reader not to use any data file, or run any program, until it has been scanned for viruses. That is good advice, as far as it goes. Unfortunately, it isn't very useful advice, and the constant repetition of that single injunction is likely to dull the reader to the necessary finer points.

The directive to scan everything isn't the only thing that gets repeated in the book. The first chapter manages to tell us once per page that computer programs are lists of instructions. Now, that statement is true: programs are sets of commands. But that bald assertion provides the normal computer user with no insight that could help with virus protection. One would think that the space dedicated to this piece of trivia could more helpfully be employed in presenting an accurate definition of viruses, or a list of the ways that you are most likely to get a virus these days.

In only four pages, chapter two presents serious misinformation. A boot sector does not show up in a list of files on a disk. Boot-sector infectors can infect non-bootable, and even "blank," disks. Trojan horse (or just "trojan") programs do not reproduce. A file-infecting virus is not referred to as a "Trojan Horse virus." The definition given for a worm (if you are making a distinction, the term "worm virus" makes no sense) clearly contradicts the declaration that a worm could also be a file infector. Most macro languages are not capable of supporting a successful virus: to date, only those written for Microsoft applications have presented any danger.

And so it goes. Virus writers don't need your password, and system security breakers (who dearly love the confusion over the term "hacker") don't bother with viruses. Being the first on your block to upgrade to new versions of programs can carry drastic security risks of its own. If you are not supposed to run anything you download from the Web, why are you supposed to upgrade your software over the Internet? Since viruses are appearing at the rate of hundreds per month, keeping up with the few that make it into [large AV corporation]'s press releases is unlikely to be very useful. Mailing lists and newsgroups are recommended without any analysis. Most recent e-mail viruses and worms harvest the addresses of regular correspondents, so the direction to avoid e-mail attachments from someone you don't know is almost worthless. Firewalls have nothing to do with viruses. If a virus infects a system file, knowing what programs are running on your computer is useless. Many loopholes have been found in the security of ActiveX controls: restricting operation to signed controls provides very little protection. Backups will help you recover if hit, but provide no inherent virus protection.
Knowing how to break into systems will not protect you from viruses, nor will seven pages of C source code for a variant of the Crack program. (For those script kiddies eager to learn how to break into systems, save your money. It doesn't tell you that, either.) Phone phreaking isn't that easy, trying the stuff in the book can get you arrested, and it has nothing to do with viruses. (And John Draper's own account, given on the site illustrated, contradicts the story in the book.)

Chernobyl is a variant of CIH, not the other way around. Backing up the Registry provides no inherent virus protection. Anonymizers for e-mail and Web browsing have nothing to do with viruses. Cookies have nothing to do with viruses. (Many of the points made about cookies are incorrect as well.) Happy99 used Usenet news as well as e-mail. Spam has almost nothing to do with viruses (and most of the recommended actions are not only useless but will annoy people who have better things to do). The material on virus hoaxes is limited, physically hard to read (small print), and contains no real analysis.

Chat has nothing to do with viruses. Denial-of-service attacks have little to do with viruses, chapter sixteen has *nothing* to do with viruses, and neither do six pages of SYN-attack source code. Privacy has nothing to do with viruses (and chapter seventeen has little to do with privacy). E-mail encryption has nothing to do with viruses. The Melissa virus was not polymorphic. Polymorphic viruses do not change their payloads. Virus "families" result from virus writers taking a given virus and making very minor changes to it. Digital signatures have little to do with viruses, and chapter nineteen does not discuss key management at all.

JavaScript is not a "cut down" version of Java, and it does not have Java's security model. E-commerce does not have anything to do with viruses. Y2K does not have anything to do with viruses. And, fortunately, the code presented in chapter twenty-five is nowhere near sufficient to create a working virus. (It is, however, enough to create serious problems for the person who tries to use it.)

Now, of course, a number of the items mentioned do have something to do with general security. Unfortunately, the level of detail given in the book is far from sufficient to protect the user against these threats. Indeed, the threats themselves are not described particularly well, and I could go through a very similar exercise in pointing out the weaknesses in the general security material.

Given the total size of the book, it really isn't a work on viruses. It throws together a random assortment of information (and misinformation) about a variety of security-related topics. Nothing is covered in depth, and nothing is covered completely accurately. Approximately half of the book is occupied with screenshots of miscellaneous Web sites, not always related to the topic under discussion (and a number of them repeated at random throughout the work), which detracts even more from the material that could have been provided. In short: a pamphlet on viruses, surrounded by some opining on security issues, buried within a lot of careless research.

copyright Robert M. Slade, 2000   BKVRSPRF.RVW   20000711

rslade@vcn.bc.ca  rslade@sprint.ca  slade@victoria.tc.ca  p1@canada.com
http://victoria.tc.ca/techrev or http://sun.soci.niu.edu/~rslade

------------------------------

Date: 15 Aug 2000 (LAST-MODIFIED)
From: RISKS-request@csl.sri.com
Subject: Abridged info on RISKS (comp.risks)

The RISKS Forum is a MODERATED digest.
Its Usenet equivalent is comp.risks.

=> SUBSCRIPTIONS: PLEASE read RISKS as a newsgroup (comp.risks or equivalent) if possible and convenient for you. Alternatively, via majordomo, SEND DIRECT E-MAIL REQUESTS with a one-line body of SUBSCRIBE (or UNSUBSCRIBE) [with net address if different from FROM:] or INFO [for unabridged version of RISKS information].
   .MIL users should contact (Dennis Rears).
   .UK users should contact .

=> The INFO file (submissions, default disclaimers, archive sites, copyright policy, PRIVACY digests, etc.) is also obtainable from
   http://www.CSL.sri.com/risksinfo.html
   ftp://www.CSL.sri.com/pub/risks.info
   The full info file will appear now and then in future issues.
   *** All contributors are assumed to have read the full info file for guidelines. ***

=> SUBMISSIONS: to risks@CSL.sri.com with meaningful SUBJECT: line.

=> ARCHIVES are available at ftp://ftp.sri.com/risks or via
   ftp ftp.sri.com
   login anonymous
   [YourNetAddress]
   cd risks
   [volume-summary issues are in risks-*.00]
   [back volumes have their own subdirectories, e.g., "cd 20" for volume 20]
   http://catless.ncl.ac.uk/Risks/VL.IS.html [i.e., VoLume, ISsue].
   http://the.wiretapped.net/security/textfiles/risks-digest/

==> PostScript copy of PGN's comprehensive historical summary of one-liners:
   illustrative.PS at ftp.sri.com/risks

------------------------------

End of RISKS-FORUM Digest 21.10
************************