Tech 581W Computer Network Operations, Laboratory 1: Team 5


In this lab we review the pros and cons of penetration testing, the methods and tools used in penetration testing, and the creation of lab environments for performing it.  One method of penetration testing is a technological, tool-based approach that attacks points in a system corresponding to the layers of the OSI seven-layer model.  In addition to the original seven layers, we examine two additional layers, people and kinetics.  We create a matrix that gives examples of tools that could be used for attacks, the OSI layer each tool corresponds to, and the McCumber Cube coordinates that the tool would impact.

Literature Review

The primary focus of the literature in Lab 1 is the creation of various laboratory environments to support penetration testing and an attack-based security strategy.  Some of the papers also focus on the benefits of an attack-based strategy versus a defensive strategy.  These two topics are certainly connected and a good starting point for our TECH581W class, since acceptance of an attack-based strategy for security training is assumed.

The first paper in this set to discuss the benefits of ethical hacking services and penetration testing is (Gula, 1999), which, though a vendor whitepaper, makes many interesting points about items that are often overlooked in penetration testing.  The publisher, Enterasys, does business in the field of penetration testing, which needs to be taken into consideration when reviewing the contents of the paper.  The author does not sell penetration testing so much as point out what is often missed in it, presumably to show that his company’s service is superior.  No quantitative research is given to show that the listed items are often missed by penetration testers, simply that they “normally go untested as considered by the author” (Gula, 1999, p. 6).  The history of penetration testing, along with a good argument for its necessity based on the amount of risk an organization is willing to tolerate from electronic intruders, is discussed in (Coffin, 2003).  The author gives a very simple example of what a penetration test may cost versus downtime for an electronic commerce site.  Of special note is the author’s inclusion of the limitations of penetration testing and ethical hacking: just because a site is not compromised by the testers does not necessarily mean it is secure, only that those particular testers did not find any vulnerabilities at that point in time with their specific skill set.  In (Arce & McGraw, 2004), the authors discuss various aspects of attack-based penetration testing and acknowledge that “the majority opinion is that the only way to properly defend a system against attack is to understand what attacks look like as deeply and realistically as possible” (p. 17).  The authors acknowledge the prevalence of the buffer overflow attack and discuss newer, more advanced methods of buffer overflow attacks.
Also important to a red team in a penetration test is documentation of the methods employed and their results.  In (Ray, Vemuri, & Kantubhukta, 2005) the authors describe a method of using XML to make their attacks reusable for other engagements.  This not only creates a standardized form of documenting what was done but also allows easier sharing of the information with other testers in the organization.
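The idea of capturing an attack as structured XML can be illustrated with a short sketch.  The element names below are our own illustration, not the schema defined by Ray, Vemuri, & Kantubhukta; the paper specifies its own attack model.

```python
# Illustrative sketch: record one penetration-test step as XML so it can be
# reused and shared with other testers.  The element and attribute names are
# hypothetical, not the schema from (Ray, Vemuri, & Kantubhukta, 2005).
import xml.etree.ElementTree as ET

def record_attack(name, target, tool, result):
    """Build an XML element describing a single attack step."""
    attack = ET.Element("attack", name=name)
    ET.SubElement(attack, "target").text = target
    ET.SubElement(attack, "tool").text = tool
    ET.SubElement(attack, "result").text = result
    return attack

step = record_attack("smtp-user-enum", "mail.example.com", "SMTP-Vrfy", "success")
print(ET.tostring(step, encoding="unicode"))
```

A record like this could be archived after each engagement and replayed or adapted on the next one, which is the reuse benefit the paper describes.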

The one paper in this first group of readings that detailed weaknesses in penetration analysis was (Du & Mathur, 1998).  The authors suggest that penetration testing requires one to already know about flaws that might exist in the system, which can significantly affect the validity of the test; this is certainly something that cannot be overlooked.  As mentioned in (Coffin, 2003), an unsuccessful penetration test does not mean that the system is completely secure.  Du & Mathur (1998) introduce the idea of fault injection into systems.  This method of testing attempts to cause the system to fail by injecting faults and bad data.  The authors present a model for identifying areas that would be good candidates for testing and a methodology for analyzing results.  Fault injection complemented by penetration testing would yield optimal results but would require separate teams of experts for each part.
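As a minimal sketch of the fault-injection idea, one can feed deliberately bad inputs to a routine and check that it fails safely rather than misbehaving.  This only illustrates the general technique, not the specific model Du & Mathur present; the routine and inputs here are invented for the example.

```python
# Minimal fault-injection sketch: inject malformed and out-of-range inputs
# into a routine and verify it rejects them cleanly instead of crashing or
# returning garbage.  Illustrative only; not the Du & Mathur model.
def parse_port(text):
    """Parse a TCP/UDP port number, rejecting anything out of range."""
    value = int(text)              # raises ValueError on malformed input
    if not 0 < value < 65536:
        raise ValueError("port out of range: %d" % value)
    return value

faults = ["80", "-1", "70000", "abc", ""]   # mix of good and injected-bad data
for fault in faults:
    try:
        print(repr(fault), "->", parse_port(fault))
    except ValueError as err:
        print(repr(fault), "-> rejected:", err)
```

Each rejected input is evidence the routine handles that fault; an uncaught crash or a silently accepted bad value would mark an area worth attacking.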
The bulk of the readings in Lab 1 discuss the creation of lab environments and curricula to facilitate training security practitioners in the field of penetration testing.

(Holland-Minkley, 2006; Micco & Rossman, 2002; Mink & Freiling, 2006) all discuss teaching attack-based security classes and the necessity of having a lab for testing.  While (Micco & Rossman, 2002) was primarily concerned with detailing issues that may arise when running a cyberwarfare class, the other two papers were concerned with the “success” of the class, which they based on empirical evidence gathered from students.  Another topic raised in these papers was getting permission from the institutions to run a class that could train potential attackers of their own networks.  One important thing stressed in the classes was ethics and responsibility surrounding the knowledge gained through them.  The basis of (Mink & Freiling, 2006) is whether students should learn about attacking systems, what can be learned from doing so, and whether defense is more important than attack.  The authors also created a method to measure the results of this type of learning, though it strays from what the title of the paper suggests: given the question “is attack better than defense,” one would expect the method to measure the benefits and disadvantages of attacking versus defending.  As with security and usability, it is not possible to have one without the other, and the circumstances the professional is in determine which takes precedence.

Due to the nature of penetration testing, many of the tools used can be harmful to the normal operation of systems.  Because of that, when teaching ethical hacking classes, lab environments need to be designed so that they do not impact the rest of the organization’s systems or the Internet at large.  The course instructors in (Micco & Rossman, 2002) designed a lab that was cut off completely from the rest of the school’s network and had an Internet connection only through a modem, under a special agreement with the Internet service provider.  In the DETER testbed, the organizers designed a system that was accessible remotely to multiple users and could handle running simultaneous, isolated tests (Benzel et al., 2006, p. 2).  Another method of providing an isolated environment for the purpose of penetration testing is virtualization.  In (Heien, Massengale, & Wu, 2008, p. 76) the authors use Qemu to create a virtualized environment with multiple clients and subnets, all without the need for additional hardware.  VMware was evaluated in (Micco & Rossman, 2002, p. 24), but the authors thought it would be of little benefit.  They chose Linux because it was free, NT being “much more expensive,” creating a homogeneous testbed which at the time probably represented only a very small fraction of the operating system market.  The authors do concede that a heterogeneous network would have been much more realistic.  Our lab environment for this class will consist of an isolated network between hosts running in VMware.  This will allow us to maximize department resources and run multiple virtual hosts simultaneously without interrupting the other teams.
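An isolated virtual lab of this kind can be scripted.  The sketch below builds Qemu command lines for two guests joined only by a socket back-end, with no bridge to the host network; the disk image names, memory size, and port are assumptions for illustration, not the configuration used in any of the papers or in our lab.

```python
# Sketch: build qemu command lines for two VMs connected by an isolated
# socket network (no host bridge, so lab traffic cannot leak out).  Image
# names, memory size, and the TCP port are hypothetical placeholders.
def qemu_cmd(disk, netdev):
    """Return the argv for one guest attached to the given -netdev spec."""
    return ["qemu-system-x86_64", "-m", "512", "-nographic",
            "-hda", disk,
            "-netdev", netdev,
            "-device", "e1000,netdev=lab0"]

# One guest listens on a local socket; the other connects to it.
attacker = qemu_cmd("attacker.img", "socket,id=lab0,listen=:10001")
victim   = qemu_cmd("victim.img",   "socket,id=lab0,connect=127.0.0.1:10001")
print(" ".join(attacker))
```

The two guests see each other on a private Ethernet segment, which mirrors the isolation goal of the modem-fenced lab in (Micco & Rossman, 2002) at much lower cost.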

Lab Exercise

In addition to creating our lab environment, we were also tasked with creating an attack matrix based on a nine-layer OSI model combined with the McCumber Cube (McCumber, 1991).  By examining the layers of the OSI model we are able to identify attack tools that could exploit a system at each of those levels.  Adding the McCumber Cube coordinates gives an additional layer of visibility into what areas an organization may need to focus its efforts on.  We examined tool suites such as BackTrack, combined with Internet search engines and our personal knowledge, to compile the list.

OSI Layer | Tools | McCumber Cube Coordinate
People/8 | Careerbuilder, Dropbox, Facebook, Myspace, Twitter, Spy | Confidentiality, Storage, People
People/8 | Saboteur | Integrity, Storage, People
Application/7 | Absinthe, Checkpwd, Cisco Auditing Tool, Cisco Global Exploiter, Dig, Dnstracer, Dnsmap, Finger Google, Fuzzer, GFI LanGuard, Hydra, JBroFuzz, John the Ripper, Lodowep, Maltego, Medusa, Metagoofil, Metasploit, Metoscan, Mibble MIB Browser, Mistress, Nessus, Nikto, Nmbscan, Oracle Auditing Tools, PBNJ, Peach, PStoreView, RevHosts, RPCDump, Sidguess, SMTP-Vrfy, SQLbrute, Sqldict, Subdomainer, VNC_bypauth, whoami, WyD | Confidentiality, Storage, Technology
Application/7 | Achilles, Dmitry, Halberd, Httprint, Mezcal, Relay Scanner, TFTP-Brute, VNCrack | Confidentiality, Transmission, Technology
Application/7 | XSpy | Confidentiality, Processing, Technology
Application/7 | Sqlupload, Tini, Zodiac | Integrity, Storage, Technology
Application/7 | Pirana, Wapiti | Integrity, Transmission, Technology
Application/7 | Barrier | Integrity, Processing, Technology
Application/7 | Dnswalk | Availability, Storage, Technology
Application/7 | Bed, Cisco Torch | Availability, Transmission, Technology
Presentation/6 | OpenSSL-Scanner | Confidentiality, Storage, Technology
Presentation/6 | Squirtle | Integrity, Processing, Technology
Session/5 | Fierce, Mbenum, PcapSipDump, PcapToSip_RTP, SIPcrack, SIPDump, SIPp, SIPSak | Confidentiality, Transmission, Technology
Session/5 | FPort | Integrity, Processing, Technology
Session/5 | CIRT Fuzzer, ICMPTX, Juggernaut, NSTX | Availability, Transmission, Technology
Network/3 | Afrag, Airbase-ng, Aircrack-ng, Airdecap-ng, Airodump-ng, Airmon-ng, Aireplay-ng, Airpwn, airoscript, AirSnarf, Angry IP Scanner, ASLeap, Autonomous System Scanner, Autoscan, Cheops-ng, EtterCap, Fping, IKE-Scan, IKEProbe, itrace, Mailsnarf, Netdiscover, Netenum, Netmask, Nmap, PHoss, Ping, Protos, PSK-Crack, Psyche, ScanLine, Wireshark | Confidentiality, Transmission, Technology
Network/3 | IRDP Responder | Integrity, Transmission, Technology
Data Link/2 | Bluesnarfer, BTcrack, BTscanner, Redfang | Confidentiality, Transmission, Technology
Data Link/2 | Bluebugger, Blueprint, Bluesmash, Cain & Abel, Carwhisperer, File2Cable, Gobbler, HCIDump, MacChanger, MDK3, Minicom, ObexFTP, SMAC, Smap, Ussp-Push, Wellenreiter, Wicrawl, WifiTap, WifiZoo | Integrity, Transmission, Technology
Physical/1 | Fiber tap, Hardware Keylogger, “Pringles can” antenna, TEMPEST, Wiretapping | Confidentiality, Transmission, Technology
Kinetic/0 | Arson, Nuclear Bomb, Ship Anchor, Shovel | Availability, Transmission, Technology
Kinetic/0 | EMP | Availability, Processing, Technology
Kinetic/0 | Dumpster Diving | Confidentiality, Storage, Process
Kinetic/0 | Lock Pick | Confidentiality, Storage, Technology
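The matrix above lends itself to a simple data structure that could be queried when planning a test.  The sketch below transcribes a handful of rows from our table; it is an illustration of how the matrix could be represented, not a tool we used in the lab.

```python
# A few rows of the attack matrix, transcribed from the table above, as
# tool -> (OSI layer, (CIA property, information state, McCumber dimension)).
# Only a small sample of the full table is included here.
MATRIX = {
    "Nmap":       (3, ("Confidentiality", "Transmission", "Technology")),
    "Wireshark":  (3, ("Confidentiality", "Transmission", "Technology")),
    "Metasploit": (7, ("Confidentiality", "Storage", "Technology")),
    "Dnswalk":    (7, ("Availability", "Storage", "Technology")),
    "Lock Pick":  (0, ("Confidentiality", "Storage", "Technology")),
}

def tools_at_layer(layer):
    """Return, sorted, the tools the matrix places at a given OSI layer."""
    return sorted(tool for tool, (lyr, _) in MATRIX.items() if lyr == layer)

print(tools_at_layer(3))   # -> ['Nmap', 'Wireshark']
```

A tester could extend the same lookup to filter by McCumber coordinate, for example listing every tool that affects availability of data in transmission.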


Since the original seven-layer OSI model exists inside a computer system, the majority of the tools identified fall in the technology area of the McCumber Cube rather than people or process.  Any interaction which requires cognition and action on the part of the user would move into the people or process area of the cube.  Based on this, one might see a substantial bias towards a tool-based, technological approach to penetration testing; while that is certainly an important part of the testing procedure, many of the areas to attack are outside the technology realm and, in many cases, are substantially easier to attack.  If this matrix were presented to a customer, the testers should note that it is only an aid in seeing where these technological tools affect the organization, not a breakdown of the threats and vulnerabilities it may be exposed to.


Many of the tools in the list from BackTrack could fall into multiple categories, so we took the approach of placing each tool where we felt it was best suited to impacting confidentiality, integrity, or availability; essentially, where we felt the tool would do the most damage.  One of the most difficult placements was Wireshark, since this tool can operate at layers 1 through 7.  We felt that by placing it at layer 3 we would be able to monitor the vast majority of network traffic, since most is IP based, and use that common factor when working up the stack as necessary.  Another issue encountered was a misunderstanding of the number of tools to be listed in the threat matrix.  This was noted by the instructor before submission and required a rebuild of the original table.


This lab was a serious mental exercise that stretched our thinking about how we view tools.  By placing such a wide variety of tools into a matrix, we saw patterns develop at different layers of the OSI model.  We saw a heavy focus on tools affecting confidentiality and data in storage at the application layer, while tools at layers 3 and 4 affected confidentiality and data in transmission.  These findings would be useful when developing a plan for penetration testing, by showing where to concentrate efforts and which tools to use when attacking the various layers of an organization’s security.

Works Cited

Arce, I., & McGraw, G. (2004). Guest Editors’ Introduction: Why Attacking Systems Is a Good Idea. Security & Privacy, IEEE, 2(4), 17-19.

Benzel, T., Braden, R., Kim, D., Neuman, C., Joseph, A., Sklower, K., et al. (2006). Experience with DETER: A Testbed for Security Research. Paper presented at the Proceedings of the 2nd IEEE Conference on Testbeds and Research Infrastructures for Development of Networks and Communities.

Coffin, B. (2003). It Takes a Thief: Ethical Hackers Test Your Defenses. Risk Management Magazine, 50, 10-14.

Du, W., & Mathur, A. (1998). Vulnerability Testing of Software System Using Fault Injection. Purdue University.

Gula, R. (1999). Broadening the Scope of Penetration-Testing Techniques: Enterasys Networks.

Heien, C., Massengale, R., & Wu, N. (2008). Building a Network Testbed for Internet Security Research. Paper presented at the Sixth Annual Consortium for Computing Sciences in Colleges Mid-South Conference.

Holland-Minkley, A. M. (2006). Cyberattacks: A Lab-based Introduction to Computer Security. Paper presented at the Proceedings of the 7th Conference on Information Technology Education.

McCumber, J. R. (1991, October). Information Systems Security: A Comprehensive Model. Paper presented at the 14th National Computer Security Conference.

Micco, M., & Rossman, H. (2002). Building a Cyberwar Lab: Lessons Learned: Teaching Cybersecurity Principles to Undergraduates. Paper presented at the Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science Education.

Mink, M., & Freiling, F. C. (2006). Is Attack Better than Defense?: Teaching Information Security the Right Way. Paper presented at the Proceedings of the 3rd Annual Conference on Information Security Curriculum Development.

Ray, H. T., Vemuri, R., & Kantubhukta, H. R. (2005). Toward an Automated Attack Model for Red Teams. Security & Privacy, IEEE, 3(4), 18-25.


Appendix: Tool List by OSI Layer

Layer 8

Dropbox –

Facebook –

Myspace –

Monster –

Twitter –

Layer 7

Absinthe –

Achilles –

Barrier –

Bed –

Checkpwd –

Cisco Auditing Tool –

Cisco Global Exploiter –

Cisco Torch –

Dig –

Dmitry –

Dnstracer –

Dnswalk –

Dnsmap –

Finger Google –

Fuzzer –

GFI LanGuard –

Halberd –

Httprint –

Hydra –

JBroFuzz –

John the Ripper –

Lodowep –

Maltego –

Medusa –

Metagoofil –

Metasploit –

Metoscan –

Mezcal –

Mibble MIB Browser –

Mistress –

Nessus –

Nikto –

Nmbscan –

Oracle Auditing Tools –


Peach –

Pirana –

PStoreView –

Relay Scanner –

RevHosts –

RPC Dump –

Sidguess –

SMTP-Vrfy –

SQLbrute –

sqldict –

sqlupload –

Subdomainer –

Tini –

VNC_bypauth –

VNCrack –

Wapiti –

whoami –

WyD –

Yersinia –

XSpy –

Zodiac –

Layer 6

OpenSSL-Scanner –

Squirtle –

Layer 5

Fierce –

Marathon –

Mbenum –

PSTools –

SIPcrack –

SIPDump –

SIPp –

SIPSak –

Smap –

Layer 4

0trace –

Amap –

CIRT Fuzzer –

CryptCat –

Driftnet –

Dsniff –

Etherape –

Firewalk –

FPort –

Hping –

Hping2 –

Hping3 –


Iodine –

InTrace –

Juggernaut –

Matahari –

Netcat –

P0f –

Privoxy –

ProxyTunnel –

Taof –

TcPick –

TCPtraceroute –

TCtrace –

Scanrand –

SinFP –

Spike –

SuperScan –

UnicornScan –

XProbe2 –

Layer 3

Afrag –

Aircrack-ng –

Airdecap-ng –

Aireplay-ng –

Airmon-ng –

Airpwn –

AirSnarf –

Angry IP Scanner –

ASLeap –

Autonomous System Scanner –

Autoscan –

Cheops-ng –

Ethereal –

Ettercap –

Fping –

itrace –

IKE-Scan –

IKEProbe –

IRDP Responder –

Mailsnarf –

Netdiscover –

Netenum –

Netmask –

Nmap –

PHoss –

Ping –

Protos –

PSK-Crack –

Psyche –

ScanLine –

Wireshark –

Layer 2

BlueBugger –

Blueprint –

Bluesmash –

Bluesnarfer –

Btcrack –

Btscanner –

Cain & Abel –

Carwhisperer –

File2Cable –

Gobbler –


MacChanger –

MDK3 –

Minicom –

ObexFTP –

PcapSipDump –

PcapToSip_RTP –

Redfang –

Ussp-Push –

Wellenreiter –

Wicrawl –

Wifi Tap –

WifiZoo –
