1.1 BACKGROUND OF THE STUDY
Computer security is information
security as applied to computers and networks. The field covers all the
processes and mechanisms by which computer-based equipment, information
and services are protected from unintended or unauthorized access,
change or destruction. Computer security also includes protection from
unplanned events and natural disasters (Morrie Gasser, Building a Secure Computer System, 1988). In computer security, a vulnerability is a weakness which allows an attacker to reduce a system's information assurance.
Vulnerability is the intersection of three elements: a system
susceptibility or flaw, attacker access to the flaw, and attacker
capability to exploit the flaw. To exploit a vulnerability, an attacker
must have at least one applicable tool or technique that can connect to a
system weakness. In this frame, vulnerability is also known as the attack surface. Vulnerability management
is the cyclical practice of identifying, classifying, remediating, and
mitigating vulnerabilities. This practice generally refers to software vulnerabilities in computing systems (Paul A. et al., 2008).
A security risk may be classified as a
vulnerability. Using “vulnerability” with the same meaning as “risk”
can lead to confusion. Risk is tied to the potential of a
significant loss. Then there are vulnerabilities without risk: for
example when the affected asset
has no value. A vulnerability with one or more known instances of
working and fully implemented attacks is classified as an exploitable
vulnerability — a vulnerability for which an exploit
exists. The window of vulnerability is the time from when the security
hole was introduced or manifested in deployed software, to when access
was removed, a security fix was available and deployed, or the attacker
was disabled. A security bug (security defect)
is a narrower concept: there are vulnerabilities that are not related
to software; hardware, site, and personnel vulnerabilities are examples of
vulnerabilities that are not software security bugs (Paul A. et al., 2008).
Security by Design
One way to think of computer security is to treat security as one of the main features of the system from the start.
Some of the techniques in this approach include:
- The principle of least privilege: each part of
the system has only the privileges that are needed for its function;
that way even if an attacker gains access to that part, they have only
limited access to the whole system.
- Automated theorem proving, to prove the correctness of crucial software subsystems.
- Code reviews and unit testing are approaches to make modules more secure where formal correctness proofs are not possible.
- Defense in depth, where the design is such that more than one
subsystem needs to be violated to compromise the integrity of the system
and the information it holds.
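The least-privilege principle above can be sketched in code: a component is handed a narrowed facade instead of the full object, so compromising that component yields only limited authority. This is a hypothetical illustration; the class names are invented for this example.

```python
class Disk:
    """Full-privilege object: can both read files and wipe everything."""
    def __init__(self):
        self.data = {"report.txt": "quarterly numbers"}

    def read(self, name):
        return self.data[name]

    def wipe(self):
        self.data.clear()


class ReadOnlyDisk:
    """Least privilege: a viewer component receives only this facade,
    so even if the viewer is compromised, it has no wipe authority."""
    def __init__(self, disk):
        self._disk = disk

    def read(self, name):
        return self._disk.read(name)


disk = Disk()
viewer_view = ReadOnlyDisk(disk)          # viewer gets only read access
print(viewer_view.read("report.txt"))     # allowed
print(hasattr(viewer_view, "wipe"))       # False: destructive call absent
```

Even if an attacker fully controls the code holding `viewer_view`, the worst it can do is read; the destructive operation simply is not reachable from that reference.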
Default secure settings, and design to
“fail secure” rather than “fail insecure”. Ideally, a secure system
should require a deliberate, conscious, knowledgeable and free decision
on the part of legitimate authorities in order to make it insecure.
Audit trails track system activity,
so that when a security breach occurs, the mechanism and extent of the
breach can be determined. Storing audit trails remotely, where they can
only be appended to, can keep intruders from covering their tracks.
Full disclosure, to ensure that when bugs
are found the “window of vulnerability” is kept as short as possible
(Paul A. et al., 2008).
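The audit-trail technique above, with records that can only be appended, can be sketched minimally as follows. This is a hypothetical example; the file name and record format are invented, and a real deployment would store the log on a remote, append-only medium.

```python
import time


class AuditLog:
    """Append-only audit trail: records are only ever appended, so an
    intruder on the monitored host cannot rewrite history if the log
    lives on a store that permits appends but not rewrites."""

    def __init__(self, path):
        self.path = path

    def record(self, user, action):
        # Open in append mode so existing entries are never overwritten.
        with open(self.path, "a") as f:
            f.write(f"{time.time():.0f}\t{user}\t{action}\n")


log = AuditLog("audit.log")
log.record("alice", "login")
log.record("alice", "open file report.txt")
```

When a breach occurs, the sequence of recorded actions helps reconstruct the mechanism and extent of the intrusion, which is precisely what the text identifies as the purpose of audit trails.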
Security architecture can be defined as
the design artifacts that describe how the security controls (security
countermeasures) are positioned, and how they relate to the overall
information technology architecture. These controls serve to maintain
the system’s quality attributes:
- Assurance services
Hardware Mechanisms that protect computers and data
Hardware-based or hardware-assisted computer
security offers an alternative to software-only computer security.
Devices such as dongles, case intrusion detection, drive locks, or
disabled USB ports and CD-ROM drives may be considered more secure due
to the physical access required in order to compromise them.
Secure Operating System
One use of the term computer security
refers to technology to implement a secure operating system. Much of
this technology is based on science developed in the 1980s and used to
produce what may be some of the most impenetrable operating systems
ever. Though still valid, the technology is in limited use today,
primarily because it imposes some changes to system management and also
because it is not widely understood. Such ultra-strong secure operating
systems are based on operating system kernel technology that can
guarantee that certain security policies are absolutely enforced in an
operating environment. An example of such a computer security policy is
the Bell-LaPadula Model. The strategy is based on a coupling of special
microprocessor hardware features, often involving the memory management
unit, to a special correctly implemented operating system kernel. This
forms the foundation for a secure operating system which, if certain
critical parts are designed and implemented correctly, can ensure the
absolute impossibility of penetration by hostile elements. This
capability is enabled because the configuration not only imposes a
security policy, but in theory completely protects itself from
corruption. Ordinary operating systems, on the other hand, lack the
features that assure this maximal level of security. The design
methodology to produce such secure systems is precise, deterministic,
and logical. Systems designed with this methodology
represent the state of the art of computer security although products
using such security are not widely known. In sharp contrast to most
kinds of software, they meet specifications with verifiable certainty
comparable to specifications for size, weight and power. Secure
operating systems designed this way are used primarily to protect
national security information, military secrets, and the data of
international financial institutions.
These are very powerful security tools,
and very few secure operating systems have been certified at the highest
level to operate over the range of “Top Secret” to “unclassified”
(including Honeywell SCOMP, USAF SACDIN, NSA Blacker, and Boeing MLS
LAN). The assurance of security depends not only on the soundness of the
design strategy, but also on the assurance of correctness of the
implementation, and therefore there are degrees of security strength
defined for COMPUSEC. The Common Criteria quantifies security strength
of products in terms of two components, security functionality and
assurance level (such as EAL levels), and these are specified in a
Protection Profile for requirements and a Security Target for product
descriptions. None of these ultra-high-assurance secure general-purpose
operating systems has been produced for decades or certified under Common Criteria.
In USA parlance, the term High Assurance
usually suggests the system has the right security functions that are
implemented robustly enough to protect DoD and DoE classified
information. Medium assurance suggests it can protect less valuable
information, such as income tax information. Secure operating systems
designed to meet medium robustness levels of security functionality and
assurance have seen wider use within both government and commercial
markets. Medium-robustness systems may provide the same security functions
as high-assurance secure operating systems, but do so at a lower
assurance level (such as Common Criteria level EAL4 or EAL5). A lower
level means we can be less certain that the security functions are
implemented flawlessly, and that they are therefore less dependable. These systems are
found in use on web servers, guards, database servers, and management
hosts and are used not only to protect the data stored on these systems
but also to provide a high level of protection for network connections
and routing services. (Morgan K., 2009)
Access Control Lists and Capabilities
Within computer systems, two security
models capable of enforcing privilege separation are access control
lists (ACLs) and capability-based security. The semantics of ACLs have
been proven to be insecure in many situations, for example, the confused
deputy problem. It has also been shown that the promise of ACLs of
giving access to an object to only one person can never be guaranteed in
practice. Both of these problems are resolved by capabilities. This
does not mean practical flaws exist in all ACL-based systems, but only
that the designers of certain utilities must take responsibility to
ensure that they do not introduce flaws.
Capabilities have been mostly restricted
to research operating systems, while commercial OSs still use ACLs.
Capabilities can, however, also be implemented at the language level,
leading to a style of programming that is essentially a refinement of
standard object-oriented design. An open source project in the area is
the E language.
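The contrast between the two models above can be sketched in a few lines. This is a hypothetical illustration, not a real ACL or capability API: in the ACL model, authority is looked up by identity at access time, while in the capability model, authority travels with the reference itself, which is what sidesteps the confused-deputy problem.

```python
# ACL model: authority is looked up by identity at access time.
acl = {"payroll.txt": {"alice"}}          # only alice may read payroll.txt


def acl_read(user, name):
    if user not in acl.get(name, set()):
        raise PermissionError(name)
    return f"contents of {name}"


# Capability model: authority is the reference itself.
def make_read_capability(name):
    # Whoever holds this function can read the file; there is no
    # identity check, so a service acting for a client cannot be
    # tricked into using its own ambient authority on the client's
    # behalf (the confused-deputy problem).
    return lambda: f"contents of {name}"


cap = make_read_capability("payroll.txt")
print(acl_read("alice", "payroll.txt"))   # permitted by the ACL entry
print(cap())                              # permitted by mere possession
```

In the ACL version, a privileged deputy asked to read on behalf of an unprivileged caller would pass its own identity and succeed where the caller should not; in the capability version, the deputy can only act with capabilities it was explicitly handed.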
The most secure computers are those not
connected to the Internet and shielded from any interference. In the
real world, the most secure systems are operating systems where security
is not an add-on.
Computer security is critical in almost
any technology-driven industry which operates on computer systems. The
issues of computer-based systems and addressing their countless
vulnerabilities are an integral part of maintaining an operational industry.
Cloud Computing Security
Security in the cloud is challenging,
due to varied degrees of security features and management schemes within
the cloud entities. In this connection, one logical protocol base needs
to evolve so that the entire gamut of components operates synchronously
and securely (John R. Vacca (ed.), Computer and Information Security
Handbook; Morgan K., 2009).
Despite significant advances in the
state of the art of computer security in recent years, information in
computers is more vulnerable than ever. Each major technological advance
in computing raises new security threats that require new security
solutions, and technology moves faster than the rate at which such
solutions can be developed. We would be fighting a losing battle, except
that security need not be an isolated effort: there is no reason why a
new technology cannot be accompanied by an integrated security strategy,
where the effort to protect against new threats only requires filling
in a logical piece of a well-defined architecture.
I probably cannot change the way the
world works, but understanding why it works the way it does can help me
avoid the typical pitfalls and choose acceptable security solutions.
This chapter explores some of the classic reasons why the implementation
of security lags behind its theory.
Why are computer systems so bad at
protecting information? After all, if it is possible to build a system
containing millions of lines of software (as evidenced by today’s large
operating systems), why is it so hard to make that software operate
securely? The task of keeping one user from getting to another user’s
files seems simple enough, especially when the system is already able to
keep track of each user and each file. In fact, it is far easier to
build a secure system than to build a correct system. But how many large
operating systems are correct and bug-free? For all large systems,
vendors must periodically issue new releases, each containing thousands
of lines of revised code, much of it bug fixes. No major
operating system has ever worked perfectly, and no vendor of an
operating system has dared offer a warranty against malfunctions.
The industry seems resigned to the fact
that systems will always have bugs. Yet most systems are reasonably
dependable, and most of them adequately (but not perfectly) do the job
for which they were designed. What is adequate for most functions,
however, is not sufficient for security. If I find an isolated bug in
one function of an operating system, I can usually circumvent it, and
the bug will have little effect on the other functions of the system:
few bugs are fatal. But a single security “hole” can render all of the
system’s security controls worthless, especially if the bug is
discovered by a determined penetrator. I might be able to live in a
house with a few holes in the walls, but I will not be able to keep
burglars out. (www.wikipedia.com)
Computer Security Policy
Cyber-security Act of 2010
On April 1, 2009, Senator Jay
Rockefeller (D-WV) introduced the "Cyber-security Act of 2009 - S. 773"
in the Senate; the bill, co-written with Senators Evan Bayh
(D-IN), Barbara Mikulski (D-MD), Bill Nelson (D-FL), and Olympia Snowe
(R-ME), was referred to the Committee on Commerce, Science, and
Transportation, which approved a revised version of the same bill (the
"Cyber-security Act of 2010") on March 24, 2010. The bill seeks to
increase collaboration between the public and the private sector on
cyber-security issues, especially those private entities that own
infrastructures that are critical to national security interests (the
bill quotes John Brennan, the Assistant to the President for Homeland
Security and Counterterrorism: "our nation’s security and economic
prosperity depend on the security, stability, and integrity of
communications and information infrastructure that are largely privately
owned and globally operated" and talks about the country's response to a
"cyber-Katrina".), increase public awareness on cyber-security issues,
and foster and fund cyber-security research. Some of the most
controversial parts of the bill include Paragraph 315, which grants the
President the right to "order the limitation or shutdown of Internet
traffic to and from any compromised Federal Government or United States
critical infrastructure information system or network." The Electronic
Frontier Foundation, an international non-profit digital rights advocacy
and legal organization based in the United States, characterized the
bill as promoting a "potentially dangerous approach that favors the
dramatic over the sober response" (Maj. Dawood F., 2012).
1.2 Problem Statement
There is a cogent need for additional security measures on PCs,
basically to ensure the integrity of data and keep it from being
compromised, since most of the security mechanisms predefined by the
Windows operating system have been mastered by hackers and intruders.
This led to the development of a third-party system which can run on the
Windows platform and perform the same functionality with enhanced
features. These features are accentuated with a time-scheduling system
in which the times to lock and unlock automatically are set.
1.3 Aims and Objectives
The aim of this project is to provide
additional security to an existing password security already
incorporated by the Microsoft Windows Operating System in order to
strengthen the security level of a PC.
The objectives are as follows:
- To design a precise lock system for personal computers (PCs) that can be controlled and managed locally.
- To ensure adequate security measures and protect the PCs with a third-party security system.
- To strengthen the security level of personal computers by ensuring the system's applicability across subsystems.
1.4 Scope of Study
The system is seamless and robust, as its functionality extends to
securing the computer from unauthorized access. Among the several
security measures that have been developed, the password method is
regarded as a simple and effective way to ensure security. The proposed
method is a standalone system, and it provides settings that can be used
to automate the lock depending on a time schedule and restriction
privileges.
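The time-schedule automation described in this section can be sketched as follows. This is a hypothetical example: the lock and unlock times are assumed values, and a real implementation would invoke the operating system's lock mechanism instead of returning a boolean.

```python
from datetime import time as clock

# Assumed schedule for illustration: lock outside working hours.
LOCK_AT = clock(18, 0)     # lock at 6:00 pm
UNLOCK_AT = clock(8, 0)    # unlock at 8:00 am


def should_be_locked(now):
    """True when the wall-clock time falls inside the locked window.

    The window wraps past midnight (18:00 today -> 08:00 tomorrow),
    so the condition is an OR of the two half-ranges."""
    return now >= LOCK_AT or now < UNLOCK_AT


print(should_be_locked(clock(20, 30)))  # evening -> True
print(should_be_locked(clock(12, 0)))   # midday  -> False
```

A scheduler would call `should_be_locked` periodically (or on the idle timer described in section 1.5) and trigger the lock screen whenever it returns true.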
1.5 Significance of the Study
The essence of this application is to
provide an adequate security measure to counter any attack from
unauthorized users by enabling an automated system that locks the
computer if it remains idle for a certain period of time as scheduled by
the user. The crucial points that this application emphasizes are listed below.
- Protect the user’s computer system if the operational module of the system remains idle for a certain specified period of time.
- Hash the login credentials using the MD5 hash function.
- The application runs strictly as a standalone, non-database system.
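The MD5 hashing of login credentials mentioned above can be sketched with Python's standard `hashlib` module. Note that MD5 is a hash function rather than an encryption method, and it is no longer considered strong against deliberate attack; it appears here only because the design above specifies it. The function and variable names are invented for this sketch.

```python
import hashlib


def hash_password(password: str) -> str:
    """Return the hex MD5 digest of a password.

    MD5 is a one-way hash, not encryption: the stored digest cannot
    be decrypted back into the password, so logins are verified by
    hashing the attempt and comparing digests."""
    return hashlib.md5(password.encode("utf-8")).hexdigest()


# The application would store this digest instead of the plaintext.
stored = hash_password("s3cret")


def check_login(attempt: str) -> bool:
    # Compare digests, never plaintext passwords.
    return hash_password(attempt) == stored


print(check_login("s3cret"))  # True
print(check_login("guess"))   # False
```

Because only the 32-character digest is stored, an intruder who reads the stored value still does not learn the password directly, which is the property the application relies on.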
1.6 Project Layout
- Chapter one covers the introductory part and the background of the
study. This chapter introduces the reader to the topic of the project
work and what it is expected to cover as a whole.
- Chapter two is the literature review; this focuses on related topics
and emphasizes the applicability of comparable projects with respect to
their functionalities as a standard requirement in real life, as
documented in magazines, journals, and articles.
- Chapter three is the system analysis, which illustrates the analysis
engaged in carrying out the proposed system with regard to user
requirements and the software engineering development life-cycle.
- Chapter four is the system design and implementation; this focuses on
the design patterns and architectures that best fit the design of the
proposed system. The design is interpreted with conceptual data
structures and Unified Modeling Language (UML) diagrams for visual
representation of data.
- Chapter five is the conclusion, summary, and recommendations. The
conclusion documents the outcome of the work, the summary gives a brief
description of the proposed system, and the recommendations illustrate
areas that need improvement and further work to handle the process
better in terms of efficiency, response time, and throughput.