Cloud Computing

A. An important benefit of network documentation is that it promotes better
understanding, since both staff and users need to understand the respective
areas of the network that concern their work.
The implementation of the network security document should start from the
company objectives and requirements, which the network developers need in
order to design a network system applicable to the company setup. At times,
however, the company requirements may not fit the implementation; the network
designers are then expected to adjust the network and train the system
managers within the company, so that users and management learn how to adapt
to the new network system and the company obligations are accomplished
accordingly (Kaushik, 2009).
Compared with other companies that use such systems, the company should use a
network template so that it can produce a logical system able to satisfy
customer and user requirements to the fullest. Drawing the attention of
customers and users to the network's integrity is achieved through a
hierarchical approach to network management; the uniqueness of the new system
is a boost to security, since most intruders are accustomed to the common
systems (Mirchandani, 2010).
Not everybody within the company will observe the company regulations, so the
security department has to be handled with care to keep the company from
encountering cases of system failure or intrusion. Malicious individuals rely
on information obtained from company members and documentation in order to
design a system that would compromise the available one (Mirchandani, 2010).
Since the company is able to retain, fire, and promote employees, there is a
need to place individuals of good character, with a positive perception of
the organization's progress, in the network security dockets. This ensures
that the company's requirements are met and that information is handled by
responsible employees with the integrity and etiquette required by their
allocated portfolios (Collier, 2011).
c. Design a logical and physical topographical layout of the current and
planned network
The ability to overcome the company's challenges rests on the security
provided by the technology. The main recommendation the system can implement
is that data handling be managed by a specific group of individuals with the
mandate; since not everybody is able to handle data, data security and
integrity are enforced. This creates confidence in the company's management
and data handling procedures (Collier, 2011).
The ability to provide all users and customers with the required information
at any time, from any geographical location, is an advantage, since it
reduces the cost of movement and product allocation: every product offered by
the company and its competitors is easily available within the cloud, a
secure and protected area for both national and international marketing
(Mirchandani, 2010).
Project Role | Description | ITS Resource | Other Agency Resource
Sponsor: Sustain executive and organizational commitment and support for the
warehouse project. Communicate business direction changes to senior
management.
CIO: Executive officer for ITS.
COO: Executive officer for ITS hosting.
Computing Services Executive: Sustain executive and organizational commitment
and support for the statewide rollout of the data warehousing application's
hardware and OS components. Communicate business direction changes to ITS
senior management for hardware and OS components. Communicate any potential
changes to the customer that may impact warehouse hosting services.
Software & Systems Integration Director: Responsible for delivering the
infrastructure needed to support environments hosted at ITS. Provide daily
operational support for the environments at ITS.
Operations Director: Responsible for supporting data center environmental
needs for applications hosted at ITS. Provide daily operational support for
the environments at ITS. Provide daily platform monitoring and management of
HW & OS.
Operations Manager: Responsible for coordinating with the project team to
provide power and space for the hardware needed.
Telecommunications Services Executive: Oversees ITS Telecommunications and
Networking. Oversees the Network Monitoring Team.
Network Monitoring Manager: Statewide IP network; statewide SNA network; IP
data services; security services; network management.
Network Design & Support: Design and support the network infrastructure.
Firewalls: Configure firewalls to meet application needs and follow security
protocols.
Project Manager: Initial POC. Hold the initial meeting to explain concept,
approach, roles, and timeline. Maintain open communication with all groups.
Utilize standards and procedures. Coordinate activities among multiple
parties. Publish documents to the respective parties. Overall project status
reporting.
PMO Office: Assist with project planning, schedule planning, communication,
risk assessment and risk mitigation planning, and post-project reviews.
Technical Project Integrator: Provide leadership on the technical components
of the project. Ensure communication and collaboration among the technical
teams working on implementation and ongoing support, including managing
cooperative relationships with project team members and vendors. Enable ITS
to establish a hosting environment that is architected, deployed, and
maintained in a manner that meets or exceeds the SLAs to be established
between the parties.
Service Coordinator Manager: Responsible for providing service coordination
for the Computing Services division.
CS Project Coordinator: Support the PM and Technical Integrator on all
project activities for the Computing Services group at ITS.
Customer Service Representatives: Follow identified processes to serve.
IT Database Manager: Responsible for managing database services for databases
located at ITS.
Database: Manage database resources. Monitor system traffic. Troubleshoot any
DB issues.
UNIX: Manage the UNIX environment. Troubleshoot any UNIX issues.
INTEL: Manage the INTEL environment. Troubleshoot any INTEL issues.
IT Storage Management Manager: Responsible for managing the storage
management team.
Storage Design: Architect and design the storage plan.
Storage Operations: Manage the SAN environment.
IT Systems Planning and Design Manager: Responsible for managing the systems
planning and design team.
Server Performance & Capacity: Server and CPU monitoring.
Backup Design: Define the backup strategy/methodology.
Backup Operations: Administer and monitor the backup process.
Disaster Recovery: Define the disaster recovery strategy/methodology.
Administer and monitor the DR process.
Trust Zone Methodology: POC to initiate the TZP process for DPI. Initiate
filling out the documents for TZP.
Trust Zone Placement: POC for determining hardware placement within security
zones based upon architecture needs.
Procurement: Hardware/software purchases.
Platform Security / Vulnerability Scans: Scan the OS to identify any security
vulnerability issues and create baseline metrics.
Platform Services: MAPS (Managed Application Platform Services) – load and
manage the OS and most of the services it provides. Manage and monitor the
hardware deployed by NOS (Solaris/Unix, Windows, Netware/Linux, etc.). Name
the servers and prepare them with the TS requirements for IP address
assignment and firewall rule generation.
Facilities: Responsible for power and AC.
a. Create and describe a comprehensive security policy for this
This is a service model that uses analytical, recursive technology for
elements of data processing through a public ISP. This model of application
is offered on a "pay" basis, but it is controlled wholly by the ISP
subscribers (Kaushik, 2009). Cloud analytics applications and services are
typically data repositories that offer all applications on a centralized
management system. Cloud computing is scalable and less expensive; examples
include the hosted data warehouse, software-as-a-service business
intelligence, and cloud-based social media analytics (Kaushik, 2009).
There is a need to involve remote tools that enhance a diverse collection of
applications, since they serve the purpose of multi-business applications. A
hosted cloud is thus like a centralized repository for enterprise data: an
enterprise resource hosted remotely by the service provider rather than on
the enterprise's own servers (Mirchandani, 2010).
Cloud data collection and analysis tends to reduce cost by ensuring that all
products are enlisted in the cloud data bank, which avoids using the web for
business advertisement. Most transactions are done online, since clients and
suppliers have the authentication needed to access the organization's
supplies department and even finance, so the operating cost falls rapidly
within a year (Kaushik, 2009). Organization growth is normally around 75% in
less than a year. Combining web analytics with operational systems data
allows a one-stop shop for documentation and manipulation of resources; that
is why most business plans attach increasing importance to it as part of a
good marketing plan (Mirchandani, 2010).
Concurrently, the technology offers better business process outsourcing,
since most companies can now work remotely and access the central server
using cloud computing. The company should therefore be ready to find ways to
protect the collected data, while still ensuring that the installed IT
infrastructure will manage the growth the corporation is anticipating
(Mirchandani, 2010).
Most businesses find it important to exploit information, and this becomes
even more important as new flows of precious information emerge, leading to
accelerated data growth. Very high security entails streamlining and
integrating software capable of managing the cloud technology process, laying
emphasis on internal and external security (Kaushik, 2009).
Systems administrators in the company should understand their roles in
executing and implementing the IT systems, to ensure that the project design
initiatives are sustainable (Kaushik, 2009). The organization should identify
and put in place the requisite infrastructure and human capital, and develop
performance-oriented policies to guide the implementation process.
Organizations that use cloud technology are relieved of a significant
workload compared with those that use local computer applications, since
their staff no longer move from one office to another looking for a file or a
resource (Collier, 2011).
Most server-based applications contain a webmail server and central receptor
servers. The only difference is that they utilize resources from a virtual
computer, so the applications run from a server-configured computer
independently. This renders the user's hardware largely irrelevant at work
(Collier, 2011), and it decreases the software demand on the user's side; the
cloud's network segments handle it instead. To run cloud computing interface
software, the user's computer needs only access software as simple as a web
browser, and the cloud's network segmentation takes care of the rest
(Kaushik, 2009).
b. Information privacy is vital to any organization. Data collection and
analysis should not be accessible to every user; users should have privileges
governing what they may access, so that sensitive documents within the
organization, e.g. financial statements and strategic goals, cannot be
accessed by just anybody and leaked to outsiders (Collier, 2011). With a
strict network policy, the organization is able to authenticate both the
source of information and its destination (Collier, 2011).
Logical and Physical Topography
Logical topology, also referred to as signal topology, is the arrangement of
devices on a computer network and how they communicate; it defines how the
systems communicate across the physical topology. Logical topologies are
bound by network protocols, which describe how data moves across the network
infrastructure. A logical topology can be either a shared-media topology,
where all systems can access the physical layout directly, or a token-based
topology, which uses a token to grant access to the physical media. Physical
topology, by contrast, is how devices are interconnected to the network
through the actual cables that transmit data at the physical layer; it
represents the physical layout of the devices on the network (Kaushik, 2009).
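The distinction can be sketched in code. In this hypothetical example (the
device names are invented, not taken from the company's network), the same
physical star, every PC cabled to one switch, carries a token-based logical
ring in which only the current token holder may transmit:

```python
# Physical topology: which device each cable actually connects to (a star).
physical_links = {
    "PC1": ["Switch"],
    "PC2": ["Switch"],
    "PC3": ["Switch"],
    "Switch": ["PC1", "PC2", "PC3"],
}

# Logical (token-based) topology: the order in which stations may transmit,
# independent of the cabling above.
logical_ring = ["PC1", "PC2", "PC3"]

def next_token_holder(current: str) -> str:
    """Pass the token to the next station on the logical ring."""
    i = logical_ring.index(current)
    return logical_ring[(i + 1) % len(logical_ring)]
```

Note that the token path (PC1 → PC2 → PC3 → PC1) never appears in the cabling
table: the two topologies describe the same network at different layers.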
All in all, both logical and physical topology are part of the network
design. The logical and physical topologies have to be considered together in
order to realize a collective responsibility in the analytical system; this
enhances communication between the logical and physical layers, which is
essential to the communication process (Mirchandani, 2010). The main
objective of analyzing the logical and physical system is to enhance the
security of the information system, so that the company and its employees
work with a secure system that is unlikely to allow intruders to compromise
data security within the organization (Mirchandani, 2010).
Through study of the available network security policy, the network managers
and technicians can better understand the kind of security system required to
ensure that the company management system is secured against intruders, and
against members who would maliciously interfere with company data and
information, so that the developed system has the required integrity and
information security (Collier, 2011).
The network requirements and their implications are another factor that
should be drawn from the documentation in order to enforce systematic,
sequential data security. The allocated sequential structure gives the
respective members of the organization specific rights, which vary with the
duties and privileges of the respective users within the organization
(Collier, 2011).
Infrastructure and Security
A network infrastructure is an interconnection of computer systems linked
through parts of a telecommunications architecture. The infrastructure may
refer to the organization of its configuration and its various parts, from
networked computers to routers, cables, wireless access points, switches,
backbones, network protocols, and network access methodologies (Kaushik,
2009). There are two types of infrastructure: open architecture and closed
architecture. Open architecture, usually the internet, can be accessed by all
networks, while closed architecture is a private intranet, mostly within a
particular organization. Either architecture can operate over wired or
wireless network technology (Kaushik, 2009).
The simplest infrastructure can be a mere interconnection of computers
through a hub, which merely links the computers but does not limit data flow
to or from any system. To limit or control access between systems and
regulate information flow, we install an additional device called a switch,
which can enforce the network protocols that define how the systems
communicate (Collier, 2011). To allow network-to-network communication, a
router is required; it bridges the networks and essentially provides a common
language for data exchange according to the rules of each network
(Mirchandani, 2010).
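The hub/switch contrast above can be made concrete with a small sketch. This
is an illustrative model, not the company's equipment: a hub repeats every
frame to all other ports, while a switch consults a learned MAC-address table
and forwards only to the right port, flooding just when the destination is
unknown:

```python
def hub_forward(in_port: int, ports: list[int]) -> list[int]:
    """A hub floods: the frame goes to every port except the one it arrived on."""
    return [p for p in ports if p != in_port]

def switch_forward(in_port: int, dst_mac: str,
                   mac_table: dict[str, int], ports: list[int]) -> list[int]:
    """A switch forwards only to the port learned for the destination MAC."""
    if dst_mac in mac_table:
        return [mac_table[dst_mac]]
    # Unknown destination: fall back to flooding, just like a hub.
    return hub_forward(in_port, ports)
```

This is exactly the sense in which a switch "limits data flow" where a hub
does not: traffic between two systems is no longer visible on every port.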
When setting up a network infrastructure, network security is a primary
concern, so that the organization can achieve the security policy's core
goals of confidentiality, integrity, and availability of information. Some
architectures use routers with built-in firewalls, together with software
that allows finely tuned user-access control, data packet monitoring, and
strictly defined protocols. Security can be further enhanced by controlling
and adjusting the network sharing properties on every single system; this
limits the folders and files that can be accessed by other network users
(Collier, 2011).
d. While accessing data, the cloud is inherently safer than the typical
client-server model of distributing data; with email, for instance, data
tends to remain in the end user's account, where the data packets are largely
uncontrolled. Intrusion detection systems (IDS) placed between the host
server and the external server, using EGP (Exterior Gateway Protocol), have
been the greatest detection agents for malicious behavior in network
communication reaching the organization's mainframes (Collier, 2011). For
distributed capability, an IDS management solution is the technology to go
for, since integration now makes it possible to handle different types of
sensors or collectors distributed within the environment, synthesizing alerts
generated from multiple hosts located within the server (Kaushik, 2009).
Extensibility, efficient management, and compatibility with
virtualization-based environments need to be introduced into many existing
IDS implementations. Additionally, cloud providers need to give their users
the ability to deploy and configure an IDS. Vendors and the organization
itself might risk harm due to improper validation of the central data
(Mirchandani, 2010).
There is always a chunk of information harbored inside the database. Since
cloud computing is command driven, clients are able to access the cloud
services at any time without any breach, leading to a cost reduction over
trunk calling systems, since voice carried over cloud services offers a cheap
alternative to traditional calling and emailing services. Monitoring of the
network is always real time, and traffic filtration takes place once
anomalies have been identified by the Intrusion Detection Systems (IDS)
(Kaushik, 2009).
Potentially suspicious activities are normally identified using the
signatures in the IDS database, so security administrators are quickly
notified of potential risks. Cloud computing also provides the organization
with scale, since it offers key elements like e-mail, word processing, and
video graphics capability. All these functions can be outsourced, depending
on your entire IT infrastructure and the ISP who offers cloud services to
your organization, thereby letting you focus on your core business
activities, especially in the case of a startup (Collier, 2011).
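Signature-based detection, as described above, amounts to comparing traffic
against a database of known-bad patterns and raising an alert on a hit. The
sketch below shows the idea; the signatures and payloads are invented for
illustration and are far simpler than a real IDS ruleset:

```python
# Hypothetical signature database: name -> byte pattern of a known attack.
SIGNATURES = {
    "sql-injection": "' OR 1=1",
    "path-traversal": "../../etc/passwd",
}

def scan_payload(payload: str) -> list[str]:
    """Return the name of every signature that appears in the payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]

def alerts_for(packets: list[str]) -> list[tuple[int, str]]:
    """Collect (packet index, signature name) alerts across a packet stream,
    the kind of synthesized alert list a distributed IDS manager would show."""
    return [(i, sig) for i, p in enumerate(packets) for sig in scan_payload(p)]
```

A real deployment would match compiled patterns against raw packet bytes and
feed alerts to the administrators mentioned above, but the control flow is
the same: match, then notify.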
To avoid an overflow of departmental demand, cloud usage is the way to go for
accessing additional IT resources that fulfill short-term requirements and
long-term objectives. Evaluating and testing nonstandard software can also be
easily achieved with good use of the cloud, which enables organizations to
save on cost. The bottleneck problem normally occurs when all departments
fight for the same scarce resource, leading to a cloud burst (Mirchandani,
2010).
The control of data sited in a specific geographical area in cloud computing
makes the enterprise's sense of security a compromise, since regulations
dictate how the entire structure operates. Some applications might need basic
standards of access to the data, which are applicable internally. Timeouts
and response time are extremely influential: with delays due to piggy-backing
traffic, the system responds in disarray and thereafter times out (Kaushik,
2009).
e. Information Security Risk Management
Establish context
Security must be appropriate for the different levels of the organization,
based on departmental threats and vulnerability. These threats are dynamic,
since they can be generated from any direction, whether internally or probed
from the external environment. The scale-up of the cloud has outright allowed
information sharing. However, numerous targets are created within the cloud,
and malicious users taking advantage of it have created a high level of
insecurity within the work environment (Mirchandani, 2010).
Risk assessment
Security threats result from the exploitation of weak spots within the
network, creating vulnerability. Unaddressed vulnerability generates a high
potential for risk, leaving the organization exposed to competitors who might
take advantage of an attack. Lack of constant Windows updates, for example,
leads to the risk of being attacked (Kaushik, 2009). After a server has been
compromised, there is always network downtime and subsequent loss of data,
which normally costs the organization time and money. Attacks on your network
come in many different varieties (Collier, 2011).
Many times the hackers do not even know whom they are attacking, but there
are also instances of networks or organizations being targeted intentionally.
It is prudent that systems administrators learn the different methods used to
compromise computers and networks; an edge over your enemy gives you the
necessary perspective to proceed. You need to take precautions: establish the
potential threats within your network and block them in your firewall
(Kaushik, 2009).
Creating Security Policies
To understand the security, you need to be part of the policy creation team;
this will help you understand your networks. Without clear boundaries in the
terms of the policy, your database system runs the risk of being threatened.
Security policies provide the path toward the proper way to protect your
network, and technical resources should be used appropriately to enforce the
policies (Collier, 2011).
Information security principles
Constant access to information from the networked server, given the
restrictions of access and authentication, requires a lot of secrecy and
commitment oaths to the organization. The access requirement is on a
need-to-know basis: "the accounts department cannot access the human resource
department, given limitations based on the needs of the user"; otherwise,
privileges can be granted on a need-to-know basis (Collier, 2011). User
control systems can be generated at a central server using Active Directory;
this ensures that names are registered with active keys, which confirm
whether the supplied user name and password correspond to the names enlisted
in the domain.
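The need-to-know rule quoted above is, in effect, an explicit grant table: a
role may reach only the departments it was granted, and everything else is
refused. A minimal sketch, with hypothetical roles and departments (none of
these names come from the document):

```python
# Explicit grants per role; absence of a grant means access is refused.
GRANTS = {
    "accounts_clerk": {"accounts"},
    "hr_officer": {"human_resources"},
    "auditor": {"accounts", "human_resources"},  # wider access, granted explicitly
}

def can_access(role: str, department: str) -> bool:
    """Need-to-know check: allow only departments explicitly granted to the role.
    Unknown roles get an empty grant set and are denied everything."""
    return department in GRANTS.get(role, set())
```

In a real deployment these grants would live in the central directory service
(e.g. Active Directory groups) rather than in a dictionary, but the deny-by-
default check is the same.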
Information resources require a high level of trustworthiness, providing the
user with a high level of unquestionable integrity. Changing data
inconsequentially, deliberately realigning its content without following the
proper protocol, or altering the data before it reaches its intended
destination is regarded as an improper, unethical way of working (Mirchandani,
2010). Validity of the actual data within the systems, corresponding to the
original data from the source, always provides a high level of trust and
reliability. Whether the data was wrongly transmitted or entered without
verification from the source, corrupting that data would be against the
policy, and its integrity would be questionable (Kaushik, 2009).
An information system should be accessible and locally available when needed;
these resources are vital to the end user. An unavailable information system
is not reliable, and it costs the organization time and money. The
infrastructure should also be reliable, to avoid downtime of the network
system that would bring the organization to a standstill (Collier, 2011).
Malfunction of infrastructure and systems on purely technical grounds may
also hamper security, causing harm to the organization's users (Kaushik,
2009).
Prevention vs. detection
Confidentiality, integrity, and availability should focus on both detection
and prevention in order to curb insecurity issues. Many instances in security
technology depend on the detection and prevention time frame: a threat
becomes an issue when it is not realized at an early enough stage to be
circumvented or mitigated (Mirchandani, 2010).
Ethical aspects of users
A code of ethics binds the staff within an organization not to allow their
secret codes to be exposed to the external environment, including other
staff. The server should be configured so that it renews users' passwords at
a set interval of time (Kaushik, 2009).
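Interval-based password renewal can be sketched as a simple age check: the
server records when each password was set and forces a change once a maximum
age has elapsed. The 90-day interval below is an assumed policy value, not
one stated in the document:

```python
from datetime import datetime, timedelta

# Assumed policy value: passwords expire after 90 days.
MAX_PASSWORD_AGE = timedelta(days=90)

def must_renew(password_set_at: datetime, now: datetime) -> bool:
    """True once the password is older than the allowed interval,
    at which point the server should force a change at next login."""
    return now - password_set_at >= MAX_PASSWORD_AGE
```

Directory services such as Active Directory expose this same idea as a
maximum-password-age policy setting rather than application code.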
To enhance security further, the security documents have to be kept in a
secure location so that the company cannot be compromised from within. The
data source is the key analytical model that offers data mining, with data
models being processed by the whole company and the client, so there is great
sharing of resources at the press of a button. All these and similar elements
are implemented in the cloud, which qualifies them as cloud analytics.
The organization should identify the infrastructure requirements to put in
place to manage the cloud technology; more emphasis should be laid on human
capital and platform development, in line with the organization's policies
guiding the implementation process. Cloud computing faces new application
scenarios with the IDS approaches and tends to generate more problems, since
the operator of the IDS should be the end user and not the administrator of
the cloud infrastructure. Therefore, we have an obligation to secure our
networks, as well as the cloud, to keep intruders from accessing the system
databases.
Appendix 1
Diagram 1. The network structure
References
Collier, K. W. (2011). Agile Analytics: A Value-Driven Approach to Business
Intelligence and Data Warehousing. London, UK: Professional.
Kaushik, A. (2009). Web Analytics: An Hour a Day. New York, USA: John Wiley &
Sons.
Mirchandani, V. (2010). The New Polymath: Profiles in Compound-Technology
Innovations (Vol. 2 of Wiley Professional Advisory Services). New York, USA:
John Wiley & Sons.