THE MAGAZINE OF THE BCS SECURITY COMMUNITY
WINTER 2011
www.bcs.org/security
DATA LOSS PREVENTION
How to stop your information from slipping through your hands
EDITORIAL
Henry Tucker Editor-in-Chief
Brian Runciman Publisher
PRODUCTION
Florence Leroy Production Manager
Keep in touch
Contributions are welcome for consideration.
Please email: [email protected]
ISNOW is the quarterly magazine of
BCS Security Community, incorporating the
Information Security Specialist Group.
It can also be viewed online at:
www.bcs.org/isnow
The opinions expressed herein are not
necessarily those of BISL or the organisations
employing the authors.
© 2011 British Informatics Society Limited (BISL).
Registered charity no. 292786.
Copying: Permission to copy for educational
purposes only without fee all or part of this
material is granted provided that the copies are
not made or distributed for direct commercial
advantage; BISL copyright notice and the title of
the publication and its date appear; and notice is
given that copying is by permission of BISL.
To copy otherwise, or to republish, requires
specific permission from the publications
manager at the address below and may require a fee.
Printed in the UK by Interprint,
Swindon, Wiltshire.
ISSN 1752-2455. Volume 6, Part 1.
British Informatics Society Limited
First Floor, Block D, North Star House,
North Star Avenue, Swindon, SN2 1FA, UK.
T +44 (0)1793 417 424
F +44 (0)1793 417 444
www.bcs.org/contactbcs
Incorporated by Royal Charter 1984.
05 ISSG PERSPECTIVE
Gareth Niblett, Chair of the ISSG, gives his view on data loss prevention.
06 ARE YOU A PLUMBER?
DLP may strike fear into the wallets of many, but it’s not that new.
08 HUMAN FACTORS
Preventing data loss is about more than just software, it’s about people too.
10 TAILORING TO THE THREAT
When dealing with DLP you need to
tailor your tools to your threats.
12 NOTHING NEW IN PCI DSS
If you break down the certification
there really isn’t much to be scared of.
14 TESTING TIMES
When it comes to securing data,
penetration tests can be the answer.
16 LEGAL
A look at how changes can improve
compliance with data protection laws.
18 OPINION
'Trust, but verify' is the motto of many security professionals, for good reason.
ISSG PERSPECTIVE
MAGIC SOFTWARE
Gareth Niblett, Chair of the ISSG, says that many people see DLP software as something that
magically stops your data from being lost, whereas the reality is quite different.
DLP is often used as a catch-all term for
technology that somehow magically stops
all unauthorised information flows once it
has been installed.
In reality DLP has much wider
implications in an organisation and can be
quite nuanced. It could be considered as
part of information lifecycle management,
and should focus on ensuring the
organisation can share the information it
needs to, both internally and externally, in
a correct, accountable and secure manner – data loss is then also prevented as a beneficial by-product.

To achieve robust inter-organisational collaboration capabilities, we would need common policies for identity proofing and verification (IPV) of organisations, people and devices, issuance of credentials, authentication and authorisation, so that interoperability can be obtained (for consistency and cost reasons). Add a bridge, to tie together disparate systems and organisations with cross-certification, along with an independent verification process, to ensure assurance is provided to all parties. Mix.
Levels of trust
One leading initiative, which I am involved with, is working towards such federated trust at higher levels of assurance between regulated companies and industries. The
British Business Federation Authority
(BBFA, http://federatedbusiness.org) is a
not-for-profit self-regulating organisation
born out of a government request for a
body to represent the needs of UK industry
in relation to identity management, which
came at a joint BCS/eema seminar in 2009.
The BBFA Steering Group is made up
of companies from regulated industry
sectors and, along with its policy
management authorities, is working
with both private and public sector
organisations towards standards-based and interoperable IPV, strong authentication and authorisation,
federation and PKI bridge policies,
procedures and mechanisms, as
it recognises that without these no
technology can meet the real needs of
customers and end users.
www.bcs-issg.org.uk
FURTHER INFORMATION
Information Risk Management and
Assurance Specialist Group:
www.bcs.org/groups/irma
BCS security portal:
www.bcs.org/security
ISNOW online:
www.bcs.org/forum/isnow
DATA LOSS PREVENTION
ARE YOU A PLUMBER?
Data leakage prevention (DLP) is part of the never-ending task that is information management. DLP is another one of those acronyms that strikes fear into the hearts, and wallets, of IT managers, says Simon Kellow.
‘DLP – new and expensive technology and
difficult to set up, don’t think I’ll bother just
yet!’ This is a bizarre situation, because the
principles and underlying controls which
define DLP are far from new. When did
DLP really come into vogue? If we take a
cold, hard look at data confidentiality then
it would be a fair assertion that one of the
largest advertising campaigns around DLP
was between 1939 and 1945 with the now
iconic posters with the straplines:
Careless talk costs lives
and
Keep mum, she’s not so dumb
IT managers can start to breathe a sigh
of relief that data leakage prevention is a
mature area of information management.
Is DLP about technology? No, it most
certainly is not, although that is not to say
that technology cannot be a part of a well
specified solution.
So what should be considered to
develop a sensible and effective DLP
solution, and be able to demonstrate to
senior management that they aren’t going
to become a tabloid headline? Let’s start
by considering the basics and imagine that
your organisation’s information is stored
within a box. This box is constructed from
many small blocks. Each block represents
an access path to the information within –
a username, a file-share, an IP mapping,
importance to the business, consequences
if it goes to the wrong place, and so on.
Now remove a block to correspond with each access path that you can identify, but be very thorough: no cheating. Look at
your new information store and see how
many blocks are still in place.
With this plumber’s nightmare potentially
leaking information before your eyes,
where do you begin to sort it out? Install a
DLP package quickly? Absolutely not.
What you need to create is an information
flow diagram. A good place to start is by
defining your information classifications,
but don’t forget to keep it simple. A traffic
light system is a good analogy:
green for data that you will share with
anyone (the contents of your public facing
website?), amber for data that needs
controlling but isn’t business sensitive
(the internal phone list with employees’
corporate phone details?), and red for the
data that is business sensitive (merger
and acquisition plans, etc.). Just having the
classifications doesn’t mean a great deal
on its own and you need to define what
the classifications mean, and how they
affect the information that is associated
with them. However, at least you can now
define the parameters of the importance
of your data and what to secure, even if
we haven’t quite got to how it should be
secured yet.
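Before any technology is bought, a scheme like this can be written down as data that later controls consult. Below is a minimal sketch of the traffic light classifications as a policy table; the channel names are illustrative assumptions, not part of any product.

```python
# A sketch of the traffic light scheme as a policy table.
# Channel names are illustrative; a real deployment would load
# these definitions from a managed, change-controlled policy store.
POLICY = {
    "green": {"channels": {"public_web", "intranet", "file_share"}},
    "amber": {"channels": {"intranet", "file_share"}},
    "red":   {"channels": {"restricted_share"}},
}

def is_permitted(classification: str, channel: str) -> bool:
    """True if data of this classification may flow over the channel."""
    rule = POLICY.get(classification)
    return rule is not None and channel in rule["channels"]

print(is_permitted("green", "public_web"))  # True
print(is_permitted("amber", "public_web"))  # False: needs controlling
```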
Classifications
Now that your data classifications are defined, you need to locate all of the information that is used by, and belongs to, your organisation. This might appear to be very straightforward, but is it? Start with the obvious data locations: the databases, SharePoint sites, email servers and corporate file shares. Is that enough? Even with thin client and virtual desktop deployments today, users will always store information in the easiest place for them to access it. Have you got a means to search out all of the information that exists and is relevant? This is the first layer where technology can play a part. You need a solution that can delve into the darkest recesses of user-accessible space and deliver an accurate and timely report on what it finds. You can now place the information that has been located on the network into the appropriate classification.
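As a rough illustration of that first technological layer, the sketch below walks a file tree and reports files whose contents match simple sensitive-data patterns. The root path and the two patterns are assumptions for the example; real discovery tools use far richer detection and cover databases and mailboxes as well as shares.

```python
import csv
import os
import re
import time

# Illustrative patterns only; real products detect far more.
PATTERNS = {
    "card_number": re.compile(rb"\b(?:\d[ -]?){13,16}\b"),
    "m_and_a":     re.compile(rb"merger|acquisition", re.IGNORECASE),
}

def scan(root: str, report_path: str) -> None:
    """Walk user-accessible space and report files matching any pattern."""
    with open(report_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "pattern", "scanned_at"])
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "rb") as f:
                        sample = f.read(1_000_000)  # first 1 MB is enough here
                except OSError:
                    continue  # unreadable file; a real tool would log this
                for label, pattern in PATTERNS.items():
                    if pattern.search(sample):
                        writer.writerow([path, label, time.ctime()])

scan("/srv/shares", "discovery_report.csv")  # hypothetical share root
```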
The people and processes that are going to access the data need to be defined. All the identities you grant access to need to be unique, so that you can track who has accessed what and how they used the information. An important consideration at this stage is how you identify these entities. Are you going to rely on their Active Directory username and password, or does the information
need a higher level of authentication? The
authentication requirements should be
layered onto the information flow diagram
as some methods may require other
access paths to be established.

When adding the permissions to the information flows, do not forget to take into account the type of access that is required. Read only, read/write and access to data are all valid options, but are dependent on the entity that requires access. Effectively, at this stage you have evaluated how much you trust the staff and restricted their access to the information that they require to undertake their jobs. It is another one of the basic principles of security and information management that you have now enacted: least privilege.

The final part of the information flow diagram is to define the ways that the information can be accessed. Green classification data could be saved into the corporate content management system and be accessed from the internet via a web browser. Amber might be restricted to the corporate intranet, whereas red classification data is located on a network area with restricted access. Where data can be accessed by different means, e.g. web browser and file share, these are two separate access paths and must be treated as such.
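Recording each access path as structured data makes the least-privilege review auditable rather than anecdotal. A minimal sketch, with hypothetical identities and channels:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessPath:
    identity: str       # a unique identity, e.g. an AD account
    data_class: str     # "green", "amber" or "red"
    channel: str        # "web", "intranet", "restricted_share", ...
    mode: str           # "read" or "read/write"
    justification: str  # why the path exists; empty if never recorded

def unjustified(paths: list[AccessPath]) -> list[AccessPath]:
    """Everything that is not green must have a recorded justification."""
    return [p for p in paths
            if p.data_class != "green" and not p.justification]

paths = [
    AccessPath("jeff.logistics", "red", "restricted_share", "read", ""),
    AccessPath("hr.team", "red", "restricted_share", "read/write",
               "HR case management"),
]
for p in unjustified(paths):
    print(f"review: {p.identity} -> {p.data_class} via {p.channel}")
```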
Why have access?
You have now established who can access
the information, and what that information
comprises. Now you need to define the
why. This may involve grouping together
users or processes, and will further refine
the information flows. It is worth bearing in
mind at this point that the only
information that you will allow access to
freely is classified green. All other
classifications require a reason for the access path to be justified: does Jeff in
logistics need access to the HR database?
Thinking back to the information box
of many blocks, how many blocks have
been removed now that you have revised
your access paths? If you have carried
out the steps methodically and thoroughly
then the only blocks that you have now
removed are the ones that absolutely must
be removed. You have also undertaken
an audit of your data leakage potential,
because you now understand where your
data is being accessed and processed and
by whom.
It is only now that you can reasonably
start to think about DLP technologies and
tackle that leak properly.
For more articles go online to:
www.bcs.org/articles
DATA LOSS PREVENTION
HUMAN FACTORS
Preventing data loss is about more than just software, it's also about people, says M. Angela Sasse from
University College London.
Over the past five years, news stories
about data loss have appeared with
depressing regularity. The cases have
occurred in both government and private
sector organisations. Major breaches in
the public sector have been investigated in
some detail - most notably the loss of two CDs with the child benefit data of 25 million citizens by HMRC (reviewed by Kieran Poynter[1]), and of an MoD laptop containing 600,000 people's personal data (reviewed by Sir Edmund Burton in 2008)[2].
Both reports identify widespread use
of insecure methods of data storage and
transfer as the main problem. Based
on the insights from those reports,
recommendations for improving data
handling in government have been drawn
up[3]. The main recommendations are the
mandatory use of protective measures,
such as encryption, and that employees
should be held accountable for complying
with the policies that prescribe their use.
These measures may seem the logical
course of action for organisations trying
to prevent data loss. But given that these
protective measures were available when
the incidents occurred, one has to ask why they were not being used.
The Poynter report hints at the main
reason for this: employees were prioritising
‘other considerations above information
security risk concerns’. Contrary to the
view that some security professionals
have of ordinary users, those ‘other
considerations’ are not laziness or couldn’tcare-less attitudes - it is the organisation’s
main business.
Employees are evaluated on how well
they perform on their primary tasks, so
that is what their attention is focused on,
and what drives their decisions. Security
and safety are what human factors people
call secondary, or enabling tasks; if you
want them to be performed well, you have
to design them in a way that fits in with users'
primary tasks.
Employees will not comply with
security measures that consume physical
or mental effort they need for their
primary tasks, disrupt them at a critical
point, or slow them down. I first pointed
this out in a manifesto paper on usable
security[4], but I have seen little change
in the way that organisations implement
security measures. Mechanisms such as
authentication or access control are still
deployed with little or no concern of how
they fit within employees’ primary tasks,
and nobody seems to track what the total
workload associated with an increasing
number of policies and mechanisms is.
Over the past decade, I have carried
out many surveys with employees in both
private and public sector companies about
why they do not comply with security
policies; ‘because I cannot cope’ (for
instance, with the number and complexity
of passwords) and ‘because it gets in the
way of my work’ are the two most frequent
responses.
Leaving employees to resolve conflicting
demands of their primary work and
security has one predictable outcome:
focussed on getting their work done,
employees will find ways of bypassing
policies and mechanisms. Traditionally,
organisations have used sanctions and
security awareness campaigns to achieve
compliance, but these have little long-term impact if the fundamental productivity conflict remains.
So how might mandatory encryption
to prevent data loss get in the way of
employees’ tasks? In a study we carried
out as part of the Trust Economics project
in 2008[5], we found that the additional
time required to encrypt and decrypt
data is a concern, as is the physical effort
required to create, memorise, and enter
‘yet another’ password or passphrase
to decrypt files: ‘There is a cost to [USB
encryption] - it wastes my time.’
The perceived reduction in productivity
made some interviewees feel entitled
to break the policy, while others
acknowledged the risk to the organisation,
and the consequences that a breach
of confidentiality would have. A bigger
‘deterrent’ is the fear of not being able to
access data. In some scenarios, the fear of
losing data permanently is a concern.
‘I just feel a lack of control, because
most of the time the threat is unavailability
created by the encryption system.’
Several participants expressed a
fundamental unwillingness to rely on a
technology they did not understand.
‘I know very few people who run
encrypted file systems on a laptop...
because they don’t trust the file system.
They want their data to be accessible.’
In most scenarios, of course, there is no
real risk of permanent data loss, because
the data should still be available on the
organisation’s system. But in the context
of everyday business, there is a legitimate
fear of not being able to access the data for
critical tasks.
‘Can you imagine trying to do a presentation
– and then you can’t open the file?’
If a sales presentation cannot be shown,
both the employee and the organisation
risk immediate loss of opportunity,
longer-term embarrassment and impact
on their reputation.
‘You will forever be remembered as
the guy who blew the XYZ deal because
he couldn’t access the file to show the
projections.’
The government recommendations
suggest that organisations can achieve
better levels of compliance by developing
‘a culture that properly values, protects
and uses data.’
Communicating the value and
importance of data to employees
is important to influence their risk
perception in the right direction, but those
efforts will not be effective if the way it
is implemented gets in the way of the
organisation’s business and employees’
tasks.
Effective encryption must be
implemented and deployed in a
way that causes minimal delay in
access to data, and does not burden
employees with additional difficult-to-use credentials (remembering and typing long passphrases is a recipe
for non-compliance). Organisations
should consider where in their business
processes access to protected files will
occur, and consider real-time recovery
procedures when employees need access.
References
1. Kieran Poynter: Review of Information Security at HM Revenue and Customs. HMRC, June 2008.
2. Sir Edmund Burton: Report into the Loss of MoD Personal Data. April 2008.
3. Data Handling Procedures in Government: Final Report. June 2008.
4. M. A. Sasse, S. Brostoff and D. Weirich: Transforming the 'weakest link': a human-computer interaction approach to usable and effective security. BT Technology Journal, 19(3), July 2001, pp. 122-131.
5. A. Beautement, M. A. Sasse and M. Wonham: The Compliance Budget: managing security behaviour in organisations. Proceedings of the 2008 Workshop on New Security Paradigms, ACM, Lake Tahoe, California, USA, pp. 47-58.
For more security articles go online to:
www.bcs.org/articles
DATA LOSS PREVENTION
TAILORING TO THE THREAT
When it comes to preventing data loss, Tim Freestone says you need to match your tools to the threats.
What would the impact be to your
organisation if it lost the personal details of
all of its customers or lost critical intellectual
property to a competitor? Most mature
companies have a disaster recovery plan to
handle the integrity and availability
concerns of data loss, but does the plan
cover confidentiality and how will your
business react if sensitive data leaks out?
Do you have security controls to minimise
this risk?
As for any information assurance
process, before putting in measures
to prevent data loss of this type, it is
important to assess the risk in terms
of threats and vulnerabilities, and the
impact should a data leakage event occur.
The UK National Technical Authority for Information Assurance, CESG, defines business impact levels for various sectors of the UK to help quantify the impact that a compromise of confidentiality, integrity, or availability may have on an organisation[1]. The threat profile for sensitive data is multi-faceted, combining internal, external, malicious and accidental factors, and there are many controls to consider.
The accidental player
How often have you been drafting an
email, typed a name or part of a name into
the address box only for the mail client
to helpfully auto-complete the address
for you? The trouble is, we all know two
Johns: one who works for the company
and one who is external. Did you send the
latest company financial data or designs
for a new widget to the correct John? Data
leakage can be that simple and quick.
Well-meaning, committed employees, often working beyond the call of duty, may
email data to their home accounts so that
they can carry on working from home.
Unfortunately, their home computers don’t
fall under the company policy and so may
not be as well protected, thus increasing
the risk of leakage.
Although well implemented policies,
procedures and training can go a long way
to preventing this sort of inadvertent data
leakage, research into the actions of users
when handling information indicates that,
although they have knowledge of company
policies, they may actively ignore them to
get the job done when under pressure.
There are also some useful client
technologies and network-based DLP
products to help ensure the user is really
doing what they mean to do. Research into user actions also shows that monitoring may be more appropriate than barriers. Where barriers are in place, users are less likely to consider the implications of transferring data and, instead, put their trust in the technology. Where such controls do not exist, and the responsibility, and potentially the consequences, lie with the individual, they are likely to make more considered decisions.
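A client-side check of the kind described can be very simple. The sketch below, assuming a hypothetical corporate domain, lists the recipients a mail client might warn about before sending:

```python
INTERNAL_DOMAINS = {"example.co.uk"}  # assumption: the corporate domain

def external_recipients(recipients: list[str]) -> list[str]:
    """Addresses outside the corporate domain; a client could use this
    to prompt 'did you mean the internal John?' before sending."""
    return [r for r in recipients
            if r.rsplit("@", 1)[-1].lower() not in INTERNAL_DOMAINS]

print(external_recipients(
    ["john.smith@example.co.uk", "john.smyth@partner.com"]))
# ['john.smyth@partner.com']
```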
The malicious protagonist
A more difficult problem to solve is the
intentional disclosure of information by an
employee. Although controls can be put in
place, a determined individual will find a
way to get data out. Can you stop someone
taking a photo with their own personal
phone or other personal device and walking out of the door with it? Or copying files
onto removable media and posting it to a
public blog or wiki site?
Pre-employment and periodic employee
checks can help, as can traditional
security concepts such as separation of
duties. Technical solutions that allow deep
inspection of data before it is transferred
out of an organisation are starting to get
more comprehensive, but have a complex
task coping with the multiple different
ways of hiding data inside legitimate
document formats. For example, data
can be hidden in common file types such
as Microsoft Office in numerous ways
including unused space in the document
format, in metadata, in hidden cells, inside
images (steganography), in embedded
objects or simply by using the same
background and foreground text colour.
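Deep inspection is hard precisely because of that variety of hiding places, but even crude heuristics convey the idea. The sketch below peeks inside a .docx, which is simply a zip archive of XML parts, for two of the tricks mentioned; the file name and the checks are illustrative, not any product's actual logic.

```python
import re
import zipfile

def suspicious_docx(path: str) -> list[str]:
    """Flag two simple hiding tricks inside a .docx container."""
    findings = []
    with zipfile.ZipFile(path) as z:
        names = set(z.namelist())
        if "word/document.xml" in names:
            xml = z.read("word/document.xml")
            # White text on a white page carries an explicit colour value.
            if re.search(rb'w:color w:val="FFFFFF"', xml):
                findings.append("white (invisible) text runs")
        # Parts that commonly carry forgotten comments or metadata.
        for part in ("word/comments.xml", "docProps/custom.xml"):
            if part in names:
                findings.append(f"part worth reviewing: {part}")
    return findings

print(suspicious_docx("quarterly_figures.docx"))  # hypothetical file
```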
The implementation of a security labelling
policy to protectively mark information can
help prevent accidental data leakage, but
someone who wants to deliberately get
information out could just cut and paste
the text into a new document without the
appropriate label. Fortunately, technology
is now emerging to track documents that
look like previously known sensitive ones
and this could go some way to mitigating
that risk.
The hostile factor
In many ways the extent of the outsider
threat is not fully known. E-crime is on the
increase and it is usually public knowledge
when customer data has been stolen. We
do not, however, always find out when
other sensitive data has been leaked from
an organisation, for example
intellectual property. Often it is not in the
best interests of an organisation to divulge
the details of such an attack.
Advanced persistent threats (APTs)
are targeted at specific individuals or
organisations using multiple forms of
attack. These often start with social
engineering, for example spear phishing,
to manipulate a well-meaning insider into inadvertently providing assistance in laying the groundwork for a security
attack. The advanced nature of the threat
comes from the fact that the attackers
may have considerable multi-faceted
resources at their disposal. This provides
the opportunity to exploit vulnerabilities
in systems that may not yet be publicly
known (zero-day attacks) and so to bypass signature-based malware checkers.
If successful, the attacker remains
persistent, below the radar of detection on
the network, enabling them to exfiltrate
data at will.
Training in defence against social
engineering attacks is vital, although the
attacks are becoming so targeted and
believable that even the most diligent of
users can be tricked. Alongside standard
technical controls such as firewalls and
anti-virus software, preventing data
leakage from the outsider threat requires
an awareness of the current state of
the networks. Intrusion detection and
prevention monitoring tools can help to
identify when an attack is taking place
as can deep inspection of data leaving
the organisation. Finally, segregation of
networks can be used to manage data
flows into a secure network allowing
the most sensitive data to be isolated,
reducing the leakage risk.
With these different threat factors and
methods in existence, putting in place
an effective plan to handle data leakage
is paramount in order to secure the
confidentiality of key information in an
organisation. A single solution is unlikely
to be able to mitigate all threats and
multiple controls are likely to be needed
based on the threats to and vulnerabilities
of existing systems and the impact that a
loss might have on an organisation.
Reference:
1. www.cesg.gov.uk/policy_technologies/policy/media/business_impact_tables.pdf
For more articles go online to:
www.bcs.org/security
DATA LOSS PREVENTION
NOTHING NEW IN PCI DSS
The past couple of years have seen the emergence of the Payment Card Industry Data Security Standard
(PCI DSS). Much has been written about it, much has been spent on it and many myths surround its scope,
content, legal position and effectiveness, says James McDougall MBCS CITP CISA.
The PCI standard was developed by a
consortium of credit card issuers,
including MasterCard and Visa, to provide
best practice for securing IT systems and
establishing processes for the use,
storage, and transmission of credit card
data in electronic commerce.
There is nothing new in PCI DSS
Controversial statement? I believe not. Let’s
look at the six core PCI DSS milestones,
which should be familiar to anyone who
has a modicum of IT security knowledge.
Note: each milestone has two requirements,
giving 12 in total.
1. build and maintain a secure
network;
2. protect (cardholder) data;
3. maintain a vulnerability program;
4. implement strong access control
measures;
5. regularly monitor and test
networks;
6. maintain an information security
policy.
My apologies for adding the brackets
around (cardholder) in milestone 2. If you replace it with the words 'anything sensitive and potentially damaging to the
organisation’ you would have a standard
DLP/information security framework.
Nothing more, nothing less!
Taking this to its logical conclusion
I would argue that, as most large
organisations report having robust IT
governance, then achieving PCI DSS
certification should be a simple exercise.
Why then is so much money
being spent on PCI DSS solutions and
consultants? There are only three possible
answers:
1. No one understands PCI DSS and
organisations are frightened of
the repercussions of not being
compliant.
2. There are a lot of organisations that
do not currently have good DLP/
information security systems in
place.
3. PCI DSS is seen as a business
requirement so astute CIOs are
therefore able to leverage additional
budget for solutions they would
always have liked to implement but
that have remained on their wish
list.
Maybe the truth lies somewhere in
the middle, which is excellent news for
consultants like myself who are currently
making a living from the standard. So
let’s consider the six main categories that
make up the standard and where the
mystique emanates from.
Milestones
The first milestone goal is to secure
the routing of data into and out of the
organisation by building and maintaining a
secure network.
This is to be achieved by the installation
and management of effective firewalls and
by removing vendor default accounts. Any organisation with an internal governance department has surely been plagued by its internal and external IT auditors to have these in place already?

The news, which may come as a relief to some, is that most will be a long way off complying with milestone one. In my experience the problem is not technical compliance, but the lack of robust documentation managed under a robust change control process.

The second milestone is about protecting transmitted and stored cardholder data. I am going to leave this for the moment, but will return to it later.

Milestone three, titled 'maintain a vulnerability program', begins by asking us to ensure we have anti-virus systems deployed. I personally disagree with this description as I think it is misleading; in my opinion it should be referred to as anti-malware. Again, most organisations (unless Apple-centric) are a long way off meeting this requirement. The milestone then requires building and maintaining a software development life cycle (SDLC). This should encompass common industry secure application development standards such as OWASP. As with milestone one, my experience shows that this is often in place, but is let down by poor documentation and change/release processes.
Restricting availability
The basic tenet of milestone four is restricting availability to only those who should have access to the cardholder data. This should once again be in place, given the staple mantra of the IT auditor: CIA, data confidentiality, integrity and availability.
Milestone five begins by discussing
the physical restriction of access to data,
the bricks and mortar protective layer
(personally I think that this sits more
naturally in milestone four). The milestone
is titled ‘regularly test and monitor
networks’, which asks the organisation to
ensure it captures, monitors and maintains
information about access or attempted
access to the actual credit card data. This
is an area where the astute CIO is often able to elicit additional budget to purchase
third party network monitoring tools, file
integrity systems and intrusion detection
devices.
Milestone six tasks the organisation
to ensure that it has regular application
and penetration testing programs. The
milestone then finally asks that everything
is covered and enshrined in policies and
procedures.
I think you will agree that there is nothing difficult here and nothing new in the standard.
If you have not started on your PCI
DSS path, I am sure you would now
be comfortable to tell your executive
management that it will be easy, that most
of it is in place, that the organisation might
even actually already be compliant. Why
then is there such an issue in achieving
compliance?
Let me come back to Milestone two. To
me this is the fundamental reason why
there is so much difficulty in achieving the
standard. Applications that are used to
capture credit card data are intrinsically
linked to a plethora of ancillary systems
and tools that manage everything
from credit control, refunds, recurring
payments, loyalty schemes and partner
organisations.
The standard solutions of removing,
tokenising or encrypting the cardholder
data at first seem impossible. The costs, in both financial and resource terms, seem prohibitive.
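For readers unfamiliar with tokenisation, the core idea fits in a few lines: the card number (PAN) is swapped for a meaningless token, and only a vault can map it back. The sketch below uses a toy in-memory dictionary as the vault; a real vault is a hardened, audited service, and that difference is exactly where the cost lies.

```python
import secrets

_vault: dict[str, str] = {}  # toy stand-in for a hardened token vault

def tokenise(pan: str) -> str:
    """Store the PAN in the vault and hand back an opaque token."""
    token = "tok_" + secrets.token_hex(8)
    _vault[token] = pan
    return token

def detokenise(token: str) -> str:
    """Only systems with vault access ever see the real PAN."""
    return _vault[token]

t = tokenise("4111111111111111")  # test PAN, not a real card
print(t)                          # e.g. tok_9f2c1a...
print(detokenise(t)[-4:])         # 1111
```

Ancillary systems that only need to match or display a card can then hold the token, pulling the scope of the standard back towards the vault and the capture points.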
Political constraints
This is where the external consultant or a
good QSA can earn their fee. Free from the
internal political constraints, their ability
to see through this fog of war and to make
some bold recommendations can save
the organisation some major headaches. I
use the term fog of war deliberately as PCI
DSS is often a battleground between IT,
finance, HR, facilities management, sales
and marketing, learning and
development, credit control and a multitude
of others. Control and spending of budgets,
changes to long-established processes,
use of scarce departmental resources all
contribute to a war of attrition that results
in many, if not most, organisations over-complicating the project.
Remember PCI DSS is nothing new.
It is basic data loss prevention, which is
something I am sure the CEO thought the
organisation already had in place.
For more articles go online to:
www.bcs.org/articles
DATA LOSS PREVENTION
TESTING TIMES
George Thomas explains how a good penetration test can help when it comes to data loss prevention.
Earlier this year one of my retail clients
asked me to assist in achieving PCI
certification. They were of the opinion that
they could achieve this with little effort as
they had already outsourced most of the
relevant processes. They were told this by
a company that was looking to sell them
new hardware and systems. I suggested
that the first step should be to undertake
a penetration test. They agreed with the
proposal. I thought this was a very positive
move and also a brave one given their
previous work on IT security. Having had
some visibility of how these assignments
were undertaken, I set about producing a
plan for its completion. Knowing my own
limitations, the plan was based on bringing
in a third party specialist firm.
I spoke with a number of firms, who
suggested a variety of options for my
client. One of the difficulties that I initially
faced was being able to obtain any client
references for work previously undertaken.
The work that the pen test teams do is
quite obviously commercial in confidence
and I do not know of many CEOs who
would be happy to have their system
weaknesses publicised. My solution to
this was to use my network of friends and
previous colleagues to reduce the shortlist
to those who had done similar work
elsewhere.
I agreed the scope of the assignment
with the CEO; it focussed on the externally
hosted internet-facing website, office
servers, workstations and infrastructure
located at the client’s premises and the
ePOS terminals at their many stores.
The objectives of the work were
to identify, assess and report on the
vulnerabilities presented by the client’s IT.
This would enable the client to understand
whether there were weaknesses in the IT
security that could be exploited by external
parties or an insider to compromise
information stored and processed by these
systems or the underlying infrastructure.
It was agreed that we should include
the CIO in the plan. I had concerns about
this, but they were allayed by the honesty
and openness of the CIO. During the
assignment I had a tremendous amount of respect for the CIO, as they put aside concerns about us identifying any potential failings that, one could argue, should already have been mitigated.

I have experienced many memorable events in my career. First there was the IT disaster recovery project (due to a fire): the good news was the existence of a disaster recovery plan, but the bad news was that it was in the computer room that was reduced to a pile of ash. Then there was the time I was helicoptered out to a ship in the Mediterranean to look at weapon system control; and the time I was sent by train during the Cold War era across East Germany to Berlin, taking an illegal sight-seeing trip over to East Berlin. All of these events were memorable, as was speaking to business heads in Khartoum recently about IT controls. However, nothing could have prepared me for the findings of the pen test and the speed at which they were produced.

The selected security firm had no prior knowledge of the infrastructure or systems that were in use at the client's premises, and they commenced the assignment by reviewing the website. The findings for that part of the assignment were not significant, as no business transactions were handled online, but concerns were raised about potential reputational damage and general access to the site.
Written authority
The main findings came as a result of the
tester gaining unauthorised access to the
client’s site (with written authority from the
CEO in their pocket). They then connected
a laptop to a network port provided on the
ground floor of the client’s offices. With this
connectivity and zero knowledge of either
network configuration or usernames and
passwords, the tester was able to discover
and map all the client’s information systems.
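For readers wondering what zero-knowledge discovery from a spare network port can look like, here is a deliberately crude sketch: a TCP connect sweep of an assumed subnet against a few common Windows ports. Real testers use far better tooling, and such scans must only ever be run with the kind of written authority the tester here carried.

```python
import socket

def sweep(prefix: str = "192.168.1.", ports=(135, 139, 445, 3389)):
    """Try a TCP connect to each address:port pair; map what answers."""
    live = {}
    for host in range(1, 255):
        addr = prefix + str(host)
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(0.2)
                if s.connect_ex((addr, port)) == 0:  # 0 means connected
                    open_ports.append(port)
        if open_ports:
            live[addr] = open_ports
    return live

print(sweep())  # subnet and ports are assumptions for illustration
```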
Because of numerous vulnerabilities present in the software on the client's hosts, some significant in their impact, they were able to gain administrative rights to every one of these hosts, both servers and workstations. They then retrieved the password files and ascertained all the usernames and passwords.

Armed with this information they had full access to both the office environment and the hosts located in all of the stores, including both the management workstations at the stores and the ePOS terminals. The tester demonstrated that this was possible not only from the client's office location, but also from the client's shop premises, using the ePOS terminals present in the shops. They were also able to access a significant number of credit card details. Although this assignment was planned to last for the week, the firm that we used was able to identify these weaknesses in less than four hours.

The report that was produced included details on solutions to all of the
weaknesses, and the investigating firm was quite clear in stating that they had no interest in doing the remedial work. This
was made clear at the outset and was one
of the main reasons that the CEO and I
decided to use their services. Having been
a consultant for 20 years I am aware that
the conflict of interest in many consulting
assignments becomes a very muddy pool
of water to get into. Often you wish to help
clients, but you have to maintain your
independence whilst still providing facts
that are not always what clients want to
hear. The client in this case was given a
rude awakening, but in a very professional
manner. A big pat on the back to the IT
security industry, a first class job done
with tact and discretion.
For more articles go online to:
www.bcs.org/articles
LEGAL
KEEP IT SIMPLE
Charlotte Walker-Osborn, Partner, and Dave Hughes, Solicitor,
from Eversheds LLP, recap on some key ways to improve
compliance with UK data protection laws.
The obligation to ensure that personal data
is held securely, through ‘appropriate
technical and organisational’ means,
has been set out in the Seventh Data
Protection Principle of the Data Protection
Act 1998 (DPA) for over 11 years. However,
the news releases section of the website of
the Information Commissioner’s Office (ICO)
(www.ico.gov.uk) continues to provide regular
examples of data loss, such as:
• Walsall residents' details dumped in skip (09/09/11);
• student at Manchester hospital lost patient details (07/09/11);
• children's case files found in second-hand furniture shop (02/09/11).
Whilst damage to reputation is a
common result of security breaches,
the ICO also has fining powers (up to
£500,000) where serious breaches of the
DPA occur. No custodial sentences are
currently available, but with the Information
Commissioner stating this month that ‘it
beggars belief’ that such sentences are not
available, it seems the question is when
and not if this position will change.
Improving compliance: key issues
The incidents of data loss highlighted
above are indicative of common issues
which arise across both the public and
private sectors. We have set out below
some simple steps that could have
helped avoid these recent incidents.
These are often overlooked as shown
above.
1) Active compliance
It is a requirement that every organisation
processing personal information for its
own purposes (a data controller) has
appropriate procedures in place to keep
that information secure. However, it is not
enough to have robust security measures,
without a data protection policy providing
guidance on the implementation of those
security measures. A data protection
policy itself is not a legal requirement,
but this is the first document the ICO will
ask to see when an incidence of data loss
occurs, and it will be used to evaluate the accountability of the responsible data controller. Without it, an organisation will struggle to prove that staff are aware of the appropriate procedures.

In addition, what is appropriate for any data controller changes as time passes and the needs and practices of the business change. Data controllers should review and update their security measures and data protection policy on a regular basis to ensure they are up to date and accurately reflect how individuals should be dealing with personal data on a practical day-to-day basis, and the consequences of failing to do so.
2) Contracts with third party service
providers
It is also a requirement that if a data
controller provides personal information
to a service provider (e.g. payroll) using
personal data on its behalf, it must have a
contract in place with that service provider
containing the mandatory provisions as
a minimum. Organisations should check
their external contracts to ensure the
requirements of the DPA are being met.
A contract of this type is also the perfect
place to contractually oblige the recipient
to comply with the data protection policy of
the data controller, as the data controller is
legally responsible for any breaches of the
DPA caused by the recipient.
3) Awareness training
Having a good policy and effective
contracts will offer few practical benefits
if staff, agents and sub-contractors are not
provided with awareness training to develop
their understanding.
An organisation must be able to show
both lawful and effective procedures and
awareness of staff on how to properly
implement those procedures.
By taking just these small steps an
organisation can improve and maintain
standards of data processing carried out
on its behalf (by staff, agents and
sub-contractors) and reduce the risk of
data loss (and, importantly, any other
breach of the DPA) occurring in the future.
Please note that the information provided
above is for general information purposes
only and should not be relied upon as a
detailed legal source.
www.bcs.org/legal
REVIEWS

CISSP: Certified Information
Systems Security
Professional Study Guide
(5th ed)
James M. Stewart, Ed Tittel,
Mike Chapple
Wiley
ISBN 978-0-470-94498-1
£39.99
7/10
Clearly the review criteria
for such a book are
different to others: this has to
be judged on how well it prepares the audience for the CISSP certification.
This is a huge book with a wealth of
learning resources all packed into about
800 pages. The breadth of topics covered
here effectively addresses the necessary
curriculum.
This book offers a structured learning
process, as it should, with a set of review
questions at the end of each of its 19 chapters; each chapter also starts with a summary of the topics covered.

The diagrams provided aid the clarity of the concepts conveyed. I am afraid I would have to criticise the lack of sub-numbering of headings, which would have been immensely helpful for the purposes of cross-referencing.

I do find the text of the book highly readable and accessible; language is important here for a readership coming from a variety of backgrounds and it is well served. At less than £40, the book provides good value.
Siraj A. Shaikh MBCS CITP CSci

Surveillance or Security? The Risks Posed by New Wiretapping Technologies
Susan Landau
MIT Press
ISBN 978-0-262-01530-1
£20.95
7/10

The subject of this book is most relevant in our times, given the increase in surveillance and the privacy concerns being raised over increasingly diverse and pervasive media.

The author's argument could not be more compelling as she argues that designs and tools built for one purpose have the potential to be abused for a different purpose.

Practical problems
This book is a series of essays, one following from another, tackling different subjects and building up towards the various conceptual and practical problems that are faced in protecting modern communications.

Chapter 2 provides a very valuable start as it gives a brief yet insightful history of communications and how they have
evolved over the years.
The following chapters
deal with the nature
of threats that modern
networks face and the
relevant legal, political
and policy challenges
that arise.
On page 195, the
author mentions Murphy’s
Law, ‘Anything that can
go wrong will go wrong’,
to emphasise that the
case is no different with
communication interception. Though
hidden away, this is the central message of
the book.
I find the book highly readable. The
simplicity of the message is conveyed
clearly. The wealth of references,
compiled at the end of the book, is a very
valuable resource for any researcher or
technologist interested in this area.
I recommend this read to all students
of network security. I also find the
language and use of terminology generally
accessible throughout the book.
Siraj A. Shaikh MBCS CITP CSci
Gray Hat
Hacking: The
Ethical Hacker’s
Handbook
(3rd ed)
Allen Harper et al
McGraw-Hill
ISBN 978-0-07-174255-9
£44.99
10/10
The handbook covers all aspects of
ethical hacking, including the
non-technical areas, with chapters
on social engineering and physical
security.
There are also a couple of sections
on the soft skills involved in security
auditing, which cover managing a
penetration test, including managing
user expectations and not exceeding
legal and contractual limits.
Security auditing and penetration
testing relies heavily on tools that
simplify the exploitation of common
issues and the network exploitation
chapters discuss their usage. The
authors have wisely chosen to
concentrate on Cain and MetaSploit,
two tools noted for their effectiveness,
but which can also take some getting
used to.
Further chapters include an
introduction to coding, both to
help the reader understand later
chapters on developing exploits and
analysing malware and to promote
the important concept that a security
consultant who is able to code can
operate much more effectively and
with greater understanding.
A good overview of testing and auditing SCADA systems is provided
and it was refreshing to see some
space devoted to an area that is not
widely understood or discussed.
Such a wide range of coverage
means that some areas, such as
web applications, get a less detailed
treatment. Having said that, the book
remains the definitive introduction to
penetration testing.
Nick Dunn
Winter 2011 ISNOW
17
OPINION
WHO DO YOU TRUST?
Rather like Asimov’s three laws of robotics, the information security specialist has
‘confidentiality, integrity, availability and compliance’, says John Mitchell.
Confidentiality is all about ensuring that only those people who should have access do have it, and compliance relates to the need to meet the associated regulatory framework. So losing data is likely to break the first and fourth rules of security.
However, as the data is usually copied
it is not ‘lost’ in the accepted sense, but
rather distributed to a wider audience than
intended, with the original owner none the wiser to the breach until something else
happens. This may be a whistle blower,
blackmail, or the use of the data for
another purpose which then puts it, either
accidentally (emailed to the wrong person) or deliberately (WikiLeaks), into the public
domain.
In the UK there is a statutory
responsibility to report breaches to the
Information Commissioner and a search
of the ICO’s website revealed that in the
period from November 2007 to May 2010
1,000 breaches were reported. Since April
2010 the ICO can order organisations to
pay up to £500,000 as a penalty for serious
breaches of the Data Protection Act.
Research by Gordon & Loeb indicates that
a small increase in security expenditure
provides an exponential gain in the level
of security. So security officers should be
able to justify such expenditure with a risk
analysis of the consequences of data loss,
with a half million pounds fine being one
of them.
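For readers who want the precise result behind that claim: under the assumptions of Gordon and Loeb's 2002 model, the optimal amount z* to invest in protecting an information set is bounded by the vulnerability v and the potential loss L:

```latex
z^{*}(v) \le \frac{1}{e}\, vL \approx 0.37\, vL
```

In other words, on their model it is never economic to spend more than roughly 37 per cent of the expected loss on protection, which gives a concrete ceiling to set beside a potential half-million-pound penalty.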
However, in my experience the so-called controls that are meant to reduce either the likelihood or the consequence do no such thing, and even where they do they usually
only relate to the risks being managed by
the CIO. Once the data is transferred to
the end-user, then most of the controls
implemented by the CIO’s office become
redundant. Research by a major European
bank (which must remain anonymous)
showed that two-thirds of IT expenditure
was outside of the CIO’s domain and
therefore effectively outside the control of
the CSO (who usually reports to the CIO).
So it is not too surprising that
many data losses are caused through
negligence of the end-user rather than
by sophisticated external attacks. Trust
is not a control mechanism, but rather
a lazy approach to security. Both Nick Leeson (Barings Bank) and John Rusnak (Allied Irish Bank) were trusted individuals
and yet the first managed to destroy a
bank whilst the second made the bank’s
management look foolish. Multiply the
potential risk of malpractice by the
number of end-users who have access to
sensitive data and you begin to appreciate
why lazy senior management prefer to
repeat the 'you have to trust someone' mantra whenever they are embarrassed
by a data loss. I would not mind if I could
at least see the risk clearly identified on
the risk register with the ‘tolerate’ decision
box ticked to indicate that they have
made a management decision to tolerate
the loss. I have never seen this because
it would embarrass them to have it so
clearly recorded. Much better to come
out with the trust paradigm after the loss
has occurred. This, of course, is really the
fault of the timid risk manager who is too
frightened of senior management to tell
them of the omission.
We auditors have our motto of ‘trust but
verify’, which means that we don’t trust
you until we have verified that the controls
really do manage the risk. Sadly, they
are often woefully deficient even for the
identified risks, but are totally absent if the
risk has not been identified at all.
Reference:
www.rhsmith.umd.edu/faculty/lgordon/Gordon%20Loeb%20Model%20cybersecurity.htm
For additional articles please visit:
www.bcs.org/articles