
TODAY’S CIO
Spring/Summer 2015
Do I really need
to worry about
data centres?
Ahead in the cloud:
how to keep your
data secure
Sponsored by
Publisher
Tim Murphy
Operations Director
Tony McGuin
Finance Director
John O’Shea
Sales Team
Alvin Campbell, Graham Gallant,
Patrick Johns, Karan Kohli, Mark Phillips
Production
Rob Lowry
Rhodri Hughes
©Sparta Publishing Ltd, April 2015. All rights reserved. No part of this
publication may be used, reproduced, stored in an information retrieval
system or transmitted in any manner whatsoever without the express
written permission of Sparta Publishing Ltd. All of the articles in this
publication have been supplied by contributors, and the publisher cannot
give any warranty, express or implied, as to the accuracy of the articles,
or for any errors, omissions or mis-statements, negligent or otherwise,
relating thereto. Accordingly, the publishers shall not be liable for any
direct, indirect or consequential loss or damage suffered by any person
as a result of relying on any statement in or omission from these articles.
Opinions expressed in articles are not necessarily the opinions of the
publisher.
www.spartapublishing.co.uk
Tel: +44 (0)20 7970 5690
40 Bowling Green Lane, London EC1R 0NE
Contents

Foreword
7   Is the role of the CIO expanding into other fields?
    Sparta Publishing Ltd

Big Data
9   Do I really need to worry about data centres?
    Simon Campbell-Whyte, Data Centre Alliance
12  Clarity in a data-driven world
    Gavin Jolliffe, Xtravirt
17  Using end-user IT analytics for better business decisions
    Poul Nielsen, Nexthink
21  Cloud migration is the leading concern of cloud adopters
    Datapipe

Security
25  Network security is in the details – the destruction of the Death Star
    Ian Whiting, Titania
28  Privilege gone wild: the state of privileged account management in 2015
    BeyondTrust
35  A brave new world: understanding the new EU data protection regime
    Jonathan Armstrong, Absolute Software and Cordery
37  Ahead in the cloud: how to keep your data secure
    Kevin McLoughlin, Wolters Kluwer
39  The five pillars of the Cyber Essentials Scheme
    David Parkinson, in association with the Wick Hill Group
41  The security challenge of the mobile workforce
    Samik Halai, Integral Memory

Technology & Innovation
43  IT monitoring and business productivity
    Dirk Paessler, Paessler AG
45  Testing, quality and the quest to meet customer expectations
    Darren Coupland, Sogeti UK
49  The CIO route to business hero
    Derek Britton, Micro Focus
53  Wireless “induction” charging – the definitive guide
    CMD Ltd

Learning & Development
57  Does risk matter? Disengaging from a ‘self evidently correct’ process
    Dr Elmar Kutsch & Dr Neil Turner, Cranfield School of Management

60  Advertisers’ Index

Today’s CIO
3
Foreword
Is the role of the CIO
expanding into other fields?
THE ROLE OF ANY SENIOR EXECUTIVE, ESPECIALLY AT
board level, in today’s working environment is changing
rapidly, and nowhere is this more apparent than in the
ever-expanding role of the company CIO. It is no longer
enough for the CIO to concern themselves only with
matters of IT; it is now crucial that they develop their
skills in many other areas. Finance, security, education,
and research and development are all areas in which a
board will expect a CIO to be knowledgeable and fully
informed. In a commercial world where profit is key, the
CIO is now regarded by their fellow senior executives as
the person with their finger on the pulse of which new
developments can increase a company’s earning potential.
As each new technical advance replaces existing
technologies with increasing rapidity, the prospect of new
markets opening up is critical to business development.
The rise of big data, cloud and mobile technologies has
contributed to this enhanced role for the CIO, as reliance
on their successful and uninterrupted operation is key
to a company’s continued profitability. Big data and the
cloud are the main talking points in company IT, and who
amongst us is not now reliant on their mobile phone? The
innovations of the last decade have been world-changing,
with new technologies emerging with each passing week.
What was viewed as cutting edge only a year ago can
already look dated.
It’s important, therefore, that this expanding role of the
CIO is reflected in the pages of Today’s CIO. As big data
and the cloud grow in importance, so do the means by
which companies are protected. For many companies it is
no longer acceptable to have a single firewall protecting
data; there must be a firewall protecting a firewall
protecting a firewall, ad infinitum. The security of data
is now as important as its storage, so being aware of the
pitfalls facing a company, from both external and internal
threats, also constitutes a major part of the CIO’s role.
As roles change and merge, it is increasingly important
that a CIO stays up to date across a variety of skills and is
always ready to face new challenges, as these will no doubt
prove vital to the continued success of a company and its CIO.
Data is everywhere.
But it’s what you see in your data
that makes the difference.
What if you could easily visualise data from varied sources, find hidden
relationships within your data and uncover meaningful insights that can change
your business? Qlik makes it simple to do just that. Imagine what you’ll discover.
qlik.com
Big Data
Do I really need to worry
about data centres?
By Simon Campbell-Whyte,
Executive Director,
Data Centre Alliance
WELL, THERE’S NO DOUBT THAT
your organisation’s shareholders and
stakeholders probably hope you do.
But why? Before we examine that, it’s
worth noting that it is not always clear
what the term “data centre” actually
means, or even whether you have one
under your control to worry about.
The EU Commission and national
governments everywhere are now
defining “computer rooms” and
“server rooms” as “data centres”.
So, to determine if this article
applies to you: a data centre could be
a multi-million pound facility the size
of a football pitch or, perhaps more
likely, a small dark room in the
basement or under the stairs of your
offices. Got one or more of those?
If yes, please read on.
The chances are that that small
server room, or whatever you choose
to call it, contains the critical IT
equipment your organisation needs
to function smoothly, and there has
been quite a revolution going on
behind that closed door in recent
years. Our businesses and economies
are now completely dependent on
data, and those that cannot access,
manage and secure data efficiently
will soon fall by the wayside.
But you might say, “Mine works
okay, so why should I worry?” You
may be lucky in that data centre
outages are rare in your organisation.
But one thing is for sure: when (not
if) an outage happens again, its impact
will be far greater than the last time.
So organisations really do need to
think of the data centre as probably
their most critical infrastructure; it
has quietly become one of the most
important functions of any business.
This is why your stakeholders and
shareholders hope you worry about
your data centre: they know your
business will stop dead the second
your data centre does.
So what should a CIO or business owner do? Protecting
against the apocalypse of an outage can be a firefight, so you
need to understand and manage the risk. It is not a job for
the technical department alone; it should involve all
departments, as they are now ALL users of the facility.
So you need to be asking questions and seeking advice.
Understanding the business’s requirements and dependencies,
and making sure your data centre strategy is aligned to
them, is the first step.
Governments now have data centres on their radar.
Although Information and Communications Technology
(ICT) is undoubtedly an enabler of energy-saving solutions,
no section of society or business is, or should be, immune
from scrutiny when it comes to energy efficiency and
environmental footprint, and IT is no exception.
You may be surprised to learn that your server room or
data centre is a significant energy consumer. In a typical
office environment, for example, up to 40% of the energy bill
will be consumed within the computer or server room. This
can quickly spiral out of control if not managed correctly:
servers idling all day and all night, when nobody is using
them, can run up huge energy bills.
Capacity shortages that can paralyse your organisation
are usually due to things running when nobody knows why,
or who needs them running. So you should ensure this is
measured and monitored as an absolute minimum; if you
don’t, you are flying completely blind. Legislation is on
its way to force organisations to deal with this issue, but
organisations can act right now by following best practices
that save money and energy and increase reliability,
negating the need for legislation. Schemes such as the
EU Code of Conduct for Data Centre Energy Efficiency
should be adopted; the alternative will probably be far
more complex, time-consuming and costly.
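To put the idle-server point in perspective, here is a back-of-envelope sketch in Python. The wattage, hours, tariff and PUE figures below are illustrative assumptions, not figures from this article:

```python
# Back-of-envelope estimate of what idle servers cost per year.
# All figures are illustrative assumptions.
def idle_energy_cost(servers, idle_watts=150, idle_hours_per_day=16,
                     price_per_kwh=0.12, pue=1.8):
    """Annual cost of servers drawing power while doing no useful work.

    pue (Power Usage Effectiveness) scales the IT load up to account
    for cooling and power-distribution overhead in the server room.
    """
    kwh_per_year = servers * idle_watts / 1000 * idle_hours_per_day * 365
    return kwh_per_year * pue * price_per_kwh

# 20 servers idling 16 hours a day comes to roughly £3,800 a year:
cost = idle_energy_cost(20)
print(f"~£{cost:,.0f} per year")
```

Even under these modest assumptions the bill is material, which is why measuring and monitoring is the absolute minimum.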
Fast-evolving technology and the sheer volumes of data
we now have to cope with are challenging enough, but one
mustn’t lose sight of the human element: nearly all data
centre problems can be traced back to people, so you need to
ensure your people are equipped with the right training
and skills, backed by a clear strategy. Security is another
area that should begin in the data centre: as the stakes
of cybercrime continue to rise, it is critical to ensure your
data centre has the correct access control policies.
So, OK: now I understand I need to look at the data centre
I didn’t know I had. What is the Data Centre Alliance doing
that can help me? Well, if you are not a data centre industry
company, or you are in the public sector, you can join for free
for a start, and get access to advice, information, best practices,
tools and resources. One tool we are developing, initially
for the public sector, called EURECA, provides a simple
questionnaire that identifies the energy and cost saving
opportunities within your data centre and recommends
both a self-help and a procurement roadmap. We will be
launching the project in March this year, assisted by EU
Commission funding. We also publish regular articles and
advice in our monthly journal and run regular knowledge-sharing
events. In summary, there is no doubt your “data
centre” is a sophisticated, mission-critical asset whose
health is inextricably linked to the future health of your
organisation. So don’t delay: it is time to find out who
holds the key to that locked door in your basement.
Author information
The Data Centre Alliance is a not-for-profit association
comprising leaders and experts from across the data
centre infrastructure sector. The DCA is committed to
developing activities and projects to further improve the
effectiveness of data centres. The DCA also promotes
awareness of the role of data centres and of innovations that
ensure the future growth and sustainability of the industry.
Membership is open to all operators of data centres,
however large or small, and organisations that supply the
industry with products and services.
Big Data
Clarity in a
data-driven world
As organisations embrace
virtualised and cloud-based
infrastructures, emerging
new tools help businesses
cut through the plethora
of data produced in their
environments to provide
actionable insights.
By Gavin Jolliffe,
CEO at Xtravirt
THERE ARE UNQUESTIONABLY
some real and present challenges
for IT organisations when making
use of the valuable data produced
in their environments.
Google’s CEO Eric Schmidt once
made a headline-grabbing statement:
“Every two days now we create as
much information as we did from the
dawn of civilization up until 2003.”1
That’s something like five exabytes
of data – quite impressive. From
IBM and along similar lines: “90%
of all data in the world was created
in the past 2 years alone.”2
These are big statistics, and arguably
a chunk of it may be no more than
‘fumes’. However, the raw sentiment
is simply a reflection of some of the
driving change being observed in IT
today, and helps to orientate us to
either a challenge or opportunity.
Working in a business whose role is
to assist organisations to transform,
modernise and optimise their cloud
and workspace infrastructures, we
meet hundreds of clients of all sizes
and industry verticals, and experience
their challenges first hand.
To clarify, by infrastructure I mean
the core platforms on which IT services
are delivered to the business. This
could be a virtualised or private/hybrid
cloud platform, running their servers,
desktops, applications and all the parts
of the ‘technology stack’ in between.
Two areas that we naturally gravitate
towards when it comes to common
challenges are information and change.
Why? Information is used by the IT team
to make decisions, so it needs to be
as accurate as possible. And change,
because in the era of ‘software-defined’
and ‘IT-as-a-service’, change is the
constant, which comes with distinct
benefits and challenges. Fundamentally
it still comes back to having good
information, and the challenge is
how to effectively get hold of it.
“Most companies estimate they’re
analysing a mere 12% of the data
they have, according to a recent
study by Forrester Research. Is
this good or bad? Well, these firms
might be missing out on data-driven
insights hidden inside the 88% of
data they’re ignoring. Or perhaps
they’re wisely avoiding a resource-gobbling, boil-the-ocean strategy.”3
Why do I need to be informed?
There’s an age-old expression: “You
don’t know what you don’t know.” And
when your competitors out there are
bent on chipping away at your market
share, ignorance is definitely not bliss.
Some of the market challenges faced
by organisations within virtualisation
and cloud currently include:
1. How to gain early-mover advantage
in creating agile infrastructure
in a changing landscape
The move towards the software-defined
technology operation is underway,
with virtualisation forming the
foundation for IT-as-a-service and cloud.
2. Increasing convergence of IT
The progressive unification of
operations is driving a greater
need to have automated insight
across IT infrastructure, including
aspects such as management,
security, and compliance. This is
also driving changes in information
sharing and accountability
across different disciplines.
3. Increased pressure on time
IT management and operations
need information, but also need to
cut through information overload.
The aim is to simply obtain a quick
understanding of the operations
environment from a technical and
business perspective, and identify
what action needs to be taken
and who should be involved.
However, many organisations are finding that a lack of
analytics tools and existing data silos are two reasons
companies ignore the vast majority of their own data; on top
of that, it is often hard to know which information is valuable
and which is best ignored.
It’s a modern world. This stuff is easy, right?
In our experience the pace of technology advancement
is making it easier to build and deploy smart, scalable
IT solutions but, once in place, the ability to sustain the
move beyond simply ‘keeping the lights on’ is arguably
becoming more challenging for many. Reasons include:
• Lack of resources: Many best-practice management
procedures, such as keeping up-to-date configuration
documentation, are not routinely undertaken due to
lack of resources, increasing the risk of an issue
affecting the performance and stability of the environment
• Administration burden: Analytics and standard reporting
are often considered mundane tasks, so there is a
lack of motivation to complete them
• Information overload: Use of multiple tools across
infrastructure silos makes it difficult for IT teams
to quickly pinpoint important information
• Adapting to a new environment: The pace of change
in a virtual or cloud environment stretches the
management capabilities of IT operations and their
ability to proactively detect hidden issues
So what are organisations doing today?
This may not fully reflect all ends of the market, but it’s not
uncommon to find elements of each of these equally present
in 500-seat organisations as well as 50,000-seat enterprises.
• Using free scripting tools: These are usually adequate
for summarising technical information but don’t tend to
present data in the context of management requirements,
nor do they clearly articulate remediation steps; there are
also risks around the person who created them moving
on, in addition to the need for regular maintenance
• Running system and point monitoring tools: These are
good at providing a wealth of detailed information, but IT
can be left implementing and maintaining an expensive
piece of infrastructure, where just the basic set-up
already takes a lot of time. Some tools produce a lot
of data which then has to be examined to understand
its value, followed by translating that information for
different parts of the technical team and for the business
management team, which can be time-consuming
• Asking the team to report manually: This is inherently
inefficient, and adding time-consuming management tasks
to the IT team’s workload can be counter-productive
• Maintaining inadequate IT infrastructure policies:
The end result is undocumented environments,
or no regular assessments of the health of the
virtual platforms, all of which increase risk
Where is the future going?
Monitoring does not replace management, although
the two can overlap depending upon the context. There
are many great tools out there used to operate IT
infrastructures, but often there’s a blurring of the lines
between monitoring and management. On the one hand,
monitoring has its roots in a history which talks about RAG
consoles, and reacting to issues once they’ve occurred;
however, that’s not to say there haven’t been some great
developments in predictive analysis and self-learning/
healing to make the system administrator’s life easier.
On the other hand, management aims to be proactive,
looking for insights within technical data to aid cross-team
planning, project management, and business discussions
and reporting. These insights abstract and simplify the
information to drive clarity of decision-making, sustain
standards and good practices, avoid inconsistency,
reduce risk and improve service levels. In many cases, all
of this could reduce the need for some of the heavy duty
tools needed to keep the lights on in the first place.
The gap is undeniably there for many organisations, so
where next?
Over the past 5 years we’ve seen the rise of big data,
the debates about moving from data warehousing to
new services such as Hadoop, the impact on science and
quantum physics, and the rapidly increasing relevancy
of data scientists as a mainstream career choice.
However, we’re now also seeing an expansion to the
mass market with an emerging new breed of tools that
enable more accessible consumption of truly useful
information. These tools help IT address the challenges
of the new dynamics being presented by virtualisation
and cloud, and in many cases significant improvements
can be made just by doing the basics better.
References
1. http://techcrunch.com/2010/08/04/schmidt-data/
2. http://www.ibm.com/smarterplanet/uk/en/business_analytics/article/it_business_intelligence.html
3. http://www.informationweek.com/big-data/big-data-analytics/10-powerful-facts-about-big-data/d/d-id/1269522?image_number=2
Author information
Gavin Jolliffe is a seasoned technology industry
entrepreneur and serial innovator with over 15 years’
executive and operational experience. Launching Xtravirt
in 2007, he led its rapid growth through
innovative software development. Gavin has since built
an industry-leading consulting organisation which is
recognised for its expertise in software-defined data centre
and cloud solutions.
Xtravirt delivers cloud, workspace and data centre
transformational solutions to clients across public and
private sectors. Its consulting organisation is recognised
globally for astute management, sound methodology and
a proven track record, which provide unsurpassed value to
Xtravirt’s clients.
xtravirt.com
Delivering the action plan
for the data centre industry
Become a member today...
www.datacentrealliance.org
+44 (0) 845 8734587
Big Data
Using end-user IT analytics
for better business decisions
Poul Nielsen,
director of strategy,
Nexthink
BUSINESS IT INFRASTRUCTURES ARE
becoming increasingly complex, with
ever more devices and applications
being added on a daily basis due to
phenomena such as bring your own
device (BYOD) and the Internet of
Things. The technology decision-makers
charged with managing these
infrastructures are also being asked
to constantly improve the service
that they deliver to their end-users,
despite often having to rely on
back-end, silo-based monitoring systems
that may no longer be fit for purpose.
As a result, end-users find
themselves experiencing problems in
the day-to-day performance of their IT
systems and services. This can result
not only in a drop in productivity but
also in a concerning lack of faith in
their organisation’s IT support team.
Forrester Consulting has compiled
a report1 which profiled IT decision-makers
in the UK and US to evaluate
how the performance of their
organisations’ IT services impacted
their end-users and businesses.
The report’s key findings included
the concern that an organisation’s
IT service desk was usually the last
to know about any performance
problems affecting employees. With
back-end monitoring tools unable to
detect these performance issues,
end-users are either looking to
colleagues for help, or attempting
to solve the problems themselves.
The increasing complexity of IT
service portfolios, and the variety
of applications crossing multiple
technology silos, were found to
be complicating matters further,
and a need was recognised for
new tools and better analytics to
provide service teams with a view
of performance data over time
from an end-user’s perspective.
Complexity and the
need for visibility
According to the report, almost half of
IT decision-makers (43%) considered
the complexity of their IT service
portfolio and the lack of full, clear
visibility into IT performance from
the end-users’ perspective as two
of the most significant barriers to
delivering better service quality.
Increasing levels of complexity lead
to claims that IT support staff lack
adequate end-to-end visibility of any
IT services that span multiple parts
of their organisation’s infrastructure.
Indeed, more than a third (36%) of
respondents considered this to be
a significant barrier to providing
higher levels of service quality.
The survey revealed that IT support
staff were spending three quarters
of their time (73%) troubleshooting
issues such as these, relating to
back-end infrastructure, and particularly
those issues that involved multiple
aspects of the infrastructure.
Significantly, the survey reveals that
these multi-factor issues, with end-users
experiencing problems affecting several
parts of their organisation’s IT infrastructure, were regularly
being encountered by nearly all IT decision-makers (98%).
The frequency of such issues can be viewed as being
responsible for a form of silo mentality, with 90 per cent of
respondents reporting coming across situations where they
were required to rule out issues specific to a particular device
from issues relating to the wider infrastructure. Respondents
claimed that they spent nearly as much time troubleshooting
problems with infrastructure as they did with problems
directly affecting end-user devices. As a result, 83 per cent
claimed that they found the situation made it difficult for
them to effectively diagnose issues around IT service quality.
It’s perhaps little wonder then that IT support can
often be perceived as being largely reactive in nature.
By way of illustration, the report shows that, in more
than a third of cases (35%), IT support teams were made
aware of a performance or availability issue only when
a ticket was raised by the end-user affected, rather
than being alerted by a back-end monitoring tool.
Whilst the technology does exist to provide insight
into business service availability and performance from
the perspective of the infrastructure, it’s unable to see
what individual end-users are actually experiencing. In
fact, only two per cent of IT decision-makers admitted to
currently employing any form of monitoring tool able to
detect downtime from an end-user’s perspective, and that
could offer insight into its possible effects on employee
productivity, customer satisfaction, and revenue.
End-users taking matters into their own hands
The report reveals that, when an end-user experiences an issue
with application performance, 50 per cent of the time they have
either had a bad experience with IT support in the past, don’t
believe that their IT department will be able to solve the problem,
or expect that IT will send them somewhere else for assistance.
With statistics such as these, it’s not hard to understand why
end-users may have lost some degree of confidence in their
IT support teams.
According to workforce data from Forrester, almost 45
per cent of employees will attempt to solve an IT issue on
their own before they request assistance from the helpdesk.
The same survey reveals that 17 per cent of workers claimed
that they didn’t believe that their IT support team was
able to resolve their performance issues or answer their
technical questions. A further 16 per cent said that they
turned to search engines to find an answer for themselves,
while another 14 per cent admitted to calling the software
or hardware manufacturer’s helpline for assistance.
Further evidence of this perceived lack of faith in
IT support teams can be found in the statistic that
almost a third (30%) of an organisation’s employees
will seek help and advice from colleagues and peers
who aren’t employed in any IT-related function.
Making sophisticated analytics a priority
To combat this lack of faith, and to allow them to take a
significantly more pro-active approach, IT support desks
require more sophisticated monitoring and analytics
tools that will enable them to detect performance and
service issues experienced by their end-users – often
before the users themselves are aware of them.
Indeed, Forrester’s Forrsights Priorities and Journey Survey
2014 reveals that, when questioned on their priorities for
the next 12 months, more than half of the IT decision-makers
surveyed (52%) stated a need to improve the use of data
and analytics for better business decisions and outcomes.
Additionally, a third of respondents (32%) recognised
the importance of connecting the products and assets
across their technical estate so that they would be in a
better position to monitor and analyse their identity,
location and condition. This visibility, which end-user
analytics is capable of providing, will then enable an
organisation’s IT support staff to identify, understand
and pro-actively address issues at the end-user level.
Making an investment in end-user analytics will enable
an organisation’s IT department to begin improving
the quality and performance of the service it delivers.
IT analytics should be a top priority for IT leaders,
providing the missing component for end-to-end visibility
to proactively make better business decisions.
By routinely monitoring the IT infrastructure,
automatically generating meaningful reports, and
identifying and resolving issues the moment that they
occur, analytics such as these will deliver actionable
insight into whether the quality and performance of the
service meets the needs of the organisation’s end-users.
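The kind of end-user-perspective detection described above can be sketched very simply: compare each endpoint’s latest response time against that endpoint’s own historical baseline and flag sharp deviations. This is an illustrative toy, not Nexthink’s actual method; the data shape and threshold are assumptions:

```python
from statistics import mean, stdev

def flag_degraded_endpoints(samples, min_samples=5, z_threshold=3.0):
    """Flag endpoints whose latest response time deviates sharply
    from that endpoint's own historical baseline.

    samples: dict mapping endpoint name -> list of response times (ms),
             oldest first; the last entry is the most recent measurement.
    """
    degraded = []
    for endpoint, times in samples.items():
        history, latest = times[:-1], times[-1]
        if len(history) < min_samples:
            continue  # not enough data to establish a baseline
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (latest - mu) / sigma > z_threshold:
            degraded.append(endpoint)
    return degraded

samples = {
    "laptop-042": [110, 95, 105, 100, 98, 102, 480],    # sudden spike
    "laptop-007": [120, 118, 125, 122, 119, 121, 123],  # normal
}
print(flag_degraded_endpoints(samples))  # ['laptop-042']
```

A check like this runs per end-user device, which is exactly the perspective the report found back-end monitoring tools unable to provide.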
Ultimately, by applying this insight, IT teams will
become more effective and efficient in delivering projects
and allocating the budget necessary to improve those
issues with service quality that are likely to have the
greatest effect on employee productivity, customer
satisfaction, and the organisation’s bottom line.
Reference
1. Report commissioned by Nexthink.
Further information
About Nexthink
See your IT infrastructure as never before with Nexthink
end-user IT analytics for security, ITSM and workplace
transformation. Our software uniquely provides IT
analytics covering all endpoints, all users, all applications
and all connections, all the time and in real time, giving
new visibility into your IT infrastructure and service delivery.
Nexthink helps IT Departments connect, communicate and
collaborate to achieve their major goals and to optimize
endpoint security, operations, support, and workplace
transformation projects. Nexthink real-time analytics
and visualization extend help desk, server monitoring,
APM (application performance management) and PCLM
(PC lifecycle management) tools and provides essential
visibility for IT governance.
To learn more visit www.nexthink.com
+44 753-946 6702
Big Data
Cloud migration is the leading
concern of cloud adopters
Datapipe recently surveyed IT
infrastructure specialists and CIOs,
asking them to highlight their biggest
concerns about adopting cloud IT
solutions for their business.
The number one concern cited was
the difficulty of cloud migration
Migrating to the cloud can indeed be difficult if you
don’t have the right partner. By enlisting the help of
an experienced and reliable third-party cloud services
provider, companies can migrate to the cloud quickly
and avoid roadblocks. Find a managed service provider
who is equipped to manage complex migration needs
and execute a seamless transition to a cloud or hybrid IT
solution: someone who offers a guaranteed safe passage,
assessing your needs and implementing a customised
programme to ensure that no data is lost or compromised.
Loss of control over services and/or data was the
second largest concern around cloud adoption
Losing the keys to your data to a third party is a big
concern for companies that need to be able to access
their resources quickly. Traditionally, businesses would
have to provide their managed services provider with API
keys and administrator credentials so that the provider
could manage the network. The right partner, one who
truly understands public cloud infrastructures, can
circumvent this necessity. By using AWS Trust
Relationships and Security Token Service (STS) software,
your partner should be able to run your system effectively
without needing the keys, allowing companies to retain
complete control over their virtual infrastructure and data.
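As a rough sketch of that pattern: the customer creates an IAM role whose trust policy names the provider’s AWS account, and the provider requests short-lived credentials from STS instead of holding permanent keys. The role ARN, account ID and session name below are placeholders, not any provider’s actual setup:

```python
def build_assume_role_request(customer_role_arn, session_name,
                              duration_seconds=3600):
    """Parameters for an STS AssumeRole call, e.g. passed to boto3 as
    sts_client.assume_role(**request). The credentials STS returns
    expire after duration_seconds, so nothing permanent changes hands.
    """
    return {
        "RoleArn": customer_role_arn,
        "RoleSessionName": session_name,
        "DurationSeconds": duration_seconds,
    }

# Placeholder ARN: the customer's account (111122223333) owns the role,
# and its trust policy names the provider's account as principal.
request = build_assume_role_request(
    "arn:aws:iam::111122223333:role/ManagedServiceProviderRole",
    "msp-maintenance",
)
print(request["RoleArn"])
```

Because the role lives in the customer’s account, the customer can revoke the provider’s access at any time by editing or deleting the trust policy, which is what “retaining complete control” means in practice.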
The third largest concern was the belief
that cloud systems are too expensive
Considering the velocity at which IT is changing and
progressing, keeping solutions in-house is expensive.
Traditionally, it made sense for companies to invest heavily
in private data centres where their own IT specialists
could manage hardware and software solutions.
With the introduction of the five-minute data centre, firms
can now create and destroy information in a matter of minutes
on an IaaS platform, and housing such an expanse of data on
traditional infrastructure makes increasingly less sense.
[Chart: Top concerns around cloud adoption]
So, use cloud solutions and IT experts who are flexible and
move with the market, so that your business is never on the
back foot when it comes to IT. By assessing and implementing
the correct solution for your business, you can ensure
efficiency: pay for the storage that you need, without
overpaying for unnecessary provisions. Use a solution that is
agile and nimble enough to move with you, so that your price
plan reflects the storage and resources you use.
Accountability and privacy are a worry in the cloud
With the recent barrage of hacking scandals and breaches,
there is a lot of discussion around the threat of cyber
criminals to cloud technologies. One of a company’s biggest
security threats comes from employees who fail to use
best practices and unintentionally expose their data to
outside threats. It is imperative that you have a formal plan
to build on security best practices that includes two-factor
authentication, role-based access and credential security.
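Of the best practices listed above, two-factor authentication is the easiest to illustrate in code. One common second factor is a time-based one-time password (TOTP); the sketch below implements the standard RFC 6238 algorithm using only the Python standard library, as an illustration rather than a production authenticator.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    counter = for_time // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: key "12345678901234567890", T = 59 seconds
print(totp(b"12345678901234567890", 59))  # 287082
```

A server that stores each user's shared secret can verify a submitted code against the current time window, so a stolen password alone is no longer enough to log in.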
A lack of non-repudiation, meaning no accountability if a
problem does arise, is a concern when businesses outsource to
third parties. Business owners need to track precisely the actions
of all users. These actions should be tied back to unique user
names and be visible to both the business and the third party
managing the cloud services. Once all system access and
activities are monitored through unique user names, you
can react to suspicious activity quickly and effectively.
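The monitoring model described above, every action tied to a unique user name and visible to both parties, can be sketched as a simple audit log. The user names, actions and "out of hours" rule below are hypothetical illustrations, not a product's API.

```python
import datetime
from dataclasses import dataclass, field

@dataclass
class AuditLog:
    """Append-only record of user actions, keyed by unique user names,
    visible to both the business and the managed-services provider."""
    entries: list = field(default_factory=list)

    def record(self, username: str, action: str,
               when: datetime.datetime) -> None:
        self.entries.append({"user": username, "action": action, "when": when})

    def suspicious(self, known_users: set) -> list:
        """Flag actions by unrecognised user names, or outside 07:00-19:00
        (an assumed business-hours policy for this sketch)."""
        return [e for e in self.entries
                if e["user"] not in known_users
                or not (7 <= e["when"].hour < 19)]

log = AuditLog()
log.record("alice", "download customer-db snapshot",
           datetime.datetime(2015, 4, 1, 3, 12))   # 03:12 - out of hours
log.record("bob", "rotate API credentials",
           datetime.datetime(2015, 4, 1, 10, 0))   # normal working hours
flags = log.suspicious(known_users={"alice", "bob"})
```

Because every entry carries a unique user name, a flagged action can be traced straight back to an individual, which is the basis for reacting quickly to suspicious activity.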
When to turn to a multi-cloud strategy
for enterprise IT
By this point, most enterprises are firmly committed
to cloud services. This technology has yielded
tremendous benefits, from cost savings to improved
Equinix LD6 Smart Data Centre uses
Oasis™ Indirect Evaporative Coolers
Equinix, one of the world's largest data
centre and Internet exchange operators, is
set to open its sixth London data centre,
LD6, in Slough.
The company will use Munters Oasis
Indirect Evaporative Coolers at both sites
to achieve world leading sustainable server
climate control, meeting the demand from
Equinix’s customers in financial services,
cloud and enterprise segments. LD6 aims
for Leadership in Energy & Environmental
Design (LEED) accreditation at platinum
level.
Forty Oasis IEC 200 systems, including four
MUA units, are at the heart of Equinix's air
treatment design for LD6 Slough, and will
contribute towards lower energy consumption
and a smaller carbon footprint. Munters
Oasis IEC 200s will supply over 8MW of
cooling for the IT load in the first phase of
LD6. The new LD6 data centre will provide
8,000 square metres of capacity, equivalent
to housing 2,770 server cabinets.
• High efficiency cooling
• Annualised cooling pPUE of 1.06
• Fully separated air
• Use of any fresh water type
• Annualised project PUE of 1.2
Equinix's investment in the LD6 data centre
utilises Oasis's award-winning, patented
indirect evaporative heat exchanger and
100% natural ventilation.
“LD6 is a hugely exciting project; the facility
will be the most advanced data centre in
the UK. We are committed to providing
continuous improvement for our customers
and to setting new standards in efficiency
and sustainability,”
said Russell Poole, managing director of
Equinix UK.
Full case study: www.munters.com/equinix
[email protected]
+44 1480 410223
flexibility and beyond. Cloud sceptics have been
won over and cloud proponents proven right.
With the cloud’s value firmly established, the challenge
that enterprise IT decision-makers now face is determining
how to develop an optimised approach for maximising
cloud benefits. Increasingly, these leaders are coming
to realise that the answer often lies in a multi-cloud
strategy. In many scenarios, multi-cloud deployments will
deliver the greatest results for enterprises of all kinds.
Here are three of the situations where a multi-cloud
strategy will make the most sense for enterprise IT.
1. Decreased reliance
One of the most common and significant reasons why
enterprises may want to turn to a multi-cloud deployment
is to reduce their reliance on any given cloud services
vendor. In a traditional private or public cloud approach,
the company will simply move all relevant operations and
assets into a cloud environment hosted by a single given
vendor. This is also frequently, although not always, the
case in a hybrid approach that combines elements of
both private and public cloud into a single solution.
In all of these scenarios, the enterprise will inevitably end
up heavily reliant on whichever cloud vendor it has partnered
with. This may not be much of a risk assuming the enterprise
is leveraging an exceedingly reliable service provider, but
there will always be a chance of disruptions and outages.
For many enterprises, such risks are unacceptable.
By turning to a multi-cloud deployment, enterprises can
effectively avoid putting all of their eggs in one basket,
as they will never be fully dependent on a single cloud
service or vendor for their hosted needs. As a result,
companies can achieve a greater degree of autonomy.
2. Diverse needs
Another key reason for enterprise IT to pursue multi-cloud
deployments is to satisfy diverse cloud needs. While there
are many expansive cloud platforms on the market today,
there will still be many situations in which organisations
simply cannot meet all of their IT goals through a single cloud
solution. Or, more commonly, the ideal option for one cloud
need will differ from another. For example, an enterprise
may want to take advantage of AWS services for storage
and Azure for Internet-facing public Web applications.
A multi-cloud approach enables the business to utilise
both of these resources, whereas a single cloud strategy
will force the firm to go all-in on AWS or Azure exclusively.
Multi-cloud’s best-of-breed approach offers the
potential for optimised performance and outcomes.
3. Test and see
Finally, enterprises may turn to multi-cloud strategies as a
means of conducting trials of a number of different cloud
options. While the cloud adoption process has become
much simpler over the course of the past few years,
there is still a fair amount of time and effort required
when embracing a new cloud solution. Doing so on an
organisation-wide scale is an even more massive endeavour. Before fully committing to such an expansive, all-encompassing cloud strategy, enterprises may want to see
how well different cloud services function in various parts of
the organisation. In essence, this approach uses multi-cloud
deployments as a means of testing out the different cloud
solutions on the market prior to making a comprehensive
shift to a singular approach. While certainly not necessary
for every enterprise, this strategy can offer major benefits
for large, high-stakes cloud migration plans.
Further information
Datapipe architects, deploys and manages multi-platform
hybrid IT solutions for the enterprise, including traditional
IT, public cloud, private and hybrid clouds. Datapipe offers
unparalleled SLAs for enterprises looking to deploy
mission-critical IT infrastructure in both public and
private cloud environments, including AWS and Azure.
Datapipe delivers operational excellence globally through
a team of experienced professionals, best-of-breed IT
management applications and next-generation data centres.
As such, Datapipe is recognised by Gartner as a leader in
cloud-enabled managed hosting and by Forrester as a
global leader in Hosted Private Cloud solutions.
Mark Underwood
Vice President EMEA – Datapipe
[email protected]
+44 20 7515 2555
http://www.datapipe.com
https://twitter.com/Datapipe
Security
Network security is in the details –
the destruction of the Death Star
By Ian Whiting, CEO of Titania
THERE ARE A GREAT MANY SECURITY LESSONS HIDDEN
in the plots and sub-plots of Star Wars: data security,
hackers-for-hire, user error and so on. But what better suits
the information security industry than the striking
moment that saw the Death Star explode into glittery
stardust? A chain of vulnerabilities and risk mismanagement
ultimately led to the unthinkable: the destruction of the
Empire's superweapon through an exhaust vent vulnerability.
There is a case to be made that network security lies
in the detail, especially with the rise of the advanced
persistent threat and the development of cyberespionage
worldwide. Criminals acting in the virtual space have long
renounced the generic approach and have instead adopted
a highly targeted crime deployment. Security measures
must come to reflect this shift. For this, Star Wars shows
us how attention to detail can be equally applied to your
organisation for a more efficient defence of the network.
Advanced persistent threat:
operation “Death Star”
The Death Star was an impressive military and political
superweapon designed to annihilate entire planets.
Yet in spite of its mightiness, the Death Star's defence
was surprisingly vulnerable to attack: one small
weakness led to a devastating end result. An assessment
of its vulnerabilities was long overdue, and might
have been a chance to re-write Star Wars history.
1. Network reconnaissance
Rebel spies led by Princess Leia manage to get possession
of the Death Star’s plans, but their ship falls to the Imperial
forces. Leia alone cannot analyse the information she
retrieved. Instead she finds a way of transmitting the data
back to her father’s home planet of Alderaan for further
investigation, by storing the plans in the memory of R2-D2.
At this stage, Leia is captured by the Empire. For the time
being, the Empire is unaware of the purpose of her mission;
the princess insists she is there on diplomatic business.
Malware with backdoor capacities can infiltrate a
network and remain undetected for years, while leaking
information. For example, the SEDNIT infectors used in
Operation Pawn Storm were mainly backdoors designed to steal
system information and send it to remote C&C servers.
Another example is the highly modular Snake (aka
Uroburos) rootkit, which evidence suggests went
undiscovered for at least three years; its ability to
hibernate for days at a time made it untraceable
even to professional eyes.
2. Outsourcing – “Hacking-as-a-Service”
Leia’s stolen plans reach the hands of Luke and Obi-Wan
Kenobi who decide they must follow Leia’s instructions and
reach Alderaan. Luke and Obi-Wan need extra assistance
so they contract the services of mercenary Han Solo, who
can transport them on his ship, the Millennium Falcon.
A coordinated cyberattack can involve multiple actors
taking part, to accomplish various roles along the way. The
underground forums of criminal activity are rife with hackers
of various skills and knowledge that offer their services.
Off-the-shelf tools are also popular either on a one-off
basis or as a contractual service, including updating and
maintenance work. The Silver Spaniel campaign, uncovered
in 2014, was relatively simplistic: it built no software of
its own, outsourcing commodity tools available on hacking
forums instead. The attack required little technical
skill, yet it provided scam artists with a prosperous business.
3. Response SIEM – quarantine and counter-attack
The Millennium Falcon has to re-route, in order to reach the
rebel base Yavin 4, as Alderaan was destroyed by Grand Moff
Tarkin in a demonstration of the Death Star's capabilities.
However, the Millennium Falcon gets captured by the Star’s
tractor beam and brought into its hangar bay. When escaping,
the ship manages to evade the Death Star, but at this
point it carries a tracking device which enables Tarkin and
Darth Vader to monitor them all the way back to Yavin 4.
Network defence approaches focused on threat
identification and event management (SIEM) would at
this stage identify a breach and trigger security alerts.
An alert system would provide the CISO with the choice
of further monitoring or ignoring the threat. We see that
Tarkin and Vader choose to monitor the Falcon and
track it back to base. Yet, without a comprehensive risk
management view of the Death Star’s vulnerabilities,
they ignore the possibility that the rebels would “dare”
target the core of the Star and fail to secure the ports.
4. The attack vector
The Falcon finally reaches its destination and the plans
are handed over for analysis. The examination reveals a
vulnerability in the exhaust port that connects to the station's
main reactor. Once the weakness is identified, an attack
mission is set up and Luke joins the assault squadron.
In 2014, The Mask (El Careto) was revealed as one of
the "elite" APTs. Its deployment against carefully selected
targets included monitoring infrastructure, shutting down
operations, and avoiding detection by wiping, rather than
simply deleting, log files. Its purpose was cyberespionage,
and its attack vector was a combination of social engineering
and rare exploits for Java, Chrome, Firefox and other browsers.
Campaigns like The Mask show us that the wide range
of tools and the extensive pre-planning work conducted
before setting up the attack vector remain the most
unpredictable part of the threat. Security and risk
managers are often unaware of the “open ports” and
struggle to discern between critical and minor threats.
An auditing process with clear flags for threat level is the
only way to ensure that malicious actors cannot assess
your network more efficiently than you can.
5. Exploit
After a number of battles, Luke, assisted by the Force and
guided by Obi-Wan's spiritual advice, is able to fire proton
torpedoes into a small thermal exhaust port along the
Death Star's equatorial trench. This leads to the memorable
image of the Death Star exploding into space.
The BlackPOS family that ultimately led to the Target
breach is a good example of the destructive effects
that an undetected vulnerability can have on the
security of a network and, ultimately, on the reputation
of an organisation. It is now known that the BlackPOS
campaign operated through three different strains of
malware, all following a similar pattern:
infiltration, memory scraping and exfiltration.
Target did have a security team in place to monitor
its systems around the clock. Hackers managed to
avoid detection while setting up their malware, but
when they proceeded to the final stage – uploading the
exfiltration malware – alerts went off in Target’s security
department and then…nothing happened. The alarm
was triggered early enough, before any data got leaked,
yet the security operations centre chose to ignore it at
that stage. The reasoning has never been disclosed.
As we see earlier in the film, despite being aware of
the thermal exhaust port, the Empire had deliberately
not taken steps to secure it. The reasoning can be
inferred from their conversations: the port was deemed too
insignificant and too dangerous a target for the rebels.
There is an important point to make here: regardless
of a network's security systems, and even of quarantine or
counter-attack measures, there is also a great need
for a healthy auditing practice, in order to identify your
weaknesses before attackers get the chance to exploit them.
The final facilitator of Princess Leia's and then Luke
Skywalker's success was the Empire's failure to design
a proper risk management framework.
The accounts of many breaches provide sobering
lessons in how organisations can have wide-ranging "big
picture, big budget" defences but leave vulnerabilities
in everyday housekeeping. With the Death Star it was
an exhaust vent; with your organisation it might be an
out-of-date firewall, or a default password that was not
reviewed during your last pen-test. Monitoring
the details can make the difference between a secure
empire and an embarrassing and very public explosion.
The words of General Dodonna, upon analysing
the smuggled plans, can be the words of any hacker
assessing the entry points of your network: “Well, the
Empire doesn’t consider a small one-man fighter to be
any threat, or they’d have a tighter defence.”
Further information
Ian Whiting is the CEO of Titania, a cybersecurity auditing
software provider. Ian has dedicated the past 15 years to
working with leading global organisations and government
agencies to improve computer security. His efforts were
recognised by the industry when he was awarded the
"Personal Contribution to IT Security" title at the
Computing Security awards ceremony.
In 2009 Ian Whiting founded Titania with the aim of
producing security auditing software products that can be
used by non-security specialists and provide the detailed
analysis that traditionally only an experienced penetration
tester could achieve. Now used in over 60 countries
worldwide, Titania's Nipper Studio and Paws Studio auditing
software have earned numerous awards and wide industry
recognition.
Website: www.titania.com
Tel: +44 (0) 1905 888785
Twitter: @TitaniaLimited
Privilege gone wild:
The state of privileged account management in 2015
By BeyondTrust
EACH YEAR, AN INTERNATIONAL SURVEY[1] IS CONDUCTED
to identify trends in the privileged account management
market. The survey explores how organisations view the
risk from privileged account misuse (either malicious
or inadvertent), as well as trends in addressing and
mitigating the risks. Over 700 information technology
professionals participated in the 2015 survey,
representing organisations in retail, government,
education, manufacturing and technology markets.
Privileged account management was particularly important
to organisations participating in the survey this year. You
don’t have to look much further than the Sony breach and
other recent data breaches to understand why. The use of
stolen credentials is the most prevalent attack vector. From
an adversary’s standpoint, stolen privileged credentials are
the perfect vehicles from which to execute an attack.
However, stolen credentials are not the only risk from
elevated privileges. Malicious insiders may use their privileges
to access, share or steal sensitive data. Inadvertent abuse is
also a concern. Employees with elevated privileges may access
sensitive data out of curiosity, move data to unauthorised
cloud storage for use off-network, or install unauthorised
software such as file sharing applications that put data at risk.
User and account management is information
security at its most basic level. Best practices provide
users with unfettered access to the data and programs
required for their jobs, while preventing other actions
that might introduce risk to the organisation.
With multiple stakeholders, the ability to
view the information needed for each role in the
organisation is critical. Unified reporting was cited
as a critical feature by 56% of the respondents.
There are gaps in privileged account coverage,
but companies are concerned about costs
While their intentions are good, many organisations have some
work to do. Over 47% of respondents reported that users in
their organisations possess elevated privileges not necessary
for their roles, and 20% report that over three-quarters of their
user base run as administrators. One third (33%) of the
organisations participating in the survey report having no
policies (much less controls) for privileged password
management. This greatly increases the organisation's attack
surface across which malicious or accidental breaches can occur.
The perceived cost of purchasing, implementing and
managing the privileged account management solutions may
be a deterrent to faster adoption. Respondents cited CA,
Dell, and CyberArk as having the highest perceived price.
Survey results
As one might expect from such a diverse set of
industries, opinions differed on many of the particulars
about the need, deployment and rollout of privileged
account management. Despite their many differences,
five key points came through across all segments.
Risk is recognised, and control is viewed as a
cross-functional need
While security teams drive privileged account management
purchases in 82% of respondent companies, privileged
account management risk crosses functional boundaries.
IT is responsible for maintaining systems, and Security
is responsible for the integrity of the information. It
therefore makes sense that compliance (57%) and
operations (43%) are also involved in decision-making
and management. It is also encouraging to note that over
82% of the respondents report that their organisations’
IT operations and security teams “work well together.”
Privileged user risk is expected to increase
84% of respondents believe the risk to their organisations
from privileged users will increase over the next few years.
The greatest perceived risk is to business information,
including corporate intellectual property, source code,
design documents, trade secrets, and compliance-related data such as PII (42%). This makes sense given
the increase in media coverage of corporate espionage,
which is typically viewed as a risk from insider abuse.
Need for greater accountability over privileged passwords
The survey shows that organisations believe there is an
opportunity to improve their controls over privileged user
accounts. The common method for managing privileged
accounts is through shared passwords. Just over half (51%)
of the respondents state that shared passwords are managed
“individually.” This could include users sharing passwords on an
ad hoc basis, or simply by memory. 35% indicate that shared
passwords are controlled “locally,” including spreadsheets,
password vaults, SharePoint, and Active Directory.
Inadequate controls over business-critical,
tier-1 applications
In the current environment, business critical, tier-1
applications are attractive targets for adversaries. For those
systems (for example Linux and UNIX servers), 58% of the
respondents believe their current controls against misuse are
inadequate, immature or non-existent. Clearly, addressing this
shortcoming should be a priority for these organisations.
Address the risk of privileged password misuse
Deploy controls that provide system administrators (and
others) with unique, time-limited privileges that can be
controlled at a granular level and monitored. Most important,
maintain a detailed audit trail for spot-checking and
compliance reporting. In the event of a breach, audit logs
can be used for forensic analysis and accelerate incident
response, limiting damages to corporate assets and brand.
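The controls described above, unique, time-limited privileges plus a detailed audit trail, can be sketched as follows. This is a toy model for illustration, not any vendor's API; the user name, privilege string and TTL are invented.

```python
import time
from dataclasses import dataclass

@dataclass
class Grant:
    user: str
    privilege: str       # e.g. "sudo:restart-webserver" (hypothetical)
    expires_at: float    # epoch seconds after which the grant is void

class PrivilegeBroker:
    """Issue unique, time-limited privileges and keep an audit trail
    of every grant and every access check for spot-checking."""

    def __init__(self):
        self.grants = []
        self.audit = []   # (event, user, privilege, timestamp) tuples

    def grant(self, user, privilege, ttl_seconds, now=None):
        now = time.time() if now is None else now
        g = Grant(user, privilege, now + ttl_seconds)
        self.grants.append(g)
        self.audit.append(("grant", user, privilege, now))
        return g

    def is_allowed(self, user, privilege, now=None):
        now = time.time() if now is None else now
        ok = any(g.user == user and g.privilege == privilege
                 and g.expires_at > now for g in self.grants)
        self.audit.append(("check", user, privilege, now))
        return ok
```

Because every grant expires automatically and every check is logged, a stolen credential has a limited window of use and forensic analysis after a breach has a complete record to work from.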
Best practices for managing privileged accounts
How to get started
Consider the implementation of the following best practices
to improve the control and accountability over privileged
accounts.
Privileged users play a necessary role in every organisation, but
not all users require elevated credentials. A logical, enforceable
policy for defining the required privileges for each class of
users improves security by reducing the organisation’s attack
surface. Begin with a decision-making team that includes
representatives from all affected stakeholders, then:
■ Simplify – look for solutions that are cross-platform
and account for the reporting and auditing
needs of all stakeholders.
■ Discover and prioritise – understand who
has elevated privileges, and target your
most critical applications and users.
■ Implement – start with a single use case
to build internal procedures.
■ Enforce – policies without enforceable controls
do not address inadvertent errors or malicious
behaviour from inside or outside the organisation.
■ Expand – address internal and external users with
legitimate access to your data and systems.
Assemble cross-functional teams to address
privileged account management
Managing and monitoring privileged users is a task that
crosses functional boundaries. While many super users and
system administrators may reside in IT operations, normal
checks and balances would dictate that security groups
provide oversight. By including all stakeholders (including
vendors and contractors who connect into your network) in
the selection, implementation and management of privileged
user accounts, organisations also have the opportunity
to enhance the relationships between these groups, and
reporting and analytics needs can be identified early.
Deploy policies for controls and enforcement
It is clear that privileged user account policies should
be in place in all organisations. However, efforts may be
wasted without controls that enforce those policies. Not all
breaches of policy will be malicious, but inadvertent errors
can still disrupt systems and expose sensitive information.
A privileged account management strategy should include
provisions for exception handling and workflow to allow
users to perform their jobs in an unfettered manner, while
maintaining centralised control over critical assets.
Prioritise implementation by the consequences of
a breach
Often, the most difficult task in managing privileged user
accounts is deciding where to start. While beginning with end
user accounts is good, rollouts should be prioritised based
on the consequences of a breach. Factors to consider include
financial, reputational and regulatory costs to the business.
Prioritising privileged account management implementations
allows organisations to build their own best practices for
deployments, including communications with affected users.
End-user privileges are a good starting point to
address gaps
As seen in the survey results, many organisations do not
manage the privileges of any end users. A discovery exercise
to identify all the privileged accounts in the organisation
can provide a baseline. The population of unmanaged end
users is an ideal starting point at which to deploy a
single solution to address a very large attack vector.
Finally, monitor your program and policies. Use products
to discover how often users require enhanced privileges and
for what purposes. This will allow you to adjust and improve
policies based on empirical data within your organisation.
Reference
1. BeyondTrust, "Privilege Gone Wild" survey, 24 March 2015.
Further information
BeyondTrust is a global cyber security company dedicated to
proactively eliminating data breaches from insider privilege
abuse and external hacking attacks. More than 4,000
customers around the world rely on BeyondTrust Privileged
Account Management and Vulnerability Management
solutions to shrink attack surfaces and identify imminent
threats. The company’s integrated risk intelligence platform
presents a unique advantage to reveal critical risks hidden
within volumes of user and system data. This unifies IT and
Security departments, empowering them with the information
and control they need to jointly prevent breaches, maintain
compliance, and ensure business continuity.
A brave new world:
understanding the new EU data protection regime
Jonathan Armstrong, data regulation
advisor for Absolute Software and
technology lawyer at Cordery
THE CURRENT EU DATA PROTECTION LAW NOW SEEMS
positively Stone Age. It belonged, perhaps, to a different time.
It came into force in 1995 when the Internet was still in its
infancy. Back then about 1% of Europeans used the Internet.
Things have changed. Every two days we create more data
than we did from the dawn of civilisation until 2003.
The European Commission announced its long-awaited
proposals on what are likely to be viewed as drastic changes
to data protection law in Europe on 25 January 2012. The aim
of the proposals is to make EU privacy laws fit for the 21st
Century and they seek to both change the system and increase
the penalties for a breach. They also seek to introduce data
breach laws similar to those which exist in most U.S. states.
The regulation is still at draft stage and is not expected to
come into force before 2017, but businesses must start
preparing now for a serious regulatory upheaval. This
is not a fine-tuning of the law; it is a fundamental change.
Some of the major changes include:
■ Consumers having easier access to their data.
■ Consumers having a statutory right to
have their data deleted on demand.
■ Organisations having a legal obligation to notify
the authorities about data breaches as early as
possible and “if feasible” within 24 hours.
■ The mandatory appointment, for companies
with 250 or more employees, of a suitably
trained data protection officer.
What are the new penalties?
The new law is raising the stakes. There are increased
sanctions including fines of up to €100 million or
up to 2% of annual global turnover – whichever is
greater. This compares with a current maximum fine of
£500,000 under the UK Data Protection Act 1998.
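The "whichever is greater" rule is simple arithmetic, sketched below; the turnover figures used in the test of the rule are hypothetical examples.

```python
def max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Greater of EUR 100 million or 2% of annual global turnover,
    as the draft regulation described in this article provides."""
    return max(100_000_000.0, 0.02 * annual_global_turnover_eur)

# A firm turning over EUR 20bn faces up to 2% = EUR 400m,
# well above the EUR 100m floor; a EUR 1bn firm hits the floor instead.
```

For any firm with global turnover above EUR 5 billion, the 2% figure is the binding one, which is why the proposal alarms large multinationals in particular.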
Former EU Commissioner Viviane Reding has also said
that companies could be fined up to 0.5% of their annual
turnover if they charge users for Subject Access Requests
(SARs). She said that this sum should double if a firm refused
to hand over data or failed to correct bad information.
This is draft legislation with very heavy penalties.
What does the new law do?
The new law is much more specific, much more demanding
and much more onerous. The Commission says that the
new law tries to address the problems of big data and
social media, of BYOD and of the cloud. Whether it does
so successfully remains to be seen.
Where does it apply?
The new law will apply throughout the EU and to organisations
outside the EU that are active in the EU market and offer
services to EU citizens. This means, for example, that a
US software company with all its offices in the US that
handles the data of EU citizens can be investigated,
fined and even prosecuted by an EU regulator.
A statutory right to be forgotten will be introduced
The new EU data protection regime also includes
a statutory right to be forgotten. It says:
“Any person should have the right to have personal data
concerning them rectified and a ‘right to be forgotten’ where
the retention of such data is not in compliance with this
Regulation…this right is particularly relevant, when the data
subject has given their consent as a child…and later wants
to remove such personal data especially on the internet.”
The Google “right to be forgotten” case has shown the vast
numbers of people who may apply to have their data wiped.
Companies will need to be much more organised
in handling deletion requests from potentially
millions of people across the globe.
Part of the solution for large organisations will be
the ability to manage data across their device
estates to ensure rapid deletion. In these days of
mobile working, sophisticated software and systems
are likely to be needed to ensure that data across the
enterprise remains secure, accurate and up to date.
A single set of data protection rules
The current law is a Directive and therefore it was left
to EU member states to decide how to bring it into
force under their own domestic legislation. In many
respects, it was this which led to inconsistency.
The current proposal is that the new law will be a
Regulation. This means that it is directly applicable
and Member States will not be required to pass
their own laws in order to bring it into force. This
should mean one law in force across the EU.
One important change is that the requirement to
register data collection and transfer in each country may
be removed. Organisations will instead deal with a lead
country that will regulate their activity across the EU.
Investigations, however, are likely still to be conducted
by the regulator in the country where the complainant is based.
Further responsibilities under the new law
There are two further heavy burdens under the new
law. The new law says that communication should be
made to the data subject “when the personal data
breach is likely to adversely affect the protection of
the personal data or the privacy of the data subject.”
There is a caveat to this if the company can show
that they have in place “technological protection
measures that shall render the data unintelligible to
any person who is not authorised to access it.”
This means that if the data of 100,000 customers is lost
on an employee's iPhone, the company will have to tell
those 100,000 customers that their data may have been
compromised. The legal blowback from 100,000 irate
customers could be considerable.
The brand damage, litigation and media reporting of an
incident would all be significant. If a company has, for
example, software in place to prevent this data on the
iPhone from being intelligible to those who might seek to
exploit it then the company could avert the disaster.
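The notification logic described above reduces to a small decision rule under the draft Regulation's caveat: encrypted (unintelligible) data need not trigger notification. This is a hedged sketch with invented parameter names, not legal advice or any real compliance API.

```python
# Sketch of the breach-notification decision under the draft
# Regulation's caveat: "technological protection measures that shall
# render the data unintelligible" exempt the company from notifying
# data subjects. Field names are illustrative.

def must_notify_subjects(records_affected: int, data_encrypted: bool) -> bool:
    """Return True if each affected data subject must be told of the breach."""
    if records_affected == 0:
        return False
    # Encrypted (unintelligible) data falls within the caveat.
    return not data_encrypted

# A lost employee iPhone holding 100,000 customer records:
print(must_notify_subjects(100_000, data_encrypted=False))  # → True
print(must_notify_subjects(100_000, data_encrypted=True))   # → False
```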
Added to this, if there is a huge data breach
concerning millions of data subjects all across the EU,
the company will have to deal with each supervisory
authority, each language, each notification requirement
in every country where there is a data subject affected
by the breach. Legal advice will be very important
particularly in the first 24 hours after the breach.
Another important feature of the new law is the
introduction of a corporate data protection officer.
This is a mandatory appointment with specifically
defined responsibilities for organisations with
more than 250 employees. This means that all
over the globe companies of this size will need to
make someone responsible for data protection.
Conclusion
We know that the new regime will bring considerable
responsibility and sanctions for companies that
handle data. We must not forget that it will also
bring considerable opportunities for those that help
them avoid these pitfalls. This law will be radical.
There are two sorts of companies. The ones who
will make changes to their data protection policies
once the law comes into force and those who are
already making changes to allow for the law coming
into force. The responsibilities are so vast that it is
important to get the correct advice and to act now.
Companies should act now to:
■ Draft data deletion policies to comply with
the statutory right to be forgotten.
■ Take steps to appoint and train the
right data protection officers.
■ Look at software solutions to help manage the risks.
■ Draft policies to comply with the breach
notification requirements.
■ Train employees on the risks and how to reduce them.
There will be significant challenges to comply with the
new law. Uncertainties remain. What is certain, however,
is that organisations will have to start now to have a
chance of complying when the new laws come in.
Further information
Absolute Software is the industry standard for persistent
endpoint security and management solutions for
computers, laptops, tablets, and smartphones. As IT
security and management requirements have converged,
we’ve responded by providing our customers with a
consolidated experience, allowing them to manage and
secure all of their endpoints regardless of user, location,
or form factor. This singular view of the IT landscape is
extended to include IT processes and infrastructure with
our IT service management offering.
Security
Ahead in the cloud:
how to keep your data secure
By Kevin McLoughlin
Twinfield UK Head of
Strategic Partnerships and
Product Wolters Kluwer
WITH COMPANIES INCREASINGLY
turning to cloud-based technologies
to find productivity and efficiency
gains, it’s hardly surprising that
guidelines around the use of outsourced
solutions continue to be updated and
strengthened. In recent years there
has been a focus on companies’ use of
third-party technologies – particularly
around the use of externally-hosted
software as well as regulations around
data security. However, it’s not just
a case of preserving confidentiality
and privacy with customer data; it’s
about ensuring that data governance
standards and processes are built
into crucial functions such as
financial reporting and accounting.
How secure is secure?
Although no cloud provider can claim
to be 100% secure, the cloud is now
widely recognised as being no less
secure than much software located
within an organisation’s premises –
and more secure in many cases. For
example, what happens in the case
of a fire or flood? Data stored in an office or on a local
server is much more likely to be destroyed beyond
recovery than data stored in the cloud.
Responsibility for data security rests
with the client organisation, not the
vendor, so businesses should do
thorough research into third-party
suppliers before adopting any cloud
strategy. Any BDO-certified supplier
is thoroughly – and independently –
audited four times a year to ensure
processes and systems are sufficiently
robust and meet ISO requirements.
Location, location, location
Despite misconceptions that data is
‘floating around in cyberspace’, it is
crucial that data is held in several
locations, allowing the use of active/active
centres so that an immediate
switch can be made should an incident
occur. Furthermore, an additional
location should be used for regular
data back-up in case of an incident
or breach. According to a recent
survey by iGov, 81% of businesses
suffered a security breach in 2014*,
therefore the location of data has
never been more important.
Only two parties should have
access to your clients’ data: you
and your clients; no employees
of the cloud vendor should have
permission to access customers’
data. Not only is it important to
preserve confidentiality and privacy
with customer data, it is also
about ensuring that data governance
standards, such as ISO 27001
(Information Security Management)
are built into crucial functions such
as reporting and accounting.
Security as standard
Protection, privacy, back-up and
encryption are all processes and
systems that cloud providers should
have as standard. Security audits
to help with compliance, such as
ISO 9001, SAS 70, ISAE 3402 and PCI
(Payment Card Industry) as well as
planned hacks on your data are vital.
Cloud providers should repeatedly
back up all their clients’ data. In addition
to regular back-ups, data should be
stored and available on request for a
minimum of seven years and should be
covered under European legislation.
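The seven-year availability point can be illustrated with a small retention check. This is a simplifying sketch (a year is approximated as 365.25 days, and real retention rules would come from legal advice, not code):

```python
# Sketch: checking that backed-up data is still within the seven-year
# retention window mentioned above.
from datetime import date, timedelta

RETENTION_YEARS = 7

def within_retention(backup_date: date, today: date) -> bool:
    # Approximate a year as 365.25 days for the sketch.
    return (today - backup_date) <= timedelta(days=int(RETENTION_YEARS * 365.25))

print(within_retention(date(2010, 1, 1), date(2015, 4, 1)))  # → True
print(within_retention(date(2005, 1, 1), date(2015, 4, 1)))  # → False
```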
Full storm ahead
The benefits of the Cloud are clear – it
can help to empower businesses by
enabling them to access their data from
any location. Costs can be reduced,
as there is no longer a requirement to
maintain your IT system – upgrades and
new hardware or software are no longer
needed. More collaborative and flexible
working practices, along with automated
processes harnessed by the cloud, can
help to improve communication and
increase productivity. Furthermore,
your cloud provider can scale up or down
your system and storage needs should
your circumstances change, so that you
can remain focused on the business.
Security should naturally be at the
forefront of an organisation’s mind –
after all, failure to comply with data
standards can expose customers’ (and
businesses’) confidential data to costly
risk, often damaging an organisation’s
reputation. By partnering with a trusted
cloud provider, you can rest assured
that your data is safe, whilst you focus
on getting ahead in the cloud.
*https://www.gov.uk/government/uploads/
system/uploads/attachment_data/file/307297/
bis-14-766-information-security-breachessurvey-2014-executive-summary-revision1.pdf
Further information
Twinfield UK, part of global
information business Wolters
Kluwer, is in the business of cloud-based accounting, an area where
there is a fundamental need for
secure online collaboration between
accountants and their clients.
www.twinfield.co.uk
www.wolterskluwer.co.uk
www.cch.co.uk
Security
The five pillars of the
Cyber Essentials Scheme
Your foundations to good security,
explained by David Parkinson
MUCH HAS BEEN WRITTEN ABOUT THE UK GOVERNMENT’S
new Cyber Essentials Scheme and why organisations should
take the certification and adhere to the standard. The scheme
is to be applauded for its effort to promote and roll out a basic
standard of cyber security not just to those businesses
looking to bid for certain government contracts, but to all
organisations. Within peer groups and supply chains, all
organisations should be able to demand a level of cyber
hygiene from organisations they share information with,
and this is where I think the strength of a scheme like
Cyber Essentials lies. As CREST (Council for Registered
Ethical Security Testers) describes, accreditation at
Cyber Essentials or Cyber Essentials Plus should be
seen as a snapshot at the time of assessment, but I
would anticipate that further variations addressing this
will be developed once the current schemes have gained
traction beyond the early activities.
Underpinning Cyber Essentials are five key controls
– concise points that can be described in simple terms
and that can be addressed using widely available
tools. That availability is of course important if Cyber
Essentials is to be adopted extensively, though some
thought should be put into the selection of those tools
around their effectiveness, implementation, ongoing
maintenance, interoperability and of course cost.
Boundary firewalls and internet gateways
According to the Cyber Essentials Scheme: Summary
document, Boundary Firewalls and internet gateways are
“devices designed to prevent unauthorised access to or
from private networks”. What may once have been a simple
demarcation point, the perimeter of a private network is
under ever increasing scrutiny as boundaries become complex
and private networks are extended into cloud resources. This
then requires consideration in an organisation’s assessment:
whether any contracted external services are to be included
in the scope of the assessment and under whose control
(and responsibility for compliance) they sit. When choosing
a technology vendor to work with in this area, it is becoming
increasingly important to work with a vendor that has a
clear product set that can address cloud deployments.
While the emphasis of Cyber Essentials is placed on the
firewalling capabilities of gateways, to be fully effective
they should also prevent “unauthorised access” from new
and constantly evolving attack vectors rather than simply
providing stateful inspection of network connections. In
other words, these should be platforms on which multiple
layers of protection can work together to stop attacks
launched through email campaigns, drive-by infections
of innocent (or curious) web users, botnet activity and,
of course, the as-yet-unknown attacks which we cannot
rely on signature-based technologies to stop.
Secure configuration
The second key Cyber Essentials control states that
systems should be configured “in the most secure way
for the needs of the organisation”. Clearly this indicates
that care should be taken in the implementation of the
protections to ensure that some basic system hardening
takes place – for example default passwords on network
equipment are replaced with strong ones, unnecessary
user accounts are removed or disabled and also that a local
firewall be installed and configured on computing devices.
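As a rough illustration of such a hardening check: the sketch below flags default passwords, unnecessary accounts and a missing local firewall on a device described as a plain dict. The credential list, field names and findings text are invented for the sketch, not taken from the Cyber Essentials documents.

```python
# Illustrative hardening audit in the spirit of the Secure
# Configuration control.
DEFAULT_CREDENTIALS = {"admin", "password", "changeme", ""}

def hardening_findings(device: dict) -> list:
    findings = []
    if device.get("password") in DEFAULT_CREDENTIALS:
        findings.append("default or empty password in use")
    for account in device.get("accounts", []):
        if not account.get("required", False):
            findings.append(f"unnecessary account: {account['name']}")
    if not device.get("local_firewall", False):
        findings.append("no local firewall configured")
    return findings

router = {"password": "admin",
          "accounts": [{"name": "guest", "required": False}],
          "local_firewall": False}
print(hardening_findings(router))
```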
The implementation of such a firewall, installed such
that it can block unapproved requests, could be more fully
extended for example to control which applications are
approved to run on the organisation’s workstations. Further,
it could include mobile computing devices under a Bring
Your Own Device (BYOD) scheme. As these devices may
well store and process sensitive information and connect
to the Internet, they are well within the scope of Cyber
Essentials assessment and should ideally be subject to the
capabilities of remote wipe and sandboxing technology to
keep personal and work data separated. While outside of
the remit of Cyber Essentials, a diligent additional measure
that might also be added with this control in mind would
be a form of document security: applying encryption to a
document rather than encrypting the place where it was
stored, and the application of usage rights for different users.
Access control
The Cyber Essentials control of Access Control relates to
not only who should have access to systems but also that
they should have access at “the appropriate level”. Basic
points are given in the Cyber Essentials summary document
including the need for users to authenticate themselves
using strong passwords before accessing systems and
applications, though it is also worth considering at this
point the use of a two factor authentication technology.
There can often be ongoing operational benefits
to implementing a something you know, something
you have system (usually a physical or mobile-app
authenticator generating a One Time Password). For
instance, help desk calls relating to strong password
resets can be dramatically cut and Single Sign On (SSO)
projects can be implemented with far greater trust.
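For illustration, the one-time password behind many such authenticators follows the TOTP construction (RFC 6238): an HMAC over the current 30-second time step, dynamically truncated to six digits. This stdlib-only sketch shows the mechanism and is not a substitute for a vetted authentication product.

```python
# Minimal TOTP (RFC 6238) sketch using only the standard library.
import hmac, hashlib, struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    counter = struct.pack(">Q", unix_time // step)            # 8-byte big-endian time step
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                 # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Client and server produce the same code while in the same 30s window:
print(totp(b"shared-secret", 1_430_000_000))
print(totp(b"shared-secret", 1_430_000_000) == totp(b"shared-secret", 1_430_000_005))  # → True
```

The design point is that the server and the token never exchange the code in advance; both derive it from the shared secret and the clock.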
In addition to user authentication we should also consider
whether devices should be allowed to join our wired and
wireless networks, and what access they should have to other
devices when they do join the network. A good Network Access
Control (NAC) product should offer a means of segmenting a
network that focuses on a device’s access to other systems, as
an alternative measure to just focusing on the user. In BYOD
projects or where guest and contractor access is common,
a smart NAC system can easily place allowed devices onto
appropriate logical segments of the network where the scope
of their access is limited to that which is really needed.
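A toy sketch of that segmentation decision follows; the segment names and device categories are invented, and a real NAC product would key off certificates, posture checks and user identity rather than a label.

```python
# Illustrative NAC policy: place a joining device onto a logical
# network segment based on its category.
SEGMENTS = {
    "corporate-managed": "vlan-staff",
    "byod":              "vlan-byod",
    "guest":             "vlan-guest",
}

def assign_segment(category: str) -> str:
    # Unknown devices get the most restricted segment.
    return SEGMENTS.get(category, "vlan-quarantine")

print(assign_segment("byod"))     # → vlan-byod
print(assign_segment("unknown"))  # → vlan-quarantine
```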
Malware protection
When the Cyber Essentials Requirements document
discusses the need for anti-malware software it is clear in
its requirement for such software to at least be installed
on computers which are exposed to the Internet. We should
also remember that BYOD projects are in scope for Cyber
Essentials assessment and provision should be made to
include anti-malware software for these devices too.
This anti-malware software should be configured
to scan files automatically upon access and to scan
web pages as they are loaded. It must be configured to
update itself automatically (both the application itself
and its signature files) and should implement a website
filter, or blacklist, of known malicious sites.
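The blacklist idea can be sketched in a few lines: check a URL's hostname against a list of known-bad domains before allowing the request. The blocked domains below are made-up examples.

```python
# Sketch of a website filter ("blacklist") of known malicious sites.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"malware.example", "phish.example"}

def allowed(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Block the domain itself and any subdomain of it.
    return not any(host == bad or host.endswith("." + bad) for bad in BLOCKED_HOSTS)

print(allowed("https://www.bbc.co.uk/news"))           # → True
print(allowed("http://payload.malware.example/a.js"))  # → False
```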
This sounds like a comprehensive strategy for protecting
workstations, and it can be considered alongside the
requirement for a personal firewall from the Secure
Configuration control and the probable requirement to
cover mobile devices in a similar manner. Rolled together,
there is a clear benefit to administering all of these
elements through a single “pane of glass” and a single
deployment of “endpoint” technology, not to mention the
benefits of simplified training and procurement.
Patch management
Cyber Essentials states a key control requiring
software running on computers and network devices
to be kept up to date with the latest security
patches supplied by vendors. Even in fairly small
organisations the task of patching operating systems,
applications, utilities and auxiliary programs such as
Java and Acrobat can soon become unmanageable.
Tying a patch management system in with the rest of
the controls aimed at computing endpoints not only allows
efficiencies to be made over a piecemeal approach but can
also achieve increased security through simplified policy setting
in a single console. Combined reporting and alerting across
these controls can also give a much clearer picture of the state
of the computing estate within scope of the assessment.
Care should also be taken to ensure that selected firewalls,
Unified Threat Management (UTM) appliances and other
network devices are capable of having their program code
and protection service signatures updated simply, and
most importantly reliably. Where these devices are mission
critical to an organisation, a highly available configuration
should be considered so that managed patch rollouts can be
confidently deployed frequently rather than being saved for
infrequent major release upgrades. It’s worth noting that the
Cyber Essentials guidance is that security patches should be
installed within fourteen days of their release by the vendor.
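The fourteen-day guidance lends itself to a simple deadline check. This is a hedged sketch (real patch management tools track far more state than a release date):

```python
# Sketch of the fourteen-day patch window quoted above: given a vendor
# release date, compute the install-by date and whether a patch is overdue.
from datetime import date, timedelta

PATCH_WINDOW = timedelta(days=14)  # Cyber Essentials guidance

def install_deadline(released: date) -> date:
    return released + PATCH_WINDOW

def overdue(released: date, today: date, installed=None) -> bool:
    """True if the patch was, or is being, applied later than 14 days after release."""
    applied_or_now = installed if installed is not None else today
    return applied_or_now > install_deadline(released)

print(install_deadline(date(2015, 4, 1)))                  # → 2015-04-15
print(overdue(date(2015, 4, 1), today=date(2015, 4, 20)))  # → True
print(overdue(date(2015, 4, 1), today=date(2015, 4, 10)))  # → False
```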
Summary
The Cyber Essentials guidance presents a narrow program
for improving basic cyber security for organisations to
guard themselves specifically against phishing and hacking
attacks which require low levels of attacker skill, and for
which tools can be found easily online. It doesn’t cover
risk management, which is fundamental to managing
information assets, but instead defines basic controls which
organisations can implement to protect their information.
By considering the implementation of these controls
together carefully, however, I would argue that greater
security can be gained than by simply addressing each
control separately. If we can recognise that the Cyber
Essentials scheme is the beginning of a process for many
organisations then it should provide a good starting point to
then consider additional controls alongside a rolling program
of staff training and constant review. These might include
application testing and the use of specific web application
firewalls (WAFs), protection from the increasing number of
Distributed Denial of Service (DDoS) attacks, and measures
to safeguard data from, and provide rapid recovery
from, the rise of ransomware.
Further information
David Parkinson works with Wick Hill’s vendors and reseller
partners to raise awareness of cyber security issues and
to put together solutions to the business requirements of
their collective customers. Drawing on a twenty year career
in IT and Security, David has experience as IT Manager,
Technical Consultant, VAR and Distributor. Wick Hill Group
provides secure network infrastructure products and
services to Enterprise, Data Centre and SME customers
alike with its partners from offices in the UK and Germany.
Vendors are supported with a full technical
training program, technical support and a full range of
commercial services for its reseller partners.
Security
The security challenge
of the mobile workforce
By Samik Halai, Integral Memory
GLOBAL ORGANISATIONS ARE EMBRACING NEW FLEXIBLE
working practices which have heralded a new era of the
‘always-on’ mobile workforce. Enterprises of all sizes are
reaping the benefits of employees that have constant
access to their work through mobile devices on a 24/7
basis. This connectivity has been proven to deliver greater
levels of productivity, faster response times and a stronger
management network. The momentum behind these changes
has increased over recent years but what are the implications
for data security and what measures work for today’s CIOs?
In reality, it is not the organisation alone that seeks the
benefits of workforce flexibility. Employees themselves
expect to use laptops, tablets and smartphones as a
seamless transition between their work and private
life. In a recent survey, 65% of mobile workers¹ were
found to be using a tablet, a practice which is heavily
biased toward younger workers. Given this profile,
the trend toward ultra-mobile devices is likely to
eclipse desktop and even laptop usage.
Growing economic globalisation is also an influential
factor. Differing time zones between colleagues and
business partners promote an ‘always-on’ culture
where mobile access to email and other resources
offers a critical edge in competitive business markets.
Ultimately the flexible workforce, with ‘always on’ access
to corporate resources and data, will be universally
acknowledged as a pre-requisite for company growth.
The impact on data security of this expansion of mobile
devices is a major concern facing businesses of all sizes.
The challenge is to retain control whilst offering employees
a mobile user environment which does not create barriers.
The user experience is a fundamental issue, which calls for
a data security regime that permits the extended use of
differing formats of devices across a range of operating
systems. However, the dangers of devices like tablets are
self-evident, and private businesses and public sector
organisations are increasingly torn between meeting the
needs of remote workers and
their duty to protect company
data from a rising wave of
security breaches and threats.
The reality that 1 million laptops
were reported missing in the UK
in 2012² ensures that this issue
will remain at the top of the
corporate agenda in the years ahead.
Effective encryption:
hardware vs software
It follows that the ideal solution
effectively manages the risk of data breaches from mobile
devices whilst maintaining invisible protection and a
seamless experience for the user. Stories regarding lost
laptops reported in the media concern security managers
and workers alike. Reassurance that effective security
measures exist on their device is a key user concern.
Essentially CIOs are presented with two stark choices
in their efforts to protect the company’s mobile devices –
encryption software packages or encrypted hardware devices.
Software encryption
Software encryption is often the low-cost route to encrypting
data, if the initial cost of the package is taken in isolation.
However, additional costs may apply in the medium term as
further licensing and support fees (quite common
if you are a business user) are incurred. As with on-board
operating systems, updates to the software are necessary
to ensure that the security is maintained throughout the
software layers. Given that there are other pieces of
software installed on the machine which are additional to the
operating system, compatibility across the various vendors
requires continuous updates for the encryption software.
Perhaps the key negative factor of software encryption
is that it shares the host computer resources to perform
encryption and decryption of the data. Effectively the
authentication takes place in the system memory and
although this can be ‘on the fly’ it is still using memory
and processor resources to complete these actions. The
natural result is slower performance. Some software
encryption packages utilise a TPM chip to store the keys
but the process is still carried out in software utilising the
same system resources, slowing down the machine’s
ability to perform other tasks simultaneously.
It is additionally important to be aware that software
encryption can be corrupted or negated. Software
running under an operating system can be vulnerable
to viruses and open to other attacks through a virtual
‘back door’ through the interface between the host
machine’s operating system and the actual software
that is designed to protect the system itself.
Hardware encryption
Hardware encryption can initially be the more expensive
option if one only considers the upfront cost and ignores
the total cost of ownership (TCO). Most hardware encryption devices do not incur
additional costs, such as licensing or support fees, going
forward; in essence, a one-off purchase covers the
lifetime of the product. As per the name, the hardware
encryption is authenticated and processed within the
hardware of the device itself, using an on-board dedicated
processor. Essentially, it operates independently and
does not utilise system resources such as the host system
processor and memory. Leaving these resources free often
gives the user a better experience, whilst still maintaining
a high level of protection. Hardware encryption devices
encrypt and decrypt data instantly and the majority are
unnoticeable to the user and offer 100% mandatory
encryption of all data. Hardware encryption does not
install itself over another software system (such as the
operating system), so it has zero footprint and requires no
software installation on the host system; this makes it less
vulnerable to viruses or other attacks of that nature.
The future for all companies is mobile. The drive toward
remote workforces is coupled with a need for all employees to
be ‘always on’ through mobile devices. Ultimately this will bring
extra strain to bear on CIOs and data security managers who
already have to contend with a rising tide of external threats.
The way forward for any informed mobile security strategy is
to provide effective encryption on the devices that
employees prefer to use: laptops, tablets and smartphones.
It is also essential that these encryption methods, whether
software- or hardware-based, do not present an obstacle to
the user experience. Delayed boot-ups, constant upgrades
and slow systems must be avoided if encryption measures
covering a growing mobile infrastructure are to succeed.
References
1. iPass
2. The VAIO Digital Business 2013 report by Sony.
Further information
About Integral Crypto
The Integral Crypto range of 256-bit AES Hardware
encrypted drives is trusted by a rapidly growing customer
base, across both private and public sectors – worldwide.
With growing awareness of the dangers of data loss, and
wider penalties aimed at punishing poor data
security, Crypto drives have met with spontaneous
demand. All Crypto models carry highly-regarded FIPS
140-2 certification. Importantly, the technology is BIOS-independent,
making it the only SSD available on the market
that utilises its own program to activate and manage the
encryption and enforcement of usernames and passwords.
Essentially a dedicated processor is located ‘on-board’ the
SSD and manages the encryption/decryption tasks without
sharing resources with the host system.
Telephone: 0208 451 8700
Email: [email protected]
Web: www.integralmemory.com
Technology & Innovation
Testing, quality and the quest
to meet customer expectations
By Darren Coupland,
Vice President – Testing Services,
Sogeti UK
IN THE NEW ‘ALWAYS ON’ AND DIGITAL WORLD, USERS’
expectations of a high quality, seamless digital customer
experience are becoming increasingly elevated. Because
of this, CIOs are often caught between
traditional business IT and having to support new
technologies – such as Social Media, Mobility, Analytics
and Cloud (SMAC) and the Internet of Things – as well
as the growing importance of cybersecurity, in order to
meet growing needs from the CEO and the wider business
as it becomes more digitally capable. Technology is
changing and so the CIO must adapt to these demands.
Moving to digital can provide an organisation with a
wealth of benefits in the areas of customer acquisition,
engagement and retention; if we consider internal
users, this also encompasses knowledge
management, productivity and improved motivation.
However, with user tolerance for poor performing
applications at an all-time low, and Social Media providing
a forum for instantaneous feedback which can easily
go viral, quality must increase. The effect of not taking
this on board can be expensive – not only in terms of
cost but also reputational damage of your brand and
organisation. Now is the time to address this by focusing on
increasing quality, security and improving the QA function,
leveraging new technologies and processes to do so.
The latest World Quality Report¹ shows us that the
proportion of the IT budget allocated to QA and Testing grew
at a record breaking rate from 18% in 2012 to 26% in 2014
in order to support the move to digital. However, spending
alone is not enough – there are a number of challenges
to consider and overcome which I will explore shortly.
The testing landscape
Here’s a snapshot of the testing landscape supported
with findings from the latest World Quality Report,
showing how digitalisation affects the industry and
setting the backdrop for CIOs who want to support
the business’ drive for competitive advantage.
I already mentioned that QA and Testing budgets have
increased, and it seems that digital transformation is also
driving a shift in the way they are spent. New development
projects that support the digital strategy now account for
more than half of spending, with only 48% going to BAU,
maintenance etc. Within this, there has been increased
adoption, support and delivery of digital (SMAC) technologies:
Social now accounts for 15% of new development work,
Mobility 17%, Analytics and Big Data 40% and Cloud 27%.
Other trends highlighted in the report include an increased
use of tools and a change in the skills required of test
resources, which must become more technical (for example,
the increased prominence of roles such as Software Developer
in Test), along with the use of agile and automation by default.
Spend on tools and personnel also increased year on year
between 2013 and 2014.
Processes and mind-sets are evolving to enable fast,
continuous delivery. In fact 93% of respondents say that
they are experimenting with or fully using agile methods
which help to increase flexibility and fast responsiveness.
All of this opens up new challenges for the CIO and testing
team...
A whole world of new challenges
The move to digital is being embraced by many organisations
and government departments, where “Digital by Default”
has become the mantra. This is great for consumers and
employees, but it creates additional challenges for QA and
Testing teams, mainly due to the increased complexity
of test environments and the lack of a second chance when
it comes to quality. The whole buying, using and customer
service process must be delivered and made available
perfectly, wherever the user chooses to access it. To top
it all, it needs to be fast: there is no customer service
agent who can cover up poor system performance by having a
chat about the weather or last night’s television. This would be
difficult enough on one device, but with the explosion of
variants of mobile phones, tablets, e-readers and smart
devices available this can seem extremely daunting.
According to the World Quality Report, 53% now
carry out mobile testing, with a focus on security,
performance, validating the customer experience,
compatibility and regression. Yet there appear to be
issues around the tools, devices and processes needed
to support this relatively new field, issues which
also cross into traditional application testing.
Testing tools have improved over recent years, though
now the proliferation of tools available has created new
challenges in its own right. Do you pay for the best tool
on the market or go down the open source route? And
do your testing resources have a high enough degree
of technical expertise to use them effectively? A tool
is only as good as the person using it, after all.
Along similar technology lines is the use of cloud:
migration has gained momentum, but many still
cite concerns around security as a key reason for
not moving across. I will touch later on why the cloud’s
benefits far outweigh any issues you might see.
Security on its own is also a key factor in terms
of securing applications and data against
breaches, particularly on mobile and
digital devices, which typically have fewer security
controls than would be found on a desktop.
A real challenge though has to be how to deliver digital
projects fast enough to keep up with business demand.
Key troubles in this area include the adoption of agile
and DevOps, plus how to spin up environments fast
enough to still be able to test early in the development
lifecycle. Agile is no silver bullet and though methods
are widely adopted, 61% still have trouble deciding
the right methodology for their business.
So how do you address these pains and ensure IT and
testing are aligned to the organisation’s digital goals?
The action plan – fight your corner
Do not fear. The above might seem impossible to
overcome, but it isn’t. I believe there are two different
angles to consider here:
1. Tools and technology
Whilst tools do create challenges of their own, they can be
enormously useful if you have the correctly skilled resources
in place. Automation and accelerators are great for speeding
up the testing process, and if your team can define an
adaptable automation framework to address the rising need
for regression and compatibility testing across devices,
then you will begin to reap the benefits in the short term.
Using Service Virtualisation with test environments
will allow you to test earlier, which will reduce the number
of costly defects you find later in the process, whilst the
cloud will provide you with additional capacity for your
environments and the option to spin them up, create
templates, and use them almost instantaneously. Gone are
the days of waiting a long time for a new environment.
2. Processes and mind-set
Though it does have its own problems, working in an
agile environment is beneficial, with two caveats: you
must find the right agile methodology for your business
– whether this is Scrum, Kanban or a tailored mix of the
many available options – and you must be willing to
undergo a shift in mind-set from top to bottom. Without
these two things you will experience difficulties. The
same goes for DevOps, Continuous Integration and so on.
Introducing thorough mobile and omni-channel testing
– whether in-house or with a third-party specialist – is
essential for delivering the desired levels of performance
and functionality, and measuring the customer experience
is a must. Ensure you have the ability to listen to your
users and make changes when they find a problem –
otherwise you will lose them. Crowd testing can be a great
way to achieve sufficient test coverage and to validate the
customer experience. In addition, notifications which
prompt user interaction (for example beacon technology,
where a user opens an app in store and a notification is
sent to the device to encourage a purchase) are a growing
trend which needs to be considered.
Finally, run security testing during QA or within the
production environment. Penetration testing right at the
end of the development lifecycle is no longer sufficient.
To avoid security breaches and reputational damage
similar to that experienced recently by Sony and Apple,
you must test at the application layer too.
So CIOs can turn digital transformation to their
advantage and help the business outstrip its
competitors. Leverage tools, understand new
technologies, focus on a multi-channel customer
experience, work in a flexible manner and, above all,
develop a deep understanding of how these elements
can both align with and drive the main objectives of the
enterprise.
Reference
1. Published annually by Capgemini, Sogeti and HP
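The Service Virtualisation idea discussed above can be illustrated with a short sketch: a lightweight stub stands in for a backend dependency that is not yet available, so functional tests can run early in the lifecycle. This is a minimal, hypothetical example using Python’s standard library, not any particular product’s API; the endpoint and payload are invented.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubPaymentService(BaseHTTPRequestHandler):
    """Stands in for a real payment backend that is not yet built."""
    def do_GET(self):
        # Canned response: enough for front-end tests to proceed.
        body = json.dumps({"status": "AUTHORISED", "ref": "TEST-001"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

def start_stub(port=8099):
    """Run the stub on a background thread and return the server handle."""
    server = HTTPServer(("127.0.0.1", port), StubPaymentService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_stub()
    with urllib.request.urlopen("http://127.0.0.1:8099/authorise") as resp:
        print(json.load(resp)["status"])  # AUTHORISED
    server.shutdown()
```

Because the stub answers instantly and deterministically, regression suites can run against it long before the real service exists, which is the point being made about finding defects earlier.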
Further information
Sogeti is a leading provider of professional technology
services, specialising in Application, Infrastructure and
Engineering Services and delivering cutting-edge testing
solutions. OneShare is our unique cloud platform and
toolset for managing development and test environments
and supporting agile, CI and DevOps. Agile services include
adoption and roadmap workshops, development, testing and
training. Sogeti Studio for mobile testing provides access
to experienced onshore resources and regularly refreshed
devices, on demand. Our solutions are innovative, scalable,
customisable, and designed to benefit your business. For
more information please call us today on
+44 (0) 20 7014 8900, or email [email protected].
Technology & Innovation
The CIO route
to business hero
By Derek Britton,
Director, Micro Focus
IN THE LAST FEW YEARS, CIOs
have been under pressure to escape
the technology trap and become
‘business heroes’. While the sentiment
is widely accepted, achieving it is no
mean feat. It requires a balancing act
between integrating new, disruptive
technologies that deliver innovation,
while continuing to extract maximum
value from core IT assets and managing
operational challenges. This must
be managed while also navigating a
changing landscape of C-suite roles
and responsibilities, which is confusing
the strategic impact of the CIO.
But help is at hand in a pragmatic,
practical guise: IT modernisation can
act as an effective bridge that helps
the CIO tackle these hurdles and
reclaim their role as business hero.
Integrating disruptive
technologies
According to Gartner, “CIOs face the
challenge of straddling the second
era of enterprise IT and a new, third
“digitalisation” era.” Cloud, mobile,
BYOD, big data, virtualisation and the
ever increasing demands of the end
user are now must-haves. IDC predicts
nearly one third of total IT spend will
focus on these new technologies. Many
commentators agree on the continued
expansion and proliferation of mobile
technologies and the rising belief
that, in a maturing digital market, this
is now the ‘age of the application.’
CIOs are tasked with moving their
business into the new IT era and digital
world to capitalise on innovation
strategies. It means their organisation
must more easily ‘consume’ core
IT services, both for employee and
customer purposes. The rub is that
many of these services rely on trusted,
longstanding applications that typically
sit on a mainframe, making their
integration and alignment with new
disruptive technologies vital if both
investments are to be maximised. An
application modernisation strategy
can turn the mainframe into an
enabler of disruptive technologies,
helping businesses take advantage
of mobile, BYOD, big data and the
cloud, while delivering a solution
proven to be cost effective.
With new channels being added
to IT environments to enable
mobile and internet capabilities,
CIOs are also facing increasing pressure to support new
devices that can access applications held on core systems
such as the mainframe. The problem is that text-based,
function-key 3270 applications, referred to as ‘green screen’
apps, have outdated user interfaces that turn users off.
Research* has uncovered the full extent of this issue.
Of the 590 senior IT decision-makers in organisations
with mainframes that have green screen applications,
more than half didn’t feel that their applications – on
which their business processes depend – were doing
‘a good enough job.’ The survey highlighted the need
for applications to improve drastically or businesses
could face losing out to the competition. By deploying
a modernisation strategy, such apps can enable new
devices to ‘plug in’ to the current infrastructure and
make new mobile and BYOD implementations a more
viable, low cost and low risk prospect – without the need
to code. With mobile, web and desktop connectivity,
access becomes a 24-7 business opportunity.
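Commercial modernisation tools do this ‘plugging in’ without hand-coding; purely to illustrate the underlying pattern, the sketch below wraps a hypothetical fixed-width ‘green screen’ record behind a JSON interface that a mobile or web client could consume. The record layout, names and values are invented for illustration.

```python
import json

def legacy_account_lookup(acct_id: str) -> str:
    """Hypothetical stand-in for a 3270 'green screen' transaction that
    returns a fixed-width text record, as terminal applications often do."""
    # Columns: account id (8 chars), name (20 chars), balance in pence (10 chars).
    return f"{acct_id:<8}{'A N OTHER':<20}{125000:>10}"

def modern_account_api(acct_id: str) -> str:
    """Wrap the legacy record in a JSON payload a modern app can consume."""
    record = legacy_account_lookup(acct_id)
    return json.dumps({
        "account": record[0:8].strip(),
        "name": record[8:28].strip(),
        "balance_pence": int(record[28:38]),
    })

print(modern_account_api("GB001"))
```

The legacy routine is untouched; only a thin translation layer is added, which is why this style of modernisation is regarded as lower risk than rewriting the application.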
Tackling operational challenges
The growing complexity of the new IT world, together with the
demand for immediate results from business stakeholders
means CIOs must find a way to efficiently tackle the day-to-day operational challenges, freeing up time to deliver
the volume and quality of services that businesses need.
But this ‘time to value’ equation has two tough obstacles
standing in the way – IT backlog and compliance.
IT backlog, or ‘IT Debt’ as it was coined by Gartner back
in 2010, relates to the backlog of incomplete or yet-to-be-started IT change projects. A recent study saw 590 CIOs
around the globe estimate it would take an average of $18
million to address their maintenance backlog and bring
their application portfolio up to date – up from $11 million
18 months before. Gartner estimates that within the next
12 months, the global IT backlog, in dollar terms, will have
grown from $500 billion in 2010 to reach $1 trillion. Aside
from the obvious cost implications, this work demands
resources which would be better utilised for innovation
projects needed to maintain a competitive advantage.
In addition, compliance requirements like ISO27002,
Basel III, FATCA and SEPA merely complicate the matter.
Regulatory compliance is a pressing concern for many IT
departments, but it takes time, effort and prioritisation to
update the necessary applications. And on top of that, it
takes focus away from delivering what really matters back to
the business. Governance, risk and compliance projects are
unplanned, non-negotiable IT milestones with far reaching
consequences. Meeting regulations with finite IT resources
is a challenge that limits the ability to focus on innovation.
In both cases, more and more IT leaders are implementing
a modernisation strategy, using technology that transforms
mainframe application analysis, development, testing and
workload optimisation. By continually but gradually changing
and updating core business applications through software
development and testing, organisations can turn legacy
into legendary applications that keep up with business
demands, while meeting the time to value challenge.
The changing shape of the C-suite
The evolving nature of businesses is reflected in the
boardroom.
Three years ago, Gartner predicted that by 2017 Chief
Marketing Officers (CMOs) would spend more money
on IT than CIOs, triggering a continued debate about the
transfer of power and the marginalisation of IT. More
recently, the emergence of the Chief Digital Officer (CDO)
has introduced a new dimension to the debate around
the role and responsibilities of the CIO. During 2013 in
the UK, the number of CDOs doubled to 500, and they
were forecast to reach 1,000 by the end of 2014.
A recent global report by Ernst & Young, which
surveyed 300 CIOs and 40 C-suite executives, confirmed
there is a strong need to refresh some of the outdated
perspectives that other executives still hold about the
CIO role. Related to this is a perception that CIOs have a
higher regard for the value that they bring to the business
than that seen by their C-suite peers. For example, while
60% of CIOs strongly believe that they help enable
fact-based decision making in relation to corporate
strategy, just 35% of their C-suite peers agree.
The research highlighted the impact of this dichotomy.
Too few CIOs are currently regarded as true members
of the executive management team, with less than
one in five holding a seat at the top table and less
than half involved in strategic decision-making.
If CIOs are going to deliver on the potential remit of the
role and the potential of IT, and secure a position at the
boardroom table, there is an imperative to break out of
the technology trenches and deliver business-led
innovation that will capture the attention and
support of the executive management team.
To succeed in an ever more complex and challenged
role, CIOs need to find smart, innovative ways of keeping
pace with technology and change, while maximising the
value of core IT assets. Modernisation is the strategy
that’s turning ageing infrastructures into innovation-ready IT – and CIOs into business heroes.
* Research commissioned by Micro Focus
Author information
Derek Britton is an IT professional with over 20 years
software industry experience. A computer science graduate
from De Montfort University, Derek has held a variety of
software engineering, consulting and product management
positions in the IT industry.
Derek is the Director of Product Marketing at Micro
Focus, the leaders in enterprise application modernisation,
where he is responsible for value proposition and messaging
across a portfolio of software products including the Micro
Focus Enterprise and COBOL solutions. Find Derek on
Twitter at @derekbrittonuk.
Technology & Innovation
Wireless “induction” charging –
the definitive guide
By CMD Ltd
WITH THE GLORIOUS TECHNOLOGICAL
bloom we are currently enjoying, we
are seeing more and more futuristic
developments every day. The front
runner of this pioneering time is the
wireless charging module, currently
developed to provide additional battery
juice to those flagging smartphones.
Smartphones in the 21st century
are mission-critical devices in
the day-to-day success of every
company, be it a small enterprise or
an international corporation. This
has seen some downright crazy and
bizarre developments in smartphone
technology. Let’s be fair, who would
have considered talking to your phone
in order to conduct a web search? On
the flip side this dependence on mobiles
has seen some pretty helpful and now
essential advances, although sadly
the battery life of the aforementioned
phone hasn’t been one of them.
One thing that has changed
drastically from the phones of old,
however, is the quality of the phone charger.
The trusty Nokia 3310 never required
you to buy a new wire every four weeks
at £10 a piece, did it? Nor did it
have a detachable wire that is ever so
easily lost or left behind, preventing
your phone from receiving the
sweet nectar of electrical charge.
This, in my humble opinion, is why
smartphone wireless development
has been at the forefront of the
“wireless revolution”. Wireless
charging offers safe, reliable,
cable-free transfer of power, ensuring
each and every mobile device is
forever charged and ready to go.
What is wireless charging?
Wireless charging, also known as
“induction charging”, is the process
of electromagnetic energy transfer
between two devices. To explain this
in palatable terms, wireless
charging allows you to charge a
compatible device by simply placing
it onto an induction charging module,
without the need for cables or wires –
hence the term wireless charging.
Basically, wireless charging
does exactly what it says on the
tin. That’s all well and good,
but how does this magic work?
Well, let’s take a look, shall we…
The principles of wireless
charging & how it works
Now, I understand that to many,
“induction charging” might seem like
something from Hollywood science
fiction, but it’s actually nowhere near
as far-fetched as you think. In fact,
the technology behind this invention
has been kicking about for a while.
Nikola Tesla, for those who don’t know
the man, was a Serbian-American
inventor born in what is now Croatia,
and he first set out the principles of
wireless charging. He demonstrated
wireless power as early as 1891
and even claimed to wirelessly
power an electric car in 1930 which achieved speeds of up
to 90 mph. More recently, before wireless chargers hit the
market, this tech was seen in everyday items like electric
toothbrushes. In a nutshell it’s how they keep their charge.
However, I digress. In short, wireless charging is the process
in which an electrical current is transferred between two
objects through coils. If you prefer the more technical
explanation: an induction charger uses an induction coil to
generate an alternating electromagnetic field from inside the
charger, while a second induction coil in the device picks up
power from that electromagnetic field and converts it back
into a standard electrical current to charge your portable
device’s battery. Simple.
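To put rough numbers on that description: by Faraday’s law, the peak voltage induced in the receiving coil of a sinusoidally driven pair is proportional to the operating frequency, the mutual inductance between the coils and the transmitter current. The figures below are purely illustrative (Qi-class pads operate somewhere in the region of 100–200 kHz; the inductance and current values are invented for the example, not any charger’s specification).

```python
import math

# Illustrative figures only - not any specific charger's specification.
f = 140e3     # operating frequency in Hz (Qi-class pads run roughly 100-200 kHz)
M = 1.5e-6    # assumed mutual inductance between the two coils, in henries
I_peak = 1.0  # assumed peak transmitter coil current, in amperes

# Faraday's law for sinusoidal coupling: peak induced EMF = 2*pi*f*M*I
emf_peak = 2 * math.pi * f * M * I_peak
print(f"Peak induced EMF in receiver coil: {emf_peak:.2f} V")
```

A couple of volts at the receiving coil, rectified and regulated, is enough to feed a phone’s charging circuit, which is why such small coils can do the job.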
Qi – what is it?
For many people listening in on the buzz generated by the rise
of wireless charging, Qi will be more than familiar.
But what does it mean? What’s it about? And how do I use it?
Well, Qi (actually pronounced “chee”) is quickly becoming
internationally recognised as the standard to which all
smartphones with wireless charging capability must adhere.
If something carries the Qi wireless charging logo, it means
you can quite simply drop your compatible device onto the
wireless charger and it will charge without the need for
adapters or cases.
One of the main reasons for Qi’s development is to establish
a standard and symbol that people from all countries can
understand as the sign for wire-free compatible charging,
and the consortium behind Qi hopes the logo will become a
key consideration in the buying cycle for new smartphone
consumers.
To finish off: wireless charging is a lot safer, with no
exposed connections, and most of all this technology is
only in the infant stages of its life. If it stays on this
path, in the near future we will see Qi induction chargers
popping up all over our towns and cities, whilst the same
tech is transferred into many other devices.
Don’t get left behind – stay ahead of the game.
Compatibility
In theory, every smartphone should be capable of
wireless charging with some kind of assistance, be
this through a coil adapter or a compatible case.
Again, this all depends on your phone, and we always
recommend consulting the manufacturer to check.
■ Smartphone Case
Smartphone cases which turn your ordinary phone
into a wireless wonder are available from a number
of third-party suppliers. They contain the technology
to make any phone they fit support wireless charging.
These come in a variety of shapes and sizes, and the
phone charges just like normal.
■ Coil Receiver
A coil receiver, or Wave receiver, is a small module
used in conjunction with a wireless induction charger,
such as the Porthole III, to allow a smartphone without
Qi to charge wirelessly. The Wave receiver includes
Apple 30-pin, Lightning and Micro-USB connectors.
All you have to do is place the appropriate connector
in your phone’s charging port and drop the Wave on
top of any wireless charging device.
Why choose wireless charging?
Well, there are a number of brilliant benefits to induction
wireless charging, the first and foremost being that it
removes the hassle of USB cables: you don’t have to
remember to pick your charger up, there’s no losing that
£10 charging wire and no searching for a free socket to
plug in and juice up. Off the back of this, the USB slots in
your on-desk modules will be free to use for something
else, or will be used less, therefore increasing their shelf life.
Whilst on the subject of wear and tear, it’s worth pointing
out that wireless charging reduces the amount you use
your charging port. You don’t have to keep plugging and
unplugging the charging cable from your device, saving
your charging port, which we all know is something that
can easily be broken on any modern day smartphone.
With this set-up, your electronic devices are all
enclosed within their respective units, meaning they
are away from the external environment. As a result
there is less chance of corrosion from water or the
atmosphere, meaning safe and longer-lasting charging.
One overlooked benefit is the fact that it’s so easy to
charge – so easy, in fact, that you need never have a
low battery. How many times have you looked at your
phone on 35% and thought it could do with a little top-up,
but you didn’t want to get your charger out and plug it in
for a quick 10-minute charge? With an induction charger,
all you have to do is drop the phone on for a few minutes
to give it that extra boost it needs.
The future of wireless charging
Wireless charging is currently available for low-power
devices and uses, predominantly up to 5 watts.
This is more than adequate for mobile phones and
devices of that persuasion; however, the future is
moving towards higher-powered devices such
as fridges, freezers, televisions and even cars.
To bring this a little closer to home, in the near future
you can expect wireless charging to jump to tablets and
laptops, offering larger coils and a rapid charging experience.
Either way, it’s safe to say that now induction charging has
gained a foothold in our lives it’s here to stay, and
will only grow bigger, better and stronger over time.
Further information
Porthole III
What makes the Porthole III wireless charging module
unique is that CMD have developed a wireless charger that
has an integral power supply so there is no requirement for
a separate USB device to power the module.
This is our very own induction charger and can be used
alongside our wave receiver. Please visit www.cmd-ltd.com
for more information.
Technology & Innovation
IT monitoring and
business productivity
Dirk Paessler discusses
how IT monitoring
can help to increase
business productivity
AS THE UK CONTINUES TO EMERGE
from the recession, IT has a more
significant role than ever to play in
stimulating economic growth, and
increasing productivity in business.
A study by the Centre for Economic
and Business Research has found
that since the 1970s, technology has
improved efficiency amongst office
workers by 84 per cent. As businesses
start to capitalise on the early stages
of economic recovery, investment in
IT infrastructure is starting to become
a priority once more. IT investment is
growing according to data from Gartner
Inc, with global IT spending set to grow
2.4% to $3.8 trillion in 2015. Enterprise
software in particular is set to see some
of the strongest growth according to
forecasts, rising 5.5% to $335 billion.
However, it could be argued that
this increase in expenditure is not
worth it if underlying inefficiencies in
a company’s existing infrastructure
are not detected and addressed. By
employing an IT monitoring tool,
businesses can work towards tackling
these fundamental problems. By
identifying anomalies and bottlenecks,
such a tool can address issues
that may be impacting the speed
of the network, and that could potentially
cause network failure if left unchecked.
Supporting investment in IT
IT monitoring software does not remove
the need for investment in IT, but it
can certainly help to direct it and make
it more accountable. By providing
detailed insight into the current status
of a network, an IT monitoring tool
can act as a guide to understanding
which technologies will provide a
boost to the existing infrastructure.
When a company invests in new
technology aimed at improving
network capability, the product is
expected to work seamlessly, as well
as delivering tangible productivity
gains in a short amount of time. It is the
IT team’s responsibility to make sure
this happens, and with the support of
a network monitoring tool, potential
barriers to effective implementation
can be recognised and limited, therefore
ensuring that the hardware delivers
on all of its anticipated benefits.
Assuring productivity
But it is not just hardware that needs
to be effectively monitored. Office
productivity applications such as
business process management
(BPM) software must also be taken
into account. BPM is a fast-growing
area of investment. WinterGreen
Research has predicted that the
$3.4bn worldwide market will be
worth over $10bn by 2020, as BPM’s
expansion looks set to lead to lower
costs and increased agility. In order
to ensure that businesses take full
advantage of applications that can
help them work more productively, an
effective network infrastructure is
key. IT monitoring software helps to
guarantee this reliability by ensuring
that the network has enough capacity
to accommodate new technologies.
Harnessing innovation
Emerging innovations in the field of
IT look set to present a number of
interesting opportunities for business
in the coming years. Developments
such as the Internet of Things provide
enormous prospects for gains in
productivity, by increasing the speed
of communication and delivering
huge advances in automation. In
the manufacturing sector, the
transformation being inspired by
these technological advances is so
significant that it is being referred to
as ‘The Fourth Industrial Revolution’.
For example, the German engineering
company Siemens is developing the
factory of the future, in which all
parts can communicate with each
other, without the need for a human
intermediary. The result could be a car
that is capable of talking to the factory;
for example, the vehicle could self-diagnose
a fault, indicate that parts are
required, and influence changes in the
building of the same model in the future.
This revolution offers enormous
opportunities to business, but at the
core of ensuring that the potential is
realised is the steadfast management
of a company’s IT network. IT
monitoring will increasingly play a
vital role in the roll-out of these new
technologies, by making sure that any
faults are registered and addressed
before they can have a negative
impact on network performance.
Helping IT deliver
This increase in automation, as well as
an extended reliance on technology
across all sectors, demands a greater
emphasis on network management,
in order to act as an early warning
system should anything go wrong. As
networks become ever more complex
and investment in IT continues to
grow, any organisation looking to take
full advantage of the productivity
gains today’s technologies and
tomorrow’s innovations can bring
must view investing in IT monitoring
as central to their business.
Author information
Dirk Paessler is CEO and founder of
Paessler, an IT monitoring software
developer based in Nuremberg,
Germany. Paessler leads the
industry in providing the most
powerful, affordable and easy-to-use network monitoring and testing
solutions. The company’s software
products deliver peace of mind,
confidence and convenience for
businesses of all sizes, including
more than 70% of the Fortune 100
companies. Founded in 1997 and
based in Germany, Paessler’s global
reach includes more than 150,000
active installations of its products.
Learning & Development
Does risk matter?¹
Disengaging from a ‘self-evidently correct’ process
Dr. Elmar Kutsch and Dr. Neil Turner,
Cranfield School of Management
OUR PROFESSIONAL LIVES ARE CHARACTERISED BY
risk and regular expressions of ‘may’ and ‘might’. Longing
for certainty in, and controllability of, our environment is
commonplace. Increasingly we tend to rely on a process that
is often advocated as ‘self-evidently correct’: the (supposed)
self-fulfilling prophecy of modern Risk Management. Applied
consistently, it serves any context – including IT systems –
and promises gains in efficiency and the comforting ability to
plan our way through the risks in our projects. The major issue
with this, though, is us! We often fail to ensure that we actually
do what is so regularly promoted as ‘good’. We seem to
disengage from that ‘goodness’, leaving us vulnerable to risks.
Is this really true, and, if so, what are the reasons behind it?
It is common for Risk Management to be described
as the single most important process to apply, yet it is
also often cited as the one that goes horribly wrong in IT
projects (e.g. ²,³). Risk management is designed to assist
in controlling the risks on a project by either preventing
them from occurring or managing the impact if they do, so
that overall objectives can be met. As with other project
controls, it is prescribed in practitioner frameworks and sets
of internal organisational processes. Risk management
frameworks such as those described by the APM⁴ or PMI⁵
set out a continuous cycle of identifying, assessing, and responding
to risks. They are designed to be iterative throughout the
life of the project and assist in actively controlling risks.
The purpose of the identification stage is to recognise
potential risks to the project and its objectives. Identification
involves a number of techniques designed to help
stakeholders consider potential positive and negative events,
such as risk checklists, lessons-learned logs from previous
programmes and projects, risk identification workshops,
cause and effect diagrams, brainstorming, and gap analysis.
Assessing and prioritising risks often includes the variables
of probability and impact. These can be quantitative or
qualitative in nature, and often a mixture of the two will be
used throughout the risk register. Finally, the response to
risk may involve actions to reduce the probability of the risk
occurring, the mitigation of its impact, the transfer of that risk
to another party, or just a toleration of the potential effect.
Hence, managers should transform a world of uncertainty
into measurable and controllable risks, by following a
process from risk identification to closure through a
suitable response. So it is advocated. However, although
this makes sense, in our research we observed that
this is not what many managers actually do. A large number
do not follow the process through to the ultimate stage of
responding to the identified risks, thereby leaving them
more vulnerable. Roughly a third of all risks associated
with critical incidents in IT projects were not actively
managed, with the biggest drop – surprisingly – at the last
stage of the risk management process – responding.
The disengagement at the process stage of
responding to risk is worrying. The reasons are
manifold, but three aspects stand out:
[Figure 1: Extent of disengagement. Of the knowable risks (100%), 94% were identified, 77% were assessed and only 56% were responded to; the remainder were not actively managed.]
The Lure of Positivity
A risk is a concept often associated
with negativity. So, at times, is
the response to it. Responding
to a risk may be interpreted as
an acknowledgement of failure,
the failure to prevent the risk
from existing in the first place. A
response is therefore the obvious
consequence of that failure. As
a result, in order to maintain a
positive ‘image’ of a project – one
characterised by the perceived
absence of risks – managers may
choose to ignore it rather than
respond to it. A response is a visible
action, under the scrutiny and
watchful eyes of stakeholders. Not
responding to a known risk, by contrast,
keeps it 'out of sight, out of mind'.
Today’s CIO
57
Learning & Development
The Lure of Noncommitment
A risk is an event that has not yet happened, but Risk
Management asks us to commit to a response to it. By default,
we like to keep our options open, particularly for a risk that
is still hypothetical. Deferring a response until the risk
actually materialises as a real issue is therefore convenient.
This propensity to defer is powerful, and it helps explain
why many known (identified and assessed) key risks were
responded to only after severe delay, or not at all.
Deterrent of Powerlessness
Knowledge is power, as the adage goes. Yet, knowing more
does not necessarily mean that we can exercise power
over our environment. Some of the managers involved
in our study raised the issue that they ‘knew’ a range of
risks, but felt unable to exercise any control over them.
They felt powerless. It is this perception that explains
another form of disengagement from risk response. Time
and effort have been spent on identifying and assessing
risk, yet there the process seems to stop. The apparent
lack of control associated with a specific risk can lead to
little, and in most cases no, dedicated resources (money,
time, etc.) being allocated to the specific risk response.
Consequently, many known risks remained unmanaged
and therefore had the potential to derail the IT projects
we studied. This does not imply that the projects we
looked at all ended in disaster. Quite the opposite, in fact.
Managers showed great skill in managing major issues
and recovering from crises in a timely manner. That’s not
the point, though. Isn’t it a considerable waste if we spend
scarce resources on identifying and assessing risks, in
accumulating ever-greater perfection in forecasting,
if we fail to use our newfound knowledge wisely?
Overcoming these problems is difficult and requires
changing the discourse around Risk Management. We
need to look beyond the mere application of ‘process’. In
order to address the Lure of Positivity, risks need to be
considered ‘normal’ and responding to them understood as
something ‘good’. Changing the atmosphere around risk and
its management is central to this. Negative connotations
need to be downplayed; instead, senior leaders must
create and reinforce a culture that incentivises managers
to embrace and engage with risks, rather than ignore
them to maintain an illusionary ‘risk-free’ environment.
Acknowledging risk and the need to respond to it requires
courage and is by no means an indicator of bad planning.
Overcoming the Lure of Noncommitment is as challenging as
overcoming the Lure of Positivity. The costs of a risk response are
immediately visible, whereas the outcome – the risk
event may never actually occur – is not. Responding to
a risk is a ‘leap of faith’, believing that doing something
about it will have the desired impact. Indeed, with the
benefit of hindsight, many responses may not in fact
have had the planned effect, a result of the complex
environment in which they are applied. Managers need
to support action, even given the inherent uncertainty
regarding cause and effect, under the premise that well-chosen
responses are generally better than inaction.
Finally, the desired controllability does not come
out of thin air. The Deterrent of Powerlessness can be
tackled by widening managers’ response repertoires
and increasing their empowerment to deal with a
wide range of risks via greater authority to allocate
resources (e.g. manpower). This allows greater flexibility
to respond appropriately to the risk at hand.
Risk Management – and its failure – is often thought to
be associated with a lack of knowledge. We have found in
many organisations that issues arising in projects were
in fact identified and anticipated, just not dealt with
effectively. The focus of our attention needs to be on the
identification and assessment of risks, but combined with an
organisational culture that focuses on pragmatic responses.
Projects are complex and cannot be fully planned in
advance, despite our best intentions. Acknowledging this
at all levels of the organisation and acting accordingly,
thereby remaining engaged with the full risk process, is key
to better project performance. And we all want that.
References
1. This synthesis is based on the paper E. Kutsch et al. (2013), "Does risk
matter? Disengagement from risk management practices in information
systems projects", European Journal of Information Systems, which
won the Stafford Beer Medal, in recognition of the most outstanding
contribution to the philosophy, theory or practice of Information
Systems published in the European Journal of Information Systems.
A more practitioner-oriented version is also available as E. Kutsch
et al. (2014), "Bridging the risk gap: The failure of risk management
in innovative IS projects", Research Technology Management, 57.
2. Nelson, R.R. (2007), "IT project management: Infamous failures, classic
mistakes, and best practices", MIS Quarterly Executive, 6(2), pp. 67-78.
3. Cerpa, N. and Verner, J.M. (2009), "Why did your project fail?",
Communications of the ACM, 52(12), pp. 130-134.
4. Association for Project Management (2005), Project Management Body
of Knowledge, London: Association for Project Management.
5. Project Management Institute (2013), A Guide to the Project
Management Body of Knowledge, 5th Edition, Newtown
Square, PA: Project Management Institute.
Author information
Dr. Elmar Kutsch (Senior Lecturer in Risk Management) and
Neil Turner (Senior Lecturer and Director of the Executive
MSc in Programme and Project Management) both teach
at Cranfield School of Management, a leading international
management school, and one of an elite group of schools
worldwide to hold the triple accreditation of AACSB,
EQUIS, and AMBA. Established in the 1960s, the School is
renowned for high-quality teaching and research and strong
links with industry and business. Above all, it is known as
a school that provides relevant management solutions. The
School is one of the world’s largest providers of executive
development programmes for practising managers.
CRANFIELD IT
LEADERSHIP PROGRAMME
Become a more effective IT Leader. Gain the credibility and confidence
to discuss and influence business strategy with your Board.
Key Benefits:
• A broadening of your knowledge base and personal effectiveness
• Improved leadership skills to advance your career
• The ability to shape organisational direction as well as drive business value through IT
• New insights from assignments that have practical relevance to your organisation.
Transforming knowledge into action
For more details call 01234 754570 or email [email protected]
www.cranfield.ac.uk/som/todayscio2015
For a hassle-free visa service
for business or pleasure
to the following destinations:
China, DRC, Ethiopia, France, Germany,
Ghana, India, Nepal, Nigeria, Oman,
Pakistan, Russia, Tanzania, Ukraine, Vietnam
Visas for other countries may be available on request
Second UK passport for frequent travellers
Renewing expiring British passports
We work both with corporate and individual clients
Schengen visa consultations
Contact us on
+44 (0)20 7837 5803
[email protected]
www.cayostravelvisas.co.uk
Today’s CIO
59
Advertisers’ Index
Absolute Software
29
Bergvik Flooring AB
20
BeyondTrust
26
Cayos Travel Visas
59
CMD Ltd
55
Corporate Executive Board
52
Cranfield Management Development Ltd
59
Data Centre Alliance Ltd (DCA)
Elsevier Ltd
Imation
Integral Memory plc
OBC, 16, 22
2
IFC
6
Micro Focus
51
Munters Ltd
22
Nexthink
13
Ormuco
56
Paessler AG
48
Qlik
Rausch Netwerktechnik GmbH
Schneider Electric
Sogeti UK Ltd
8
14
4
IBC, 47
Titania Ltd
30
Volta Data Centres Ltd
19
Wick Hill Ltd
34
Xtravirt Ltd
11
Zscaler Inc
60
Today’s CIO
32, 33
BUSINESS IS EVOLVING.
Cloud, Mobile, Digital and Security are changing your
priorities and the way projects are delivered.
The customer is king, and thorough Testing and QA are more important
than ever for achieving competitive advantage with a first-rate user experience.
Let Sogeti guide you through these changes – our world-leading Testing solutions are
innovative, scalable, customisable and designed to benefit your business.
Leader – Application Testing Services (Gartner, 2014)
Leader – Transformation-Focused Testing Services (NelsonHall, 2014)
Leader – Outsourced Testing Services (Ovum, 2014)
www.uk.sogeti.com
+44 (0) 20 7014 8900
Delivering the action plan
for the data centre industry
Become a member today...
www.datacentrealliance.org
+44 (0) 845 8734587