
Faster ways to find the right
video
Acoustic fibre optics in wells
What you can do with seismic in
the cloud
Is better subsurface knowledge
the key to improving the
financial viability of offshore
operations?
February / March 2015
Issue 53
Official publication of Finding Petroleum
Events 2015
Non-seismic Geophysics
London, 19 Feb 2015
Finding Enough Oil & Gas in NW Europe
London, 12 Mar 2015
Transforming Sub-Surface Interpretation
Aberdeen, 17 Mar 2015
Doing more with Subsurface Data
Aberdeen, 18 Mar 2015
Transforming Sub-Surface interpretation
London, 13 Apr 2015
Doing more with Subsurface Data
Stavanger, 05 May 2015
Doing more with Production Data
Stavanger, 06 May 2015
Finding Oil in Atlantic Basins
London, 27 May 2015
Admission from £20
Find out more and reserve your place at www.findingpetroleum.com
Find out more and reserve your place at www.d-e-j.com
Deeper knowledge leads
to wider opportunities
by David Bamford
If we do not act quickly, the UK Continental Shelf (UKCS) and
Norwegian Continental Shelf (NOCS) will soon be on life support.
Naturally, first reactions have been “Give us a Tax break!” and “How do we get Costs way down?”

Yes, these are important because they finish up in the Numerator of the crude economic equation that describes profitability.

But there is also a Denominator which is, or are, barrels of oil or cubic feet of gas. How do we input more of these into the equation?

I have two thoughts:

Beginning with a story. Many years ago I had a minor role in BP’s takeover of Britoil (previously of course BNOC, the UK government’s national oil and gas company).

This takeover was underpinned by profound understanding of North Sea geology, of Yet-to-Find volumes, of undeveloped discoveries, of upcoming development projects, of producing fields.

In point of fact, BP probably understood Britoil’s acreage and fields better than Britoil did itself.

This 1986 – 1988 work was almost entirely analogue – paper composite logs, paper seismic sections, hand-drawn maps, tracing paper, light-tables, occasional use of the “digitizing table” (remember those?). It worked!
Today it is difficult, impossible actually, to see such a profound underpinning anywhere,
perhaps because lots of key individuals have ‘moved on’, perhaps because of lazy assumptions that the North Sea’s best days are somehow behind it.
And yet the significant Johan Sverdrup discovery in the NOCS, in a well-explored area, was, as I understand it, the result of deep geological knowledge and innovative thinking.
We know perfectly well how to do these things – see for example this summary of how
work on Nova Scotia revitalised exploration there. www.qz.com/318755
Something similar, of similar scope and imagination, is needed for the North Sea and, arguably, NW Europe as a whole. Somehow this has to be a ‘multi-client’ study, driven and
delivered by oil and gas industry folk, not some academic or research exercise.
Secondly, we geoscientists have developed a lazy dependence on ‘yet-another-towed-streamer-3D-seismic-survey’ which we need to move beyond. There are all sorts of new technologies ‘out there’ – seismic nodes, passive seismic, fibre optics, full tensor gravimetry, electromagnetics – that can tell us much more about the sub-surface, bringing better predictions and higher volume successes.
In principle, digital technology should allow us to be more efficient and effective in integrating diverse sources of existing data and new multi-measurements, thereby allowing
coherent interpretation from the scale of whole basins to individual producing reservoirs.
This ‘deeper knowledge’ will indeed lead to ‘wider opportunities’.
Just one snag……..

I haven’t yet seen a digital offering that can do this!

David Bamford is a director of Digital Energy Journal and Finding Petroleum, and a non-executive director of Premier Oil.

Issue 53, February / March 2015
Future Energy Publishing Ltd, 39-41 North Road, London, N7 9DP, UK
www.d-e-j.com
Tel +44 (0)208 150 5292, Fax +44 (0)207 251 9179
Editor: Karl Jeffery, [email protected], Tel +44 208 150 5292
Conference producer: Panas Kalliantas, [email protected], Tel +44 208 150 5295
Advertising, event sponsorship and exhibitions manager: Richard McIntyre, [email protected], Tel +44 (0) 208 150 5291
Production: Wai Cheung, [email protected]
Subscriptions: £250 for personal subscription, £795 for corporate subscription. E-mail: [email protected]
Front cover art by Alex McKenzie, artist, Digital Energy Journal
Printed by Printo, spol. s r.o., 708 00 Ostrava-Poruba, Czech Republic. www.printo.cz
Leaders
Repsol and IBM - “cognitive” computing in E+P
Spanish oil major Repsol is working together with IBM to build a $15m-$20m system to answer difficult questions about which
licenses to bid for and how to optimise production, using experts together with big data
By Santiago Quesada, Repsol’s director for exploration and production technology
Based at IBM's pioneering Cognitive Environments Laboratory (CEL), the researchers will work on two prototype applications which are specifically designed to improve Repsol's strategic decision-making in the optimisation of oil reservoir production and in the acquisition of new exploration areas and production fields, both onshore and offshore.

Scientists in the CEL will also experiment with a combination of traditional and new interfaces based upon gesture, robotics and advanced visualisation and navigation techniques.
Through these modalities, researchers can
leverage sophisticated models of human
characteristics, preferences and biases that
may be present in the decision-making
process.
Oil companies continue to make high-stake
decisions in the face of increasing uncertainty and geological risk based on extremely
complex data sets.
Repsol is making an initial investment of $15
million to $20 million to develop two applications with early results targeted for late
2015.
The technology will also introduce new real-time factors which should be considered, such as current news events around economic instability, political unrest and natural disasters.
Cognitive computing systems can help in exploration and production by helping individuals to better interpret big data and then
make informed decisions based on that data.
The team will work together in New York
and Madrid, with each company committing
six to 10 employees to develop the technology.
As a result, companies can maximise access
to better exploration areas, increase the productivity of maturing oil fields and their
value, enhance safety and mitigate environmental risks.
People, devices and spaces
These tools are not intended to replace the
key stakeholders such as geologists, geophysicists, engineers, investment managers,
risk analysts and corporate strategists, but to
assist them with building more fluid conceptual and geological models, highlighting the
impact of the potential risks and uncertainty,
visualising trade-offs and exploring what-if
scenarios.
Santiago Quesada, Repsol’s director for exploration
and production technology
Until recently, geoscientists have been tasked
with mostly manually reading and extracting
information from enormous amounts of data
including journal papers reports, seismic data
and models of reservoirs, wells and facilities.
Recognising the need for an intelligent solution, Repsol and IBM, leveraging years of existing collaboration, recently teamed up to develop cognitive technologies that can analyse subsurface data in order to drive improvements in exploration and production.
To best achieve this, the cognitive computing
technology infrastructure has been designed
to specifically interact with people across
various devices and physical spaces.
For example, the technology is able to
process questions asked by humans in natural
language and sifts through information to respond with the most likely answers.
This, in turn, will enable individuals and teams to make better decisions by overcoming cognitive limitations posed by big data.
The new applications developed by Repsol
and IBM will improve the way oil companies
visualise and develop exploration and production activities.
It is envisioned that companies from other
sectors will set up their own CELs to make
better informed decisions and, ultimately, increase their companies’ bottom lines.
Oceaneering’s Video Vault - faster video retrieval
Oceaneering has developed the Video Vault cloud-based solution to make it easier to store, share and find the right piece of
subsea and offshore video
Subsea engineering company Oceaneering
has launched a cloud-based live video
streaming and archiving solution called
Video Vault to make it easier to store, retrieve, view, share and analyze the video you
are looking for.
To simplify future video retrieval and analysis, video files can be tagged geographically, with comments or subtitles, or according to the time they were recorded. This means you can develop systems to quickly pull up video relating to a certain location or operational significance, and view both real time and historical data.
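As a rough sketch of what such tagging makes possible (the record fields and query function below are hypothetical illustrations, not Oceaneering's actual schema), geographically and time-tagged video can be filtered with a simple bounding-box and time-window query:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class VideoTag:
    """Hypothetical metadata record for one archived video file."""
    file_id: str
    lat: float           # where the footage was captured
    lon: float
    recorded: datetime   # when it was recorded
    comment: str         # free-text tag, e.g. "pipeline section 14 inspection"

def find_videos(tags, bbox, start, end):
    """Return files recorded inside a lat/lon bounding box and time window."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return [t.file_id for t in tags
            if min_lat <= t.lat <= max_lat
            and min_lon <= t.lon <= max_lon
            and start <= t.recorded <= end]

tags = [
    VideoTag("rov_0412.mp4", 27.8, -90.5,
             datetime(2015, 1, 12, 9, 30), "pipeline section 14 inspection"),
    VideoTag("drillfloor_0413.mp4", 28.1, -89.9,
             datetime(2015, 1, 13, 14, 0), "drill pipe running"),
]

# "Draw a box" around part of the Gulf of Mexico and a date range:
print(find_videos(tags, (27.0, -91.0, 28.0, -90.0),
                  datetime(2015, 1, 1), datetime(2015, 2, 1)))
# ['rov_0412.mp4']
```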
The use of video in offshore operations, for both subsea and surface monitoring, has grown massively in recent years, and brought a lot of challenges for finding the right video image from hours of video files, says Mark Stevens, Director of Communications, GDS, Oceaneering.
Usually Oceaneering has about 100 simultaneous video feeds across the company.
The data files are getting larger all the time,
with more high definition and 3D cameras
and high-resolution video compression formats.
Meanwhile there are many limitations to
physical media (such as DVDs, USB
drives). It is hard to manage and share files.
Common Operating Picture (COP)
Oceaneering’s core aim is to help oil and gas
personnel create a ‘Common Operating Picture’. This is a term which originated in the
military, which means a view of all of your
operations on a (usually large screen) map,
where you can drill in to see what you need.
Viewing the data
In terms of viewing the data, about 30 per
cent of clients use mobile devices. “That
continues to be the trend,” Mr Stevens said.
It means staff can monitor offshore activities from wherever they are, so fewer staff are required to be actually offshore.
To search for video on Video Vault, you start
with a Google Earth or ESRI map interface.
The Common Operating Picture serves as a
dashboard of all of your activity, showing
the locations of all your vessels and rigs.
When video is added to the COP, users can
click on any of numerous monitored assets
to see associated recorded video.
You can see all the live video streams with a
location on the map, or you can draw a box
around an area and see all the video you
have for that region. You can browse the entire video library with advanced search, preview and indexing, using thumbnails to
speed identification of interest areas, and
then play back selected video with the convenience of fast- and slow-forward, fast- and
slow-rewind, and pause capabilities.
You can view a thumbnail of all your real
time video streams, displayed in the appropriate position of the map, refreshed every
30 seconds.
Clients often ask to connect different files to
the video at appropriate points (such as pdf
documents and pictures), which you can do
with metadata content management.
You can say “show me all the video in the
Gulf of Mexico”, then click on an individual
ROV in that area and say “follow”.
You can provide usage data such as which
employees were watching the video and how
much bandwidth they consumed, as well as
total bandwidth utilization across the organization.
Some oil companies (including Statoil) have
their own 3D visualisation platform, and can
view this data in the COP, as well.
Benefits of video
Oil and gas companies are using video in
many different ways.
Subsea, companies use video for monitoring
ROV operations, observing rotary brush
cleaning operations, subsea X-ray imaging
(digital radiography) and general long term
asset monitoring.
The video is proving particularly useful in
subsea cleaning work. You can capture
videos before and after cleaning, to see how
effective the cleaning was, and then work
out a better cleaning schedule.
If the video is of a pipeline inspection, all of the video can be geographically tagged, so you can automatically bring up the video inspection image of a certain section of pipeline.

A lot of companies are asking for video of drilling data, particularly looking at the drill floor, to record the drill pipe going through the floor of the drill ship and back out again.
There are tools to deliver a short video file
to your desktop – so if you want to watch a 5
minute section of a 4 hour video, you don’t
need to download the whole thing.
Archival infrastructure
The Video Vault solution is made available as a hosted cloud service for a tiered, daily-rate or monthly price depending on application, including maintenance. It is also
available as a bundled hardware solution for
on premise installation. This means you can
store the data on your local servers, or you
can host it on a standard commercial
archival service like Google. Clients never
have to see any physical storage media.
If you are concerned about storing your data
on standard commercial cloud services, you
could use Video Vault with Amazon’s “GovCloud” service, designed specifically for more sensitive data.

Have a “Common Operating Picture” of your operations - drill down to get further information and live video
Open standards / commercial systems are
used as much as possible (including integration with Google Earth and ESRI) to give
clients as much flexibility as possible. There
are “application programming interfaces”
(APIs) enabling you to connect Video Vault
with other software tools.
There are 10 software developers working
on the project, out of a total team of 55 developers at Oceaneering.
Geospatial Data Integration
Video Vault’s ability to integrate geospatial
data means that the GPS co-ordinate of the
asset being monitored is automatically embedded into the captured live video data
stream. For subsea monitoring, the GPS co-ordinates of the topside vessel are used.
For some subsea projects, transponders have been laid on the sea floor, which enable the gathering of location information from the seabed.
Larger ROVs have an internal inertial navigation (NAV) system, which estimates the ROV’s location based on how it is moving through the water. “It can calculate to within inches where the ROV is,” Stevens says.
In June 2014, Oceaneering acquired a company called PortVision®, which has a vessel tracking service and a 5-year database of vessel locations around the world derived from automatic identification system (AIS) data.
Oceaneering is now connecting the vessel
information with the video – if you know
about the date and which asset the camera
was recording, you can put this together
with AIS data to work out the location.
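That lookup is essentially a join between a video timestamp and a table of AIS position reports. A minimal sketch, with a made-up track rather than PortVision's actual data model:

```python
from datetime import datetime

# Hypothetical AIS position reports for one vessel: (timestamp, lat, lon)
ais_track = [
    (datetime(2015, 1, 12, 9, 0),  27.80, -90.52),
    (datetime(2015, 1, 12, 10, 0), 27.85, -90.48),
    (datetime(2015, 1, 12, 11, 0), 27.90, -90.44),
]

def position_at(track, when):
    """Estimate the vessel's position at a video timestamp by taking
    the nearest AIS report (interpolation would refine this)."""
    nearest = min(track, key=lambda rec: abs((rec[0] - when).total_seconds()))
    return nearest[1], nearest[2]

# Knowing the asset and the recording time recovers a location:
print(position_at(ais_track, datetime(2015, 1, 12, 10, 20)))
# (27.85, -90.48)
```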
Subsurface
Inova - mini-vibes, low frequency and UAVs
Interesting recent developments in land seismic recording technology include ‘mini-vibes’, low frequency seismic, and UAVs
for wireless seismic quality control, says Andy Bull, VP Product Development, Emerging Technologies at INOVA Geophysical.
But we may be reaching the limit of the number of seismic channels we are able to record without significant advances in
computing technology.
INOVA has developed a range of small Vibroseis trucks, known as ‘mini-vibes’, which
can create seismic energy in areas where
standard Vibroseis trucks can’t go.
The usual way to create seismic energy on
land is by using large Vibroseis trucks which
provide 60-80,000 lbs of force.
Inova produces the UNIVIB and UNIVIB 2
Vibrator trucks, which provide up to 26,000
(UNIVIB) or 34,000 (UNIVIB 2) pounds of
peak force, but have the advantage of being
much easier to use in urban areas, or areas
with more difficult terrain, said Andy Bull,
VP Product Development, Emerging Technologies at INOVA Geophysical.
He was speaking at the Finding Petroleum
conference in London on November 26,
“New E&P Technologies”.
“These little (mini vibe) machines are capable of really nice manoeuvring, low environmental impact and tremendous broadband
energy,” he said.
“These machines can output super signals
over a bandwidth of 1-400 Hz and at significant depth,” he said.
In one example, data was acquired in the South Georgia Rift Basin (USA) using two UNIVIB trucks, with data at 6 to 76 Hz. “They have been able to bring out a structure at 10,000 feet,” he said.
The market for mini-vibes is partly driven by the increased efforts oil companies have to make to limit the environmental impact of exploration, he said.

This includes thinking about how much line cutting (path clearing) they need to do, how much dynamite they are going to use, and what to do about wildlife.

Mini vibroseis machines - "capable of really nice manoeuvring, low environmental impact and tremendous broadband energy"
The environmental impact is further reduced
if you use cable-less technology and small vibroseis trucks, he said.
Because they have a shorter length and
width, less line cutting is required, he said.
They can also move up slopes of difficult terrain.
INOVA Geophysical is a manufacturer of
land seismic technology, formed from assets
from ION Geophysical and BGP in March
2010. It produces a range of seismic recording equipment and seismic source equipment.
Low frequency seismic
INOVA is seeing an increasing customer interest in working with low frequency seismic.
“We have been involved in several projects in various locations where we are starting with frequencies as low as 1.5 Hz, and we think we can go down further,” he said.
MEMS (Microelectromechanical systems)
sensors are “ideal for [recording] this low frequency data,” he said. The sensor systems are
designed to be small (107g), low power
(85mW) and tightly integrated with acquisition systems.
“We're still learning a lot about the benefits
of these low frequencies.”
Data acquired in the South Georgia Rift Basin (USA) using 2 UNIVIB trucks was able to bring out a structure
at 10,000 feet
Mr Bull showed an example of data surveyed by Petroleum Development Oman, where during the survey, the company decided to extend the bandwidth down to 1.5
Hz, and you can see how the data improves.
You can see just the data generated from 1.5
– 6 Hz seismic and it is very clear.
The company has also done some downhole
testing with sensors recording at up to 7,500
feet, recording data as low as 0.5 Hz.
Downhole data was also recorded at high frequencies, resulting in coherent 200 Hz energy visible at over 5,000 ft depth.
Seismic technology market
At the time of his talk in November 2014, Inova was already seeing oil and gas companies make reductions in their exploration programs due to cost constraints from the low oil price, he said.
But the Middle East is “still a healthy area,” he said. “There's a lot of investment still going on, a lot of surveys planned.”
Annual spending on seismic technology has
been fairly flat for the past 10-15 years, with
a small uplift in the mid-2000s with a move
to larger surveys.
“This creates difficulty for companies like ourselves,” he said. “We're helping the market acquire much more data, of better quality, but the volume of spend is not increasing.”
Many companies are starting to look seriously at broadband seismic acquisition, as
they look for oil at greater depths, in more
complex geology, and with better resolution
images, he said. Or they want to do “full
waveform inversion,” using the two-way
wave equation to produce high resolution velocity models.
Cabled and wireless
In terms of land seismic acquisition systems,
80 to 85 per cent of new seismic system sales
are still cabled, rather than cable-less.
“The costs [of cabled systems] are reducing,” he said. “The cable system is still substantially less per channel than a cable-less system.”
Cabled systems like INOVA’s G3i HD need to be able to run surveys at all kinds of scales, be integrated with other types of sensor (for example with waterborne sensors for ‘transition zone’ surveys), or be used with cable-less systems - and productivity is very important.
Keeping productivity high “is a huge focus
for us,” he said.

Inova's G3i HD cabled seismic recording system - designed for high productivity
Meanwhile there is growing interest in cable-less technology for environmental and safety reasons, he said. It requires smaller crews.
The power consumption is reducing all the
time, so now devices are capable of 20-30
days operation in the field from one battery
charge.
With a cable-less system like INOVA’s
Hawk, you can do powerful quality control
and noise monitoring simply, via wi-fi, he
said.
UAVs
Companies are now showing interest in developing unmanned aerial vehicles (UAVs)
which can fly around the spread of wireless
devices, download data over wi-fi and do
basic quality control on it, he said.
Purchasing a fixed wing UAV, which can
carry a 5 lb payload and fly for 2 hours, costs
about $150,000, he said.
But they might be able to quality control 9-10 lines in a 2 hour period, compared to 1-2 lines using conventional methods.
“By integrating UAVs with cable-less systems, powerful status QC and noise monitoring can be achieved simply and even more efficiently, without the need for complex radio infrastructure,” he said.

But there are still regulatory obstacles to using UAVs in many countries, he said.

Big data

There is a lot of talk about increasing the number of channels (individually recorded data streams) in a seismic survey, but people have perhaps not taken into consideration how they will communicate and store all the data, he said.
With 250,000 channels, your data rate is 4 to
5 gigabits per second.
“That's a tremendous amount of data coming
up the cable. The system has got to handle
this.”
Typical tape drives can record 160 to 250
megabytes a second, and a solid state disk
can store 500 to 600 megabytes a second.
Disks in a “RAID” array could store 3 to 4
gigabytes per second.
But your data recording speed can be reduced by many factors including the processor, input-output, memory, cache, quality
control. “You see that throughput onto your
recording device really comes down,” he
said.
“We’re right on the edge of what's capable
and what's supportable in the field,” he said.
“It’s a very big challenge - taking this acquisition up to 500,000 and beyond that.”
In terms of data storage, 500,000 channels
could generate up to 86 terabytes of data over
24 hours. Over 6 months, that’s 12 petabytes
of data storage (one petabyte = 1000 terabytes). A petabyte costs about $250,000
today. “It’s a massive cost”, he said. “We're
going to be really challenged to solve this.”
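The arithmetic behind these figures is easy to sanity-check. A back-of-envelope sketch, assuming 24-bit samples at a 2 ms sample interval (our illustrative assumptions, not figures Mr Bull quoted):

```python
# Assumptions (ours, for illustration): 24-bit samples, 2 ms sampling,
# before any formatting or QC overhead.
SAMPLE_RATE_HZ = 500
BYTES_PER_SAMPLE = 3

def survey_numbers(channels):
    bytes_per_sec = channels * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE
    gbit_per_sec = bytes_per_sec * 8 / 1e9
    tb_per_day = bytes_per_sec * 86_400 / 1e12
    pb_per_6_months = tb_per_day * 180 / 1000
    return gbit_per_sec, tb_per_day, pb_per_6_months

for channels in (250_000, 500_000):
    gbit, tb, pb = survey_numbers(channels)
    print(f"{channels:,} channels: {gbit:.1f} Gbit/s, "
          f"{tb:.0f} TB/day, {pb:.1f} PB per 6 months")

# 250,000 channels comes out at about 3 Gbit/s before overhead,
# consistent with the 4-5 Gbit/s quoted; 500,000 channels gives
# roughly 65 TB/day and ~12 PB per 6 months - at $250,000 per
# petabyte, about $3m of storage.
```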
“Maybe we'll see a bit of a plateau in what people need in terms of channel count until they understand how to handle this volume of data.”
View Andy Bull’s talk on video and download slides at www.findingpetroleum.com/video/1095.aspx
Getting more value from seismic
In the current financial environment, oil companies need to improve the way they get maximum value from their seismic. Jane Hodson explained how to do this.
The best way to get the most value from seismic data is to have a strategy for managing
the data at the beginning of the project, said
Jane Hodson, head of technical services at
DataCo, and formerly lead subsurface technologist with Centrica, speaking at the Digital Energy Journal Aberdeen conference on
November 27, “Better ways to Manage Seismic Data”.
Delegates at Digital Energy Journal's Nov 27 Aberdeen conference, "Better ways to manage seismic data"

“That will give more value upfront than managing it at the end.”

“Who quality controls the seismic when it is uploaded? Is it the geophysicist? Is it the data manager?”

“I've had people coming to me saying, ‘I can’t see my seismic’ that they've loaded. I said, ‘You’ve loaded it in Norway, but it's in the UK sector, because you got the wrong UTM.’”

So the rule is: get it loaded by someone who actually knows how to do this - not everyone has that skill.
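A simple loading-time business rule can catch exactly this class of mistake. The sketch below checks that a delivered UTM zone is plausible for the licence area; the zone arithmetic is standard, but the tolerance and the workflow around it are illustrative assumptions:

```python
def utm_central_meridian(zone):
    """Central meridian (degrees east) of a UTM longitude zone."""
    return (zone - 1) * 6 - 180 + 3

def zone_is_plausible(zone, licence_lon, tolerance_deg=3):
    """Reject a delivery whose UTM zone puts it degrees away from
    the licence area - the 'Norway instead of UK sector' mistake."""
    return abs(utm_central_meridian(zone) - licence_lon) <= tolerance_deg

# A UK Central North Sea licence near 2 degrees E sits in UTM zone 31;
# a volume delivered referenced to zone 32 (Norwegian sector) fails QC:
print(zone_is_plausible(31, licence_lon=2.0))  # True  - accept the load
print(zone_is_plausible(32, licence_lon=2.0))  # False - wrong UTM zone
```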
Good data management is a key to opening up that potential, and we have an excellent supply of subsurface data management expertise in Aberdeen, Ms Hodson said. “With this we can be a global player in this field - we have the domain knowledge, we have the people and we have the technology.”
As well as reducing the amount of budget companies have available, the low oil price is also likely to drive companies towards acquisitions, mergers, sales and even decommissioning, she said, which all leads to big demands on data.
“With this current downturn in the market we have to start looking at ways to work differently - work smarter, faster and work with the data we already have,” she said.
Seismic data at the right time
Having well managed seismic data starts
with how the data is delivered into your system in the first place, she said.
A key question is who in the company first
receives the data.
“Does new seismic data get delivered to your
data management team, to the IM team, or
does it go straight to the geophysicists?” she
asked. “Does it get catalogued?”
“How do you know that they have loaded it correctly or have QC’d it? It’s imperative this is done, because if it is not, then this happens.”
If you get the data catalogued, loaded, verified and ready to interpret then you have
done your job right.
“In the end it is essential you have a detailed
catalogue of your data, whether that is seismic or well, or even documents. “
Dry wells or worse
“Have your archive strategy in place, because you never know when you might need
to access it again.”
“If you don't get the process right, things start to happen you wish had not.”
“The geophysicist does his/her interpretation,
creates all those lovely horizons and surfaces, and gives it to the geologist.”
“The geologist creates the geological model,
gives it to the reservoir engineer, and they all
decide where to drill the well. They drill the
well and then, oh, there's nothing there.”
Back to the drawing board because you did
not ensure the data was loaded correctly in
the first place. Lost time, lost money and lost
opportunity.
Data verification and data integrity are key things to be completed before any interpretation work is carried out; then hopefully you will have fewer failures.
Publishing and Archiving
“Is the seismic data delivered with everything attached, such as an acquisition report,
or a loading sheet? If it doesn’t have a loading sheet, how do you know what all the
data is?”
Do you have the naming standards in place
so you can identify the type of seismic you
have?
Whether or not you find oil, the seismic interpretation data needs to be captured and
published.
“And when all this is done, and the entire package has been tied up quite neatly, the last bit is to archive or sell! Whether it is archiving in your own internal storage system or an external one, you need to make that decision.”

“Companies spend millions acquiring seismic, and then they forget to do the critical publishing of the final piece of work.”
“As a final statement it means in the future
you can check your archives instead of
spending another five million pounds or
more on reshooting or purchasing seismic,
when you may have it already.”
IT
“So the geophysicist is on the workstation,
and wants to access the data, but [the network is] so slow it’s driving them crazy.”
“You can do as much as you want to clean up
those data, but if the [network] you have got
is not up to the job, then what’s the point?”
“Work with IT. Make sure that you have
everything in place like network connectivity. So when the geophysicists are actually
interpreting the data, they don't have to wait
10 minutes for each inline to display.”
Geophysicists are a valuable commodity in themselves, so wouldn’t you prefer they had the tools to deliver projects on time rather than sit around waiting?
You can view Jane’s talk on video at
www.d-e-j.com/video/1228.aspx
OCTIO - business case for permanent reservoir seismic
It may be easier to justify spending on permanent reservoir seismic if it is used for overburden monitoring, or monitoring drill
cuttings injection, rather than to monitor the reservoir, says Helge Brandsaeter of Octio
Octio produces a system with permanent digital cables on the seabed, which can make regular seismic surveys and monitor how the reservoir is changing.
Using digital cables on the seabed, to record
seismic data and get a picture of how the oilfield is changing as it is produced, is generally thought to enable an increase in oil and
gas recovery of 5 per cent over the lifetime
of the field, said Helge Brandsaeter, president of OCTIO, speaking at the Digital Energy Journal Stavanger conference on
December 10, “People and Subsea Data”.
But from an investment point of view, systems are not being installed because too
often it is seen as ‘nice to have’ rather than a
necessity, he said.
Overburden problems

The seismic recording can help spot problems leading to possible overburden leakage.

There have been a number of well publicised cases in Norway where a water injection well put more pressure into the rock than it was able to handle, leading to craters opening up in the subsea, he said.

Fields in the Barents Sea have shallow reservoirs, so there is higher risk.

“You see faulting going from the reservoir to surface directly,” he said. In reservoirs like this, “The overburden is as important as the reservoir."
In Brazil, the overburden is “young in geological terms”, which means that if water is injected at too high a pressure you can open pre-existing faults to the seabed, with water and oil seeping to the seabed.
By monitoring the seabed with digital cables, you can see how the rock is fracturing
and if there is a chance the fracture will
reach the surface. “It’s not difficult to see
that seismic can map any such seepage,” he said.

You can see how the cracks develop subsurface. “This will give the operational teams notice that something has to be done about the injection,” he said.
“It is reasonably easy to set up a business
proposition which is reasonably sound,” he
said.
Drill cuttings
The system is being used on the Oseberg
field (140km Northwest of Bergen), to monitor the injection of drill cuttings and waste
water.
Drill cuttings are milled very finely at the
platform and injected into the reservoir.
This provides a much less expensive option
for managing drill cuttings than transport
back to land.
“The average cost for a North Sea field for transportation of cuttings and water waste is close to $20m a year,” he said. “The alternative is to ensure safe injection, for $2m to $3m.”
But Norway has “zero tolerance” for any
waste pollution. “If you can't ensure safe injection you have to transfer all the fluids onshore,” he said.
Seabed infrastructure
Most oil and gas staff are busy meeting their
short term objectives, and don’t have time
for longer term ones.
Octio typically installs a mesh of seismic
sensors on the seabed, 50m apart, which
communicate with a hub and send the data
to surface.
But the short term business case can be
much stronger if you see it as a way to monitor the overburden (rock between the reservoir and seabed) and avoid costly
environmental fines, he said.
One system has 172 sensors in a W shape around the well on the seabed.

The sensors have four components - three-component accelerometers plus hydrophones.
Octio is majority owned by Statoil Technology Invest.
Developing a better business case for permanent reservoir seismic monitoring - Helge Brandsaeter, president, OCTIO
The sensors have an ‘active’ mode for
recording seismic data in a survey, and a
‘passive’ mode for continually listening to
seismic data.
Everything is managed remotely, with data
transmitted back to shore.
The installation of the system basically
comes down to the cost of leasing vessels,
and installation cost is a third of the lifetime
operating costs.
The data communications infrastructure can
also be used for any other subsea equipment.
“If you build such an area-wide infrastructure we can use it for all types of communications and types of sensors,” he says. “You can drop down a sensor and communicate to surface.”
“We have made basically an ethernet on the
seafloor, you can interface any system to
us,” he said.
Octio is working with a number of standardisation committees including SWIG (Subsea
Wireless Group) and SIIS (Subsea Instrumentation Interface Standardisation).
Watch Helge’s talk on video at
www.d-e-j.com/video/1502.aspx
Using acoustic fibre optics in wells
UK company Optasense reports that enormous progress is being made with using fibre optics in wells for recording seismic
data, monitoring, fracking and flow, among other applications.
Making progress with recording seismic in wells with fibre optics - David Hill, chief technology officer, Optasense
Using fibre optics to record seismic data in
wells has been done since 2010, but there
has been enormous progress made since
then, both with the technology and market
acceptance, said David Hill, chief technology officer of UK company OptaSense,
speaking at the Finding Petroleum London
conference on Nov 26, “New E&P Technologies.”
Seismic recording with optical fibre using a
technique called Distributed Acoustic Sensing (DAS) in wells is not as sensitive as conventional geophones, but the target is to
reach that level, he said.
Some people say the oil and gas industry
takes 30 years to adopt a new technology,
but the technology has already been widely
implemented in 5 years, he said.
You don’t need to drill a special observation
well, you just install a fibre-optic cable in an
existing producer or injector well. If there is
already a fibre-optic cable in the well (perhaps installed to monitor temperature), you
can use that.
The DAS technology has been used on one
offshore well so far, operated by Shell in the
Gulf of Mexico. Shell was conducting a
large seismic survey in the region and
wanted to see if it could get any value from a
fibre-optic cable which had been installed in
one of the wells a few years ago for a different purpose, he said.
“We were asked to attach our box to it, to
see if we could get any usable seismic.”
The recorded data was so good, that “one of
our employees had to spend seven weeks on
that platform, including over Christmas,
recording 50,000 shots.”
1400 channels were recorded simultaneously, from two 6km fibres.
In another project, in a mature field in
Oman, the customer wanted to monitor
where injected steam was going, so they
could plan an infill well strategy. To do this, fibre was installed in eight wells, a task which took about a month, and then a 3D DAS-VSP (Vertical Seismic Profile) was acquired simultaneously on each well.
The data was correlated and stacked as it was being recorded, enabling quality control to be done immediately.
After seismic shooting, the data is being
stitched together to get a field wide view, he
said.
The same system can be used for time lapse
seismic, comparing a seismic survey today
with the seismic survey at some time in the
past.
For this to work, the fibre probably needs to
be fixed firmly in the well (cemented or attached to the casing or production string),
not dangled on a wireline.
The technology could also be used to monitor CO2 storage wells, he said.
The repeatability of the recorded data helps
people gain confidence in it, he said.
The fibre can also be used for microseismic,
recording natural seismic energy, which can
also be used to understand the subsurface.
Recording passive seismic data in an oil well
is not a new idea. It is conventionally done,
by drilling an observation well and inserting
geophones in it, or installing the geophone in
an existing well. The work is “risky and expensive”, he said.
In one example, fibre was installed on a
highly deviated well, which meant that by
triangulation you could work out where the
source was.
Optasense currently has 180 employees, with offices around the UK and in Houston, Calgary, Dubai and Australia. It recently acquired two California-based companies: RIO (Redfern Integrated Optics), which produces the special laser which sends the light through the fibre, and SR2020, a specialist in high definition borehole seismic imaging and interpretation.
How it works
The technology works by firing a pulse of
light (laser) into the fibre.
The glass fibre is “the purest material man
has ever made,” but there is enough inhomogeneity in the molecular structure to cause a
small amount of light backscatter, he said.
The backscatter appears to be random, but it
stays relatively constant if the fibre is not
disturbed.
But if there is a tiny strain on the fibre,
which can include a strain caused by a noise,
the backscatter pattern changes slightly.
With a calculation involving the speed of light, you can work out at which part of the fibre that event happened.
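The underlying calculation is the standard optical time-domain reflectometry relationship: light travels down the fibre and back at c/n, so the round-trip delay of a backscatter change gives the event's position. A minimal sketch (the refractive index value is a typical assumption for silica fibre, not a figure from the talk):

```python
C = 299_792_458   # speed of light in vacuum, m/s
N_FIBRE = 1.468   # assumed refractive index of silica fibre

def event_position_m(round_trip_delay_s):
    """Distance along the fibre to a backscatter event, from the delay
    between firing the pulse and seeing the pattern change."""
    return (C / N_FIBRE) * round_trip_delay_s / 2

# An echo arriving 49 microseconds after the pulse places the event
# about 5 km down the fibre; split into 10 m sections, that 5 km
# behaves like 500 seismic channels.
print(round(event_position_m(49e-6)))  # ~5003 m
```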
The processing work will typically divide a
length of fibre into a number of sections and
work out the sounds which could be heard at
each section of the cable (so it is equivalent
to one seismic channel). For example a 5km
fibre can be split into 500 x 10m sections.
You can also monitor pressure. “In the future
we hope this technology will replace gauges
downhole,” he said.
The fibre can be installed by attaching it to
the production string, or cemented behind
the casing. It can also be run into the well on
a wireline or slickline. It has been used on a
well 7km long.
The technology has been used to monitor
electrical submersible pumps (ESPs),
analysing the noise to see rotational speeds,
and spot for any cavitation effects in the
motor.
The fibre-optic cables are very robust.
“Fibre is actually stronger than steel,” he
said. “These cables have been developed
over 20 years.”
In fracking, you can check the perforation
charges are firing properly, and then monitor
the flow through the perforations. You can
also listen to what is happening with a frac
job on a neighbouring well.
The fibre-optic cable can handle temperatures of up to 300 degrees C.
The fibre response is quite directional, and
not so sensitive cross axis – so it cannot determine which direction a sound is coming
from. One possible solution is to wind the
fibre around the well, he said, or try to develop a perpendicularly sensitive cable.
Fracking and flow
The DAS acoustic data can be used to monitor what is happening in the well, including
fracking operations, monitoring flow, water
loading and valve operations.
For example you can see a rising water level in the wells. You can also record the flow rate of oil and see the trends.

Sometimes wells gradually fill with water, and then the oil suddenly pushes past it, sending the water back into the reservoir.

You can monitor what is happening across inflow control valves (ICVs) and at the gas lift valves (GLVs) in order to make sure they are working properly.
Sometimes frackers push proppant into an
adjacent well, potentially blocking it, he
said.
It has been used to make sure plugs are set
properly, because of a particular sound signature which is heard when a plug is set.
Sometimes, the system detects wells having
sudden leaks, releasing fluid at high pressure
and then closing. This is something a standard sensor would probably not capture, he
said.
The system can’t do multiphase flow measurements, “but there are strong indications it might be possible,” he said.

“The technology has nowhere near fulfilled its full capability yet.”
Data
The company is trying to come up with a
standard way to define the data, so the data
can be transferred between systems.
The data files can be enormous, with one
well generating “a terabyte a day without
any problem,” he says. “That's only going to
get orders of magnitude worse as technology
progresses.”
To keep data files manageable, it is essential
to process the data at source, so you are only
transferring the much smaller processed data
files, he said.
In 2015, version 4 of the OptaSense DAS
system will be made available, with an extra
6dB signal to noise improvement and better
spatial resolution, he said.
Other applications
Optasense’s biggest business application for
the technology so far is for pipeline monitoring. It is currently installed on 12,000 km of
pipeline, to detect potentially damaging activity along the pipeline and detect leaks.
The technology is also installed around factories to monitor for people climbing over or cutting fences.
It is also being used for condition monitoring of risers.
View a video of David’s talk at www.findingpetroleum.com/video/1060.aspx
CGG - time for seismic on the cloud?
For economic reasons if nothing else, the oil and gas industry is likely to start moving its seismic data to the cloud, says
CGG’s Henri Blondelle
Many companies in the retail and banking
sector moved their data to the cloud in the period 2008-2009, driven by the need to reduce
costs after the 2008 crisis, said Henri
Blondelle, VP Global Business Development
CGG Data Management Services.
Perhaps now oil companies will move seismic
data to the cloud for the same reason.
He was speaking at the Digital Energy Journal conference in Aberdeen on November 27,
“Better Ways to Manage Seismic Data”.
The benefits of cloud data are well publicised,
but perhaps none of them motivate activity as
much as short term economic gain.
The financial case of storing data on cloud
servers, rather than on your own, is
quite strong.
If you want to store data in-house, your investment in buying storage equipment is initially quite high, followed by a small
additional cost each year for new data storage. So your purchasing costs are uneven.
You might choose to buy new equipment
again after 3-5 years.

Once seismic data has been moved to the cloud, you can gain many extra benefits, such as being able to do Hadoop processing and automatic data indexing - Henri Blondelle, global business development manager, CGG Data Management Services
Using a cloud provider you can keep the cost the same every year, and the cost will also decrease as the cost of storage equipment decreases. You only pay for the storage you need, and can increase or decrease it on demand.
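A toy cost model makes the shape of this comparison clear - lumpy hardware refreshes against a smooth, slowly declining pay-as-you-go charge. All the prices below are invented for illustration:

```python
def on_premise_cost(years, hardware=500_000, refresh_every=4, upkeep=20_000):
    """Lumpy capex: buy storage up front, refresh it every few years."""
    purchases = 1 + (years - 1) // refresh_every
    return purchases * hardware + years * upkeep

def cloud_cost(years, per_year=120_000, price_decline=0.9):
    """Smooth opex: pay as you go, with the unit price drifting down."""
    return sum(per_year * price_decline ** y for y in range(years))

for years in (3, 5, 10):
    print(f"{years} years: on-premise {on_premise_cost(years):,} "
          f"vs cloud {round(cloud_cost(years)):,}")
```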
“I have seen a lot of pilots, a lot of proof of
concepts, which for me prove that the industry will be ready in the next few months to
absorb this challenge [of putting seismic on
the cloud],” he said.
Once the data has been moved onto the cloud,
the industry can gain many further benefits,
already used by other industries, such as the
ability to run Hadoop based interpretation on
it, he said.
CGG works together with Microsoft Azure as a cloud service provider, and offers its customers a package with CGG Data Management services running over Microsoft Azure.
Security and control
Many oil and gas companies have chosen not
to move data to the cloud due to security concerns, an argument made stronger by many
recent cloud hacking stories.
To try to mitigate these concerns, cloud service provider Microsoft Azure recently conducted a survey of its existing customers to
ask them what they thought about security
after having moved their data to the cloud.
“All of them considered that the security,
after moving to the cloud, is better than before,” Mr Blondelle said.
“A majority (but not all) users consider that
they have more control on the data on the
cloud than with their previous (internally
hosted) solution,” he said.
There are many ways cloud service providers can improve security, such as storing data encrypted, ensuring the security of the data centre, and finding ways to transfer the data securely, he said.
Some companies might have legal requirements to store data in a certain country, or
might choose not to put some data on the
cloud because of slow internet connections.
Some of these problems can be resolved by having a partial cloud solution, he said.

Well logs and drilling data
CGG ran a pilot project in mid-2014 together
with Teradata mixing well logs and drilling
data covering the whole of the North Sea.
It worked with a very difficult drilling data set, mainly pdfs of well reports, with mud density data on them. “It’s very difficult to extract this information from a pdf,” he said.

The project team also tried to automatically extract data from well logs and well completion reports.

After extracting the data, efforts were made to try to find correlations.

Some of the correlations were as expected. “The drillers know already the formations which are difficult to drill,” he said.

But there was also some detailed information which could be very useful. “In this part of the North Sea, for this particular formation, if you use this particular mud density, you have an 80 per cent chance of a bad hole section,” he said.

“There is a possibility of bringing this quantitative information to the people who have to make a drilling program.”

Easier to work with

One often overlooked advantage of putting seismic data in the cloud is that it is easier to work with in many ways, he said.

Of course the cloud system can make it much easier to provide other parties with access to your data as required, including governments and joint venture partners.

For example, you can create automatic tools to monitor the quality of your data, and run business rules. One oil company has 6,000 business rules.

You can display data from many different servers together, for example if you have separate databases for well logs and seismic data files.

You can keep track of all the data processing tasks which have been done, which is very helpful if you have to discover an error somewhere in the process you want to roll back, and have had several interpreters working on data simultaneously.
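Business rules like the "6,000 business rules" example above are typically just named predicates run over each metadata record. A minimal sketch, with hypothetical rules and field names:

```python
# Hypothetical data-quality rules over a well-header record; real rule
# sets are many such predicates, each with an ID, severity and owner.
RULES = {
    "header has a UTM zone":   lambda rec: rec.get("utm_zone") is not None,
    "water depth is positive": lambda rec: rec.get("water_depth_m", -1) > 0,
    "spud date is present":    lambda rec: bool(rec.get("spud_date")),
}

def run_rules(record):
    """Return the names of every rule the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

well = {"utm_zone": 31, "water_depth_m": 92.0}  # no spud date recorded
print(run_rules(well))  # ['spud date is present']
```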
You can use sophisticated cloud based analytics tools which can interpret text.
You can use analytics tools such as Tibco’s
Spotfire or Tableau Software. These tools
make it easy to share the results of the analytics – unlike if you do the analytics running
over Petrel, which means that someone needs
to login to a Petrel workstation to view them.
Many interpreters “prefer to use something
that looks like Excel - like Spotfire or
Tableau,” he says.
Teradata and Hortonworks provide a range of statistical tools you can use.
You can store your seismic data in a Hadoop based storage solution, which gives you the ability to do some Hadoop analytics without downloading the data, including all the seismic processing and interpretation.

In one example, Statoil used Hadoop cloud storage to make correlations between changes in seismic data and pore pressure calculations.
Indexing
CGG has also looked at ways to make seismic data easier to catalogue and index automatically.
Indexing data “the classical way” generally
involves opening it, reading it, and entering
the relevant information in a database, which
can be time consuming, if you have millions
of files to index. As a result the task is often
not done.
So perhaps if the documents could be
analysed and indexed automatically, oil companies would index documents which otherwise would not be indexed at all.
These tools are already used by US law firms
to automatically classify reports on
cases, he said.
View Henri’s talk on video at
www.d-e-j.com/video/1223.aspx
Using surprise in subsurface knowledge research
Until now, enterprise search systems have focused on precision – helping you find exactly what you want. But our internet
search engines are moving more and more towards other ways of guessing what you might want. And surprising you with
what they find. Should enterprise search move in the same way? By Paul Cleverley and Simon Burnett, Robert Gordon
University, UK
Subsurface search needs to guess what people might be looking for - and surprise them with something else - Paul Cleverley, researcher at Robert Gordon University

The classic internet search engine, digital library and enterprise search have traditionally focused on precision and ranking. The rationale is that as long as the specific web page or document you were seeking is on that first page, it does not matter how many results are returned.

This approach has been incredibly successful, leading to Internet search engines like Google attracting a crowd nearing one billion users a week, of which 94 per cent never click past the first page of search results.

But increasingly with Internet search, smart algorithms recommend or suggest related information, trying to predict what we need or may find interesting.

In addition, social networks undoubtedly aid discovery. However, some researchers feel the overuse of historical usage and activity data within algorithms to make suggestions may place us in a ‘filter bubble’, constraining some potential serendipitous encounters.

Exploratory search

Exploratory search is where the question is not fully formed in the mind of the searcher. This is different to ‘known item’ (or lookup) search.

It is possible the actual need may in part be stimulated by the search engine itself, with the search engine acting like a creative member of the team, making suggestions from initial inputs.

Enterprise search

In an enterprise environment, significant frustration still exists where the success seen on the Internet seems harder to replicate inside an enterprise.

Factors for unsatisfactory retrieval include investment levels, organizational culture, the nature of workplace tasks, information governance and interventions, small crowds, information structure and permissions, along with the information behaviours of staff and management.

Faceted search

Faceted search shows a breakdown of what exists in the search results by various categories with counts, normally shown on the left hand side of the screen, inviting further human interaction to browse and filter results.

But it is difficult to represent the richness of a 50 page report with 6 topics. Furthermore, the same information item will always be represented by those same 6 topics, regardless of what search terms are used and where relevant matches are found inside the document.

One method to provide contextual based topic filters is word co-occurrence - using words that appear in proximity to the search terms found in documents.

These may be potentially useful options when you consider most enterprise searchers enter two words or less, searching increasingly larger haystacks of information, so most searches deliver hundreds or thousands of results.

Need to be surprising

Recent research by Robert Gordon University published in the Journal of Information Science identified certain information needs with respect to faceted search refiners.

Research was conducted using word co-occurrence stimuli generated from data provided by the Society of Petroleum Engineers, Geological Society of London and the American Geological Institute. The stimuli were used to gather survey data from 54 petroleum engineers from over thirty oil and gas industry organizations.

A need was identified for the ‘surprising’ as a search filter.

The research found the most statistically frequent associations (to search terms) were often “too vague and no promise of telling me anything I didn’t already know”, “relevant but not interesting” and “contained few surprises”.

However, algorithms such as the mutual information measure appeared to generate more intriguing associations - “useful for deep dives”, “might learn something” and “high on interestingness quotient, you can’t say where these results may lead you”.

Algorithms for surprising

Further research presented at the International Conference on Knowledge Management used discriminatory word co-occurrence techniques, surfacing potentially ‘surprising’ associations to search terms.

Initial results were promising. In an observational study of 53 geoscientists in two oil and gas organizations, 41 per cent felt current search interfaces used by their organization facilitated serendipity to a moderate/large extent, increasing to 73 per cent with the introduction of certain algorithmically generated filters.

As put by one participant, “It’s like open up the box for me and I’ll pick what does not fit with my brain, like one of those games”.

Surprising and serendipitous encounters occurred giving rise to learning experiences: “It is clear I underestimated the importance of carbonates in… this is immediately important for the research I am undertaking now”.

Surprising associations can be unusual words, or quite common words appearing in an unusual or discriminatory context.

For example, “What is interesting is that Halite is there for the Permian, but technically it could occur for Tertiary, Jurassic, (others), what is surprising is that it has not”.
This may be detached from any initial specific intent, the surprising nature of the association enticing the searcher to drill down
further which may lead to a serendipitous encounter.
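The mutual information measure mentioned above can be sketched in a few lines: pointwise mutual information (PMI) scores a pair of terms by how much more often they co-occur than chance predicts, which is what lets rare-but-strong associations beat merely frequent ones. A toy illustration, not the researchers' actual implementation:

```python
import math
from collections import Counter
from itertools import combinations

# Toy corpus: each document reduced to its set of terms.
docs = [
    {"halite", "permian", "evaporite", "seal"},
    {"halite", "permian", "salt", "dome"},
    {"sandstone", "jurassic", "reservoir", "seal"},
    {"carbonate", "permian", "reservoir"},
]

n_docs = len(docs)
term_count = Counter(t for d in docs for t in d)
pair_count = Counter(p for d in docs for p in combinations(sorted(d), 2))

def pmi(a, b):
    """Pointwise mutual information: log of how much more often two
    terms co-occur than independence would predict."""
    a, b = sorted((a, b))
    joint = pair_count[(a, b)] / n_docs
    if joint == 0:
        return float("-inf")
    expected = (term_count[a] / n_docs) * (term_count[b] / n_docs)
    return math.log2(joint / expected)

# Raw frequency would favour common pairings; PMI surfaces the
# tighter, potentially more surprising association:
print(round(pmi("halite", "permian"), 2))  # 0.42 - above chance
print(round(pmi("seal", "permian"), 2))    # -0.58 - below chance
```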
Enhancing creativity

What is deemed ‘surprising’ or ‘intriguing’ by one person may not be by another, as suggested filter terms are compared with the searcher’s own cognitive map, like a game of spot the difference.

The challenge with text co-occurrence is to decide what to present to the user, minimizing distraction but offering potential surprises, combining with traditional controlled vocabulary (taxonomy) metadata approaches.

Companies that adopt such practices may experience more “happy accidents” in the user interface than those which do not.
Paul Cleverley and Simon Burnett are
researchers in the department of Information Management at the Aberdeen
Business School at Robert Gordon University in Aberdeen, UK.
Hadoop for oil and gas
Hadoop, a new data storage technology, promises to make high volume data much easier to manage in the oil and gas
industry. Alyssa Farrell of SAS explains how it works
Hadoop is a new data storage technology
which may lead to a big change in how the oil
and gas industry manages big data.
High-volume data, whether structured or unstructured, can be placed into an enterprise
data hub (EDH) on Hadoop and prepared for
further analysis without the time-intensive
steps associated with the traditional relational
database model.
With a Hadoop “cluster”, both structured and
unstructured data can be managed for use
over long time horizons, in original fidelity,
and integrated with existing infrastructure and
tools.
With Hadoop, organizations have a new way
to think about data, transforming it from a
cost to an asset.
Geophysicists are already evaluating Hadoop
as a technology to support processing seismic
data from a multidimensional perspective,
complementing the traditional processing sequence.
“New technologies that reduce the latency of
data for analysis, such as cloud applications
and Hadoop, are significant game changers
for the oil and gas business,” said Moray
Laing, SAS Executive lead for Oil and Gas,
formerly at Baker Hughes.
“As a result, this industry is on the precipice
of major change in their IT architectures.”
Big data trends

Recently, The Data Warehousing Institute (TDWI) released "Managing Big Data," a report that explored trends in big data management. The report presents the findings from a survey of more than 400 practitioners about their big data efforts.

Respondents were asked which database management systems (DBMS) were in use for big data management efforts.

While traditional relational DBMS systems were at the top (38 per cent), Hadoop was tied for second place (33 per cent).

In addition, it was clearly evident that the phrase “big data” was synonymous with Hadoop in the minds of those surveyed - a meteoric rise for a technology that only became commercially viable in the last five years.

Analytics on Hadoop

SAS and Cloudera recently announced technologies that move analytic functions directly within a Hadoop cluster.

Deploying models directly in Hadoop reduces data movement and replication, saving time and resources - while strengthening data governance.

For many organizations, establishing an enterprise data hub using Hadoop will be a cost-effective solution for capture of all data, structured and unstructured, in a secure, managed environment. When paired with additional technology applications to ensure data quality, and to visualize and analyze the data effectively, Hadoop is ready for prime-time.

Software companies like Cloudera and SAS are working together to provide processes and technologies that accelerate data-driven insights.

Dave Cotten, whose team at Cloudera supports many US oil and gas companies, says that “Cloudera's oil and gas clients are realizing multiple revenue generating and cost savings opportunities.
“In addition, our customers typically improve
preventative maintenance, greatly reducing
costly downtime. “
“From real-time field operations feedback improving reservoir yields, to full-fidelity electronic well record management, to mining
internal and public data to determine optimal
well spacing, customers are obtaining deeper
insights at lower costs provided by Hadoop in
an enterprise data hub.”
digital energy journal - February / March 2015
With all your data in one place, simple tabular data can mix with more complex and multi-structured data to provide business insights never before possible.
Organizations can run a variety of enterprise
workloads, from batch processing to advanced analytics, in a secure, managed, governed environment.
Early adoption of analytics on Hadoop has been popping up in seismology, asset optimization, commodity pricing strategies, and production optimization.
SAS data visualization on Hadoop allows companies to interactively explore billions of rows of data in seconds.
One common use case is for data validation,
finding the outliers and flagging them for further explanation.
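As a sketch of that use case, the fragment below flags outlier pressure readings with a robust, median-based test; the data and column names are invented for illustration.

import pandas as pd

# Invented sample data: one reading is clearly suspect.
readings = pd.DataFrame({
    "well_id": ["A1", "A2", "A3", "A4", "A5"],
    "pressure": [205.0, 198.5, 201.2, 880.0, 199.9],
})

# Modified z-score using the median absolute deviation, which a single
# extreme value cannot mask the way it can with mean and std dev.
median = readings["pressure"].median()
mad = (readings["pressure"] - median).abs().median()
readings["flagged"] = 0.6745 * (readings["pressure"] - median).abs() / mad > 3.5

print(readings[readings["flagged"]])  # rows to send for manual review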
Because you can look at both structured and
unstructured data, like Twitter feeds or web
traffic, in one place over time, there are applications for cybersecurity as well.
Subsurface
EMC and the future of seismic data
David Holmes, chief industry executive with EMC’s Global Oil & Gas Program, explained what the future of seismic data
management will look like
Could future seismic interpretation be done by crowdsourcing, asked David Holmes, chief industry executive, EMC's Global Oil and Gas Program
At the recent Society of Exploration Geophysicists (SEG) event in Denver in October 2014, a company called Agile Geoscience ran a hackathon with 30 people in a room, asking them to write a 'supercool geoscience application'.
David Holmes, one of the judges, selected as the winner a crowdsourcing "hot or not" tool for other people's seismic interpretation. You log on to an online tool with your Google account, interpret some seismic data, and then rate other people's choices.
"This is the future, I'm convinced," Mr Holmes said, speaking at the Digital Energy Journal Aberdeen conference on November 27, "Doing more with Seismic Data."
"Companies should care more about this stuff. They have spent millions on acquiring it and the cost of managing it is an unmeasurable fraction of that," he said.
"Some companies do have a regulatory obligation to keep their data in perpetuity, and it doesn't mean a rusty 9-track no-one can read."
"A [typical] oil company is working with five seismic data storage companies, each with different cataloguing systems, all incompatible, three million media items, including two million 9-track tapes.
"There are a finite number of read heads for 9-track tapes," he said. "They are not being manufactured any more."
Some companies "are paying $10m a year in license fees for data they're not using but can't prove that they're not using," he said.
The trouble is, managing data is hard work, and it is always easier not to do it. "Companies ask, shall we spend lots of time and money doing something hard with intangible business value, or do nothing," he said.
There are many more exciting things the industry could do, if the seismic data systems
were on the cloud.
One US company put all of its seismic data
onto disk, and then hired 6 students from the
Colorado School of Mines and gave them access to the entire seismic library, telling them
“go and find some stuff.”
It is a "mystery to me why we keep standalone workstations going as long as we have," Mr Holmes said.
Geophysicists still work on personal workstations, where they spend 20 minutes loading
up all their data every morning. If they could
work directly on a cloud system it would be a
lot faster.
Managing old seismic
Yet most oil and gas companies still store their seismic data on tape, and have very little idea what they have, whether they are storing multiple copies of the same data, or whether they lack what they thought they had.
You can’t just copy seismic data from tape to
disk, because the data will get corrupted. It
needs to be transferred to a different format.
Together with the data, you need to keep a
scanned copy of the tape label, logs of the remastering process (gathering data from tape).
“You need everything in your possession that
will allow you to recreate that nasty crumbly
9 track tape,” he said.
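A sidecar record of that kind might look like the sketch below, written here as JSON from Python; the field names and file paths are invented to show the shape of the idea, not a standard.

import json

# Everything needed to reconstruct the original tape, kept alongside
# the remastered data. All names here are illustrative.
manifest = {
    "original_medium": "9-track tape",
    "tape_label_scan": "labels/tape_0421_label.tiff",
    "remastering_log": "logs/tape_0421_remaster.log",
    "output_file": "segy/survey_ns87_line12.segy",
    "remastered_on": "2015-02-10",
}

with open("tape_0421_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)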
Data management processes
In the future there will also be much stricter
data management processes, he said.
Many oil and gas companies already say that geoscientists may not load up data themselves; they must give it to a data manager to load it and validate it, he said.
There are software tools to make this process
easier, for example where new data is loaded
to a folder, then a data manager receives an email alert. The data manager can then check
the data formatting and co-ordinates, make
any necessary transformations and check the
headers.
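A toy version of that workflow, assuming a shared landing folder and a local mail relay (all paths and addresses are placeholders), could be as simple as:

import smtplib
import time
from email.message import EmailMessage
from pathlib import Path

INCOMING = Path("/data/incoming")           # shared landing folder
seen = {p.name for p in INCOMING.iterdir()}

while True:
    time.sleep(60)                          # poll once a minute
    for p in INCOMING.iterdir():
        if p.name not in seen:
            seen.add(p.name)
            msg = EmailMessage()
            msg["Subject"] = f"New data file to validate: {p.name}"
            msg["From"] = "[email protected]"
            msg["To"] = "[email protected]"
            msg.set_content("Please check formatting, co-ordinates "
                            "and headers before loading.")
            with smtplib.SMTP("localhost") as s:
                s.send_message(msg)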
Working with big data
Some physical data storage companies are taking advantage of oil companies' willingness to take the cheap and secure option over the short term. They offer a service where they store your data free of charge, but charge you big fees when you want to retrieve it, he said.
Some cloud data services are trying to get
away with the same business model. “The
cost of retrieving the data can be gigantic.”
The problem is that usage rates of seismic data are typically very low, with only small amounts of data retrieval over a time frame measured in decades, he said.
But slowly, attitudes are changing, as companies realise the risk of not properly maintaining data assets, he said, and cloud solutions
offer a cheaper alternative.
Move away from tape
Mr Holmes' recommendation is to move away from tape.
A side-effect of the growth of big data systems is that many companies now have multiple systems for storing data, including their normal archiving systems, high performance computing (HPC) environments and Hadoop environments.
They might have the same data file in all of
these systems. If they back up the data in each
environment multiple times, they can end up
with many copies of the data. “One company
worked out they would have 17 copies of all
of their data, if everything had gone well,” he
said.
As data volumes get bigger, keeping 17
copies of everything will get very expensive.
“If we have any chance of surviving the next
few years, it’s going to be crucial that we
have a single instance of our data,” he said.
“Or companies will make a fortune selling
you vast amounts of storage you don't need.”
A new term has been invented, "next generation data fabric", which describes the enterprise architecture for storing and managing information, he said.
Companies will also use ‘object storage’
which means that the analytics tools can understand the different data storage systems
you are using.
The idea of ‘master data management’ will be
redundant, because companies will be able to
search all of their data at once.
Geophysicists will be able to ask complex queries, like "show me all the files I have navigation for, which I don't know about." Or in more specific terms, "show me all of the navigation files which have a survey name which isn't in my survey master."
“You can run that simple query against
your entire landscape,” he said.
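In code, that query is an anti-join. The sketch below uses pandas purely as a stand-in for whatever engine sits over the data landscape; table and column names are invented.

import pandas as pd

nav_files = pd.DataFrame({
    "file": ["line01.nav", "line02.nav", "line03.nav"],
    "survey_name": ["NS-87-01", "NS-87-02", "UNKNOWN-XX"],
})
survey_master = pd.DataFrame({"survey_name": ["NS-87-01", "NS-87-02"]})

# Anti-join: navigation files whose survey name has no match in the
# survey master.
orphans = nav_files[~nav_files["survey_name"].isin(survey_master["survey_name"])]
print(orphans)  # -> line03.nav / UNKNOWN-XX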
View David Holmes' talk on video at http://www.digitalenergyjournal.com/video/1224.aspx
Developments at LMKR
Subsurface data and modelling company LMKR reports that it has formed a partnership with petroWEB, an oil and gas data
and information management company based in Colorado, Canada and Houston.
The agreement is for LMKR's "GeoGraphix"
subsurface interpretation system to integrate
with petroWEB's "Enterprise DB" exploration
and production data management system,
built on the PPDM model.
Enterprise DB can serve as a corporate well
master, well log repository and well file management system.
By putting Enterprise DB together with GeoGraphix, you have a single system for managing large volumes of subsurface data, LMKR
says.
LMKR has also formed a technology partnership with LUMINA Geophysical, a company
based in Houston which provides special
tools for quantitative interpretation of the subsurface, based on a mathematical method
called spectral decomposition. This allows
more geological information to be extracted
from geophysical data.
As a result of this partnership, LMKR will release a new software tool called Predict 3D in
early 2015. This is a multi-attribute inversion
solution based on spectral decomposition.
The multi-attribute inversion process predicts
rock properties from well logs away from and
in between the wells that have been used for
correlation. This helps understand the nature
of the reservoir away from the wells.
LMKR has also made improvements to the
user experience with its GeoGraphix 2014.2
subsurface software, improving seismic interpretation speed and capability, offering larger
memory access and up-front well loading optimization.
There is improved integration with depth geomodels. This means that users can easily depth convert faults and incorporate them into a sealed earth model, as well as combining seismic interpretation data and well picks into the geomodel.
Enhancements to velocity modelling and
depth conversion make it easier to depth convert seismic with velocity models generated
from defined horizon-formation top relationships.
LMKR has developed a 3D volume attribute
generation tool, which can manage both trace
based attributes (such as sample, window and
Hilbert), and frequency based attributes. All
of the frequency-based attributes use a
patented spectral decomposition technique for
higher resolution results.
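As a generic illustration of those two attribute families (not LMKR's patented method), the sketch below computes a Hilbert envelope and a simple short-time Fourier transform on a synthetic trace:

import numpy as np
from scipy.signal import hilbert, stft

fs = 500                                  # 2 ms sample interval
t = np.arange(0, 2, 1 / fs)
trace = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# Trace-based attribute: instantaneous amplitude via the Hilbert transform.
envelope = np.abs(hilbert(trace))

# Frequency-based attributes: a basic spectral decomposition by
# short-time Fourier transform.
freqs, times, spectrum = stft(trace, fs=fs, nperseg=64)
dominant_freq = freqs[np.abs(spectrum).argmax(axis=0)]
print(dominant_freq[:10])                 # dominant frequency per time step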
LMKR GeoGraphix 2014 also delivers tighter
integration between geological and geophysical interpretations.
"Optique" $17.5m EU oil and gas big data research
project now mid-way
"Optique", a 4 year, $17.5m oil and gas big
data research project co-ordinated by a University of Oslo professor, is now in the midway stage. The project has EUR 9.7m
funding from the European Union and is supported by DNV GL, Siemens, Statoil, and a
German cloud data company called "fluid
Operations".
Optique aims to develop a software platform
to help oil and gas users work with large and
complex data sets, using "semantic technologies", which includes data meaning as part of
the data model.
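Optique's actual platform is far richer, but as a loose illustration of "data meaning as part of the data model", here is a toy semantic query using Python's rdflib; all names and URIs are invented.

from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()

# Facts carry their meaning as explicit relationships, not column names.
g.add((EX.wellbore_15_9_19A, EX.hasPurpose, Literal("exploration")))
g.add((EX.wellbore_15_9_19A, EX.locatedIn, EX.NorthSea))

# End users can query in domain terms; here we query the toy graph directly.
results = g.query("""
    SELECT ?w WHERE {
        ?w <http://example.org/hasPurpose> "exploration" .
    }
""")
for row in results:
    print(row.w)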
The project was initiated in 2010 by University of Oslo (UiO) professor Arild Waaler, and
launched in 2012. Total funding is EUR 13.8m (USD 17.5m), including EUR 9.7m from the EU.
Professor Waaler believes that this system
will be different to other big data solutions, in
that it will focus on understanding the complexity (including the variety) of the data,
where most other big data solutions just focus
on working with large data volumes.
"Optique .. addresses trustworthiness by
showing where data came from and how it
has changed, providing transparency for the
end user," he said.
At the moment, geologists and engineers need to involve the IT department if they want to pose a complex query to their databases, but with the Optique system they can get answers in minutes, he believes.
"This will open up new exploratory and interactive ways of working as users get more relevant data sets in shorter time."
The results of the research are planned to be
presented in Høvik, Norway, in early 2015,
with an aim to attract more companies to get
involved, and ultimately develop methods and
technology which will be used by the industry
mainstream.
"We will deliver a good concept, but this will not be something that can be delivered to the industry two years from now," Professor Waaler said.
"I hope that by then [early 2015] we have
something so impressive that the industry will
want to continue to fund this project. I am optimistic."
Drilling and production technology
Sharecat - a faster way to make offshore databases
Sharecat is helping companies put together databases of technical information much faster, with global agreements with BP,
Shell and Statoil
Helping you put together offshore databases faster –
Sturle Drageset, sales and marketing manager,
ShareCat
Sharecat of Norway is helping oil and gas
companies put together structured information about all of the parts and components in
their offshore operations faster, by sharing
the data.
The basic idea is that Sharecat maintains its
own databases of standard information about
millions of different parts. If you (as a customer) have a certain part on your platform,
Sharecat might already have all the necessary
specifications and materials lists (items you
might need to purchase for it). So you can
use Sharecat’s data rather than collect it
again from scratch.
In a greenfield project, all suppliers, EPC
contractors and the end client will upload
data and documents, review and collaborate
in the same environment.
Sharecat has been gathering its equipment
catalogue for over 20 years and now has millions of pieces of data, said Sturle Drageset,
sales and marketing director with ShareCat,
speaking at the Digital Energy Journal Aberdeen conference on November 25, "Doing More with Offshore Engineering Data."
Sharecat is based in Bergen, with offices in
Aberdeen, London and Houston, and a reseller agreement with WorleyParsons in Australia.
The database can be used both in greenfield
projects, to help put together better part databases faster, and in brownfield projects, to
help operators clean their data up. It also provides associated consulting services.
Sharecat has global agreements with several oil majors, to put together a global catalogue of suppliers for Shell, and run a central information store for BP. It manages materials data for Statoil.
Sharecat also has a frame agreement with engineering company WorleyParsons, under which Sharecat provides specialist information management software and services for WorleyParsons to sell as part of its service to customers.
The services are made available over the cloud as Software as a Service (SaaS).
Data problems
It is too common for operators not to have
accurate records of what they have installed
on their rigs, he said.
"We see inspection reports stating missing certificates, missing history, impossible-to-trace parts in the material master, and missing documentation and numbering information," he said.
This might be expected, when you consider
that it is common for operators to take delivery of an oil rig or FPSO, but they do not receive the associated documents, part
information and materials lists, in a usable
format.
The problem can be ignored until it is time to
do modifications, but at this point, engineers
can spend 2-3 hours gathering specification
data (tags) for each component, before they
can put together a purchase order for new
materials.
Every time any information is missing, you need to search for it and involve colleagues, which takes up hours of expensive time. Poor information will lead to incorrect purchases, increasing the cost even more.
On greenfield projects, it should be much
easier to gather necessary information.
But operators are often overwhelmed by the
amount of data. The operator may deal with
only a small number of contractors directly,
but each contractor will go on to send hundreds (or thousands) of purchase orders to
their suppliers and manufacturers, which
generates an enormous amount of documentation.
Sometimes documents for a single component (such as a motor) will be sent back to the operator many times, because this component is used as part of many different pieces of equipment.
The information is sent by email, which
means it easily gets lost, and no-one is sure if
it is correct.
Meanwhile, the operator’s projects staff do
not necessarily have an incentive to make
sure that the data is good, because after the
project they will move onto a new project,
and leave the data problem to the company’s
operations staff.
It can take many thousands of man-hours to put together a complete parts database for a new offshore asset, he said, so it is not a surprise that the work is often not done, and the project is handed to operations staff with only 30 per cent of information available.
"We end up with poor and missing information in the material master," he said.
As the industry has to lower costs, the EPC (Engineering Procurement and Construction) companies must work more efficiently and at the same time deliver better quality. This is what Sharecat's products are tailored for, Mr Drageset said.
Drilling companies also struggle with procedures and routines for building up information correctly, he said. A lot of data is entered only as free text, with no possibility of retrieving crucial information for later maintenance, modifications and purchases of equipment and parts.
Sharecat
Sharecat's service is to reduce the overall workload, by maintaining and continually updating the 'shared catalogue' of part data and delivering quality data to clients.
So, for example, data about a part such as a specific ABB motor, which might be used in thousands of different pieces of equipment on many different offshore assets, needs to be entered only once and can be re-used many times.
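In miniature, the shared catalogue is a master record referenced from many places rather than re-keyed each time. A toy sketch, with all part numbers and details invented:

# One master record for the part...
CATALOGUE = {
    "ABB-M3BP-132": {
        "description": "ABB three-phase motor, 7.5 kW",
        "spares": ["bearing 6208-2Z", "seal kit SK-132"],
    }
}

# ...referenced by many asset tags instead of being entered again.
asset_registry = {
    "Platform A / pump 12": "ABB-M3BP-132",
    "FPSO B / compressor 3": "ABB-M3BP-132",
}

for tag, part_no in asset_registry.items():
    print(tag, "->", CATALOGUE[part_no]["description"])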
Sharecat provides templates which can be
given to engineering contractors so they
know what data they need to provide, and
which can be automatically uploaded into
Sharecat.
Sometimes the information is already in
Sharecat’s database. If Sharecat already has
both the part information and the material
master (information about materials which
might need to be purchased to go with that
part), there is no need to enter the material
master again.
All the time, the quality of information in the
database improves, in terms of accuracy and
completeness. “We can evaluate and improve
the information in the project as we go
along,” he says.
There is no need to keep sending data back to
the vendor for checking, and enormous volumes of data can be handled.
“Doing this, you will save a lot of time and
money, you will provide correct information
to operations.”
"We capture information into templates, process it into ShareCat, point out the incorrect information, then return it to whoever is responsible for managing the information. All processes are controlled by a powerful workflow tool," he said.
Delegates at Digital Energy Journal's November Aberdeen conference, "Doing more with Offshore Engineering Data"
Set requirements
With Sharecat, you can easily see whether you have the information you need, and if you don't have it, you can inform your suppliers.
It is important to make sure the suppliers are
aware of what they will need to provide in
advance. “You have to set up all the requirements at the beginning of the project, how
that is supposed to be delivered,” he said. “If
you start asking for information after the
work has been started it is too late.”
"You can report daily, to see how the different package suppliers progress on their deliverables."
If you don’t keep track of the quality of the
data, you can end up with data which is as
bad as if you were just receiving it ad
hoc by e-mail.
Watch Sturle's talk on video and download slides at http://www.digitalenergyjournal.com/video/1245.aspx
Drilling wells - with access to other companies’ data
Mohammad Jahangir, senior drilling engineer with oil company VNG Norge, Stavanger, shared his thoughts about AGR’s
“iQx” well planning software, which gives well planners access to all public data about wells drilled on the Norwegian
continental shelf, while they plan their own wells
Q What motivated you to choose iQx as
your integrated solution in well projects?
A We saw an opportunity to save our team’s
time and resources by using a software tool
that gave our engineers immediate access to
all public drilling data from the Norwegian
Continental Shelf (NCS). We also realised
that we could improve internal skills transfer
and learn from other operators’ operational
experiences.
As iQx has the functionality to visualise data
in any format saved in public or company
databases, the engineers save a lot of time
normally spent on manual data search and
processing.
Q What are the benefits of this solution?
A Being able to combine actual well data with location-specific experiences has allowed VNG to map potential challenges for the planned well more easily and to take the necessary actions to mitigate risk long before the project starts.
iQx has also been helpful in operations, enabling easy checks of abnormal well behaviour against relevant offset wells, and making it easy to capture experiences in a new way that involves the whole team and ensures their quality is up to the company's standards.
With iQx we easily capture the experience during the operation, classify it to the location, rig or relevant equipment, and finally approve the experience in the project to the company's standard, so that we ensure the "lessons learned" have value for future operations.
Q What do you use iQx for?
A VNG uses iQx for offset analysis and saving experience. This could be for planning
decided wells, following up our partners,
making sure we are investing in the best possible relationships.
Also for applications in licensing rounds,
making sure we apply in areas where we believe we can perform well and showing the
authorities that we fully understand the technical challenges in the areas we apply for.
Experience transfer is something that we
have seen as a challenge in several companies. Experiences are either left in the project files, never to be used again, or are of a
character or quality that makes them impossible to use.
Q Has it yielded any benefits / eased working processes in any way?
A After installing iQx we have managed to focus our engineers' time on engineering instead of manual and time-consuming data gathering and processing.
We have seen that working in such a way has increased the quality of our offset analyses, giving us a better platform for decision-making.
Better decisions lead to safer and faster operations where we ultimately ensure a safe
working environment for our people and
higher probability of success in the projects
that we choose.
Drilling and production technology
Books: Drilling Data Vortex, where bits meet the bits
Dr Carlos Damski, CEO, Genesis Petroleum Technologies, a drilling data management company based in Perth, has
published a book "Drilling Data Vortex, where bits meet the bits" - about how to use data to get a better understanding of
your drilling operations.
At a time when nearly every big business in the world is getting extremely advanced in its use of data for analysis and prediction, the drilling industry is starting to feel left behind. Drilling is still an environment where the person with 30 years' experience tends to win the argument, even if the data shows there is a better way of doing it.
The data management techniques are not particularly difficult technically - for example, you can crunch the data to show how your company is improving over time, which factors in the well construction process have the most variability in them, or whether a certain well construction happened within a reasonably expected range or should be considered a 'problem well'.
But what is proving much harder is actually having the data available which you can crunch.
Here, there are good and bad solutions, as Dr Damski explains. If you have a computerised system which will tell a driller at the end of a 12 hour shift that they can't file a report because the computer does not accept the format of a certain piece of data, you're likely to find that your system is not getting used. But you can appoint a dedicated data quality control organisation, with the role of checking data as it comes in and taking appropriate steps.
You don't want to throw any data away, even the bad data, but you do want to keep a central database of data which everybody can trust.
The results of the analysis can be enormously valuable: if it enables you to make better financial predictions about the cost of drilling a certain well in advance, you can predict likely problems or areas worthy of particularly high attention, and get a good understanding of where your drilling team is learning how to do it better, and where they aren't. The drilling sector has a lot of variability and unexpected events, and lots of scope for learning.
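A small sketch of that kind of crunching, with invented numbers: compare phase durations across wells to see which step is most variable and which well sits outside the expected range.

import pandas as pd

wells = pd.DataFrame({
    "well": ["W1", "W2", "W3", "W4"],
    "drilling_days": [21, 24, 23, 39],
    "casing_days": [6, 7, 6, 7],
    "completion_days": [10, 12, 11, 12],
})

phases = wells.drop(columns="well")

# Coefficient of variation per phase: the highest value marks the step
# with the most variability.
print((phases.std() / phases.mean()).sort_values(ascending=False))

# 'Problem well' screen: any phase more than 1.3 standard deviations
# above the mean (a loose threshold for this tiny sample).
z = (phases - phases.mean()) / phases.std()
print(wells.loc[(z > 1.3).any(axis=1), "well"])  # -> W4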
All of this raises an additional question who is going to implement all of this stuff?
There is still a big gap in the architecture of
professions - with drilling engineers on one
hand focussing on getting the drilling done,
and the IT department on the other extreme
often fixing problems and slow to respond to
requests to provide data. There is a growing
league of data managers but it does not seem
to be filling the gap, in drilling at least.
This book does a good job of making the
business case for more emphasis to be
placed on this middle space.
Perhaps most important of all - this book is
easy to read. So many oil and gas technical
writers seem to believe that their priority is
to be cutting edge technically, as though
they are writing in a quasi-academic world
where the most important thing is to say
something new, or which sounds new, rather
than to prioritise managing the mental load
on the reader. The industry needs more
books like this.
Available on Amazon in print ($27) and on Kindle ($23)
Drilling and production technology
Formatting data for different devices
Oil and gas employees want to access their data with many different devices with different screen sizes, with data filtered
according to their role. You can build a data structure to ensure everyone gets the data they want, for the device they use,
writes Dr Arup Ratan Ray
Dr Arup Ratan Ray
The Real Time Data Convergence for Multiscreen Displays (RTDCMD) framework gathers data from multiple sources scattered over
large, geographically separated areas.
This real-time data, once gathered, converges
into one central point before analytics are applied.
The data is then filtered to determine what
data is relevant to whom, before being shared
with different stakeholders in various locations. This filtration is essential, as different stakeholders will be interested in different data points and perspectives. Tailoring the data that is presented therefore speeds up the recipient's understanding by limiting the amount of irrelevant data to sift through.
Due to the sheer magnitude of an oil field, the
amount of data that could potentially be generated risks swamping the stakeholder if left
unfiltered. In contrast, applying analytics in-
stantly makes the data more manageable, enabling staff to react and take decisions more
efficiently in response to the data presented to
them.
The use of multiscreen devices that are now
popular with end users has amplified the
transformation in industrial monitoring. If we
look at how this applies to the oil and gas industry, the different stakeholders for the plant
as well as field personnel use different visualisation screens and hand-held devices. The
emerging IT and operational technology (OT)
ecosystem consists of devices, sensors, real
time data, analytic engines, always-on mobile
networks and powerful mobile applications.
Central database
The key to successfully using this ecosystem
is to accumulate the data in a large and central
database. The central database enables businesses to have access to a single source of
truth. Above this is a layer of real time analytics that distributes actionable insights to different screens for consumption based on user
roles and needs.
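A toy version of that distribution layer, with roles, areas and readings all invented for illustration:

# One shared stream of readings, filtered per user role before display.
readings = [
    {"tag": "wellhead_pressure", "area": "drilling", "value": 212.0, "alert": True},
    {"tag": "pump_temperature", "area": "production", "value": 71.3, "alert": False},
    {"tag": "gas_detector", "area": "hse", "value": 0.9, "alert": True},
]

ROLE_AREAS = {
    "drilling_engineer": {"drilling"},
    "production_engineer": {"production"},
    "hse_officer": {"hse", "drilling"},   # safety also sees drilling
}

def view_for(role, data, alerts_only=False):
    """Return only the readings this role is cleared to see."""
    areas = ROLE_AREAS.get(role, set())
    rows = [r for r in data if r["area"] in areas]
    return [r for r in rows if r["alert"]] if alerts_only else rows

print(view_for("hse_officer", readings, alerts_only=True))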
Benefits
Data is presented to users from a consistent
database enabling reliable collaboration.
Data is filtered and in many instances
analysed before it is sent to the end user. This
showcasing of only relevant data/alerts based
on access levels drives quicker decision-making and enhances data security. There is a reduction in data duplication and a concurrent
increase in data consistency which increases
the reliability, speed and efficiency of data
usage.
Technologies
The ability to integrate data and systems at all
three levels (machine, plant, enterprise) is reliant on the strength of technical competencies and partnerships around the RTDCMD
framework. The extensive ecosystem of partnerships required for this includes engineering and automation specialists and
Information Technology providers.
For example, a very large oil and gas major
with global operations leveraged RTDCMD
technology by creating a Collaborative Work
Environment (CWE). RTDCMD can be particularly helpful in complex environments
where safety is a critical factor. M2M platforms and applications, with RTDCMD as the
backbone, can orchestrate access controls and
authorisation levels to ensure accidents and
security lapses are eliminated.
ClampOn - subsea vibration monitoring
Subsea sensor company ClampOn of
Bergen, Norway, reports that it recently
completed a job to monitor vibration on a
subsea template at 330m water depth.
The operator had to close down production from a subsea well due to suspected vibration and needed urgent assistance to measure and confirm the actual vibration level.
Within 2 days, ClampOn was able to test, prepare and ship a complete vibration measuring system, with two vibration monitors, a specially adapted retrofit clamp, a 500 metre reel of cable, an ROV basket and a PC with ClampOn vibration monitoring software.
Two experienced ClampOn service engineers were mobilized and met the support vessel at the docks, where all the equipment was loaded.
Once offshore, the ClampOn vibration monitors were deployed and installed by ROV and connected to the ClampOn PC with software topside on board the vessel.
Data was gathered and sent in real time to ClampOn's file server, from which the operator's own vibration experts downloaded the data for further processing and analysis.
The vibration test was completed and the operator received confirmation that vibration on the subsea module was outside acceptable levels.
ClampOn has services to measure subsea vibration, sand production and leakages. Other typical subsea vibration jobs include vibration on large subsea flapper valves, vibration on subsea flowlines and vibration from chemical injections.
ClampOn Subsea Vibration Monitor, rental set.