Foundations, Volume 3, Issue 1


Foundations Journal of the Professional Petroleum Data Management Association

Print: ISSN 2368-7533 - Online: ISSN 2368-7541

Volume 3 | Issue 1

1st Place Foundations Photo Contest Winner; Apple & Eric Chabot, “Wild Life (Lion)”

“Necessary Evil Time” and Knowledge Workers
Throwing out the baby with the bath water…is there an alternative? (Page 6)

PLUS PHOTO CONTEST: This issue’s winners and how to enter (Page 16)


Information Certainty in an uncertain E&P world

Grow Business Value. Beat the Competition. Protect Capital.
Today, more than ever, E&P companies need to manage information like a corporate asset to support the business. Critical business decisions demand confidence in the underlying information. Data quality and information certainty are the only way to gain confidence in your critical business decisions. EnergyIQ delivers the information management solutions, experience, and resources you need to manage your most important business asset: information. Call us today to find your information certainty!

www.energyiq.info | 303.790.0919

Information Certainty


Foundations: The Journal of the Professional Petroleum Data Management Association is published four times per year.

Table of Contents — Volume 3 | Issue 1

COVER FEATURE
“Necessary Evil Time” and Knowledge Workers .......... 6
Throwing out the baby with the bath water…is there an alternative? By Simon Bates

GUEST EDITORIALS
Has the Courier Arrived Yet? .......... 4
Web based delivery can be much faster. By Guy Holmes

Three Points of View
About big data for oil & gas. By Jim Crompton

Data Management in the Business Units .......... 15
Lessons learned. By Dave Fisher

FEATURES
Beyond the Wellhead .......... 10
A high-level overview of facilities data. By Emile-Otto Coetzer

Ontologies and Data Models .......... 18
Essential properties and data modeling. By Mara Abel, Michel Perrin, Luan Fonseca Garcia & Joel Carbonera

Data Managers Look to Geothermal Reservoirs .......... 22
New drilling and completion techniques. By Jess Kozman

DEPARTMENTS
Creating a Recognized Professional Discipline .......... 14
Recognition is not achieved quickly. By Trudy Curtis

Photo Contest .......... 16
This issue’s winners and how YOU can get your photo on the cover of Foundations.

Hands On With The PPDM Association Board Of Directors .......... 21
Industry societies and their standards. By David Hood

Regulatory Data Standards Work Group .......... 24
A powerful value proposition. By PPDM Members & Staff

The Importance of Rules .......... 25
Super heroes and shiny new technology. By Trudy Curtis

Upcoming Events, Training and Certification .......... 28, 31
Join PPDM at events and conferences around the world in 2016. Learn about upcoming CPDA Examination dates and Training Opportunities.

Thank You To Our Volunteers .......... 30
Featuring Jennifer Bell & Alison Troup.

CEO: Trudy Curtis
Senior Operations Coordinator: Amanda Phillips
Senior Community Development Coordinator: Elise Sommer
Article Contributors/Authors: Mara Abel, Simon Bates, Joel Carbonera, Emile-Otto Coetzer, Jim Crompton, Trudy Curtis, Dave Fisher, Luan Fonseca Garcia, Guy Holmes, David Hood, Jess Kozman and Michel Perrin
Editorial Assistance: Emma Bechtel, Dave Fisher, Beci Carrington
Graphics & Illustrations: Jasleen Virdi
Graphic Design

BOARD OF DIRECTORS
Chair: Trevor Hicks
Vice Chair: Robert Best
Treasurer: Peter MacDougall
Secretary: Lesley Evans
Directors: Brian Boulmay, Trudy Curtis, Jeremy Eade, David Hood, Allan Huber, Christine Miesner, Joseph Seila, Paloma Urbano

Head Office
Suite 860, 736 8th Ave SW, Calgary, AB T2P 1H4
Email: info@ppdm.org
Phone: 403-660-7817

ABOUT PPDM
The Professional Petroleum Data Management (PPDM) Association is a global, not-for-profit society within the petroleum industry that provides leadership for the professionalization of petroleum data management through the development and dissemination of best practices and standards, education programs, certification programs and professional development opportunities. PPDM represents and supports the needs of operating companies, regulators, software vendors, data vendors, consulting companies and data management professionals around the globe.



Guest Editorial
Has the Courier Arrived Yet?
By Guy Holmes, Katalyst Data Management

I hear it on an almost daily basis. I usually stand for it and take it in without comment. Sometimes, if I am feeling energetic, I will try to correct the person, but usually I let it go in one ear and out the other. It is interesting how we form a belief, and then the longer we believe it, the more closed we become to the possibility of changing it.

Today we revel in how fast technology is changing and marvel at companies that disrupt conventional businesses. If you look at the sudden change that Uber has brought to the taxi industry, you will note that sudden disruption seems to force one to face and acknowledge the new reality. There are some businesses making slow and steady progress on mainstay technology where facing and acknowledging the change isn’t necessary. Because there is no sudden, life-changing shift, this slow but steady progress never forces the market to rethink what may now be possible as a result.

Storing and accessing seismic data via a web-based platform has been seen as impractical for many years. The reasons I hear most often are “the files are too big” or “my connection is too slow”, or a combination of both. Often, though, I can’t help but think that the person knows neither how big their files actually are, nor how fast their connection is. What I do know is that at some point in their life, they were probably right about it not being practical. The issue is that this is likely no longer the case or, if it is, it won’t be for long.

The typical way exploration companies store and access their data is by putting data on tape and storing it in offsite storage facilities. If a company wants to retrieve data from one of its tapes, it places an order for the tape to be retrieved, a courier van drives it to the office, and once it is delivered, the user places it in a tape drive and reads the data off the tape for use. Often, a whole tape has to be read only to get a small file. This process in its entirety takes an average of 32 hours from ordering the data to using it. Just like believing that only a taxi (not Uber) can get you to the airport, many still believe that a courier van with a tape in it (not the internet) is the only way to get access to their data.

The reality is quite different. While waiting 32 hours to get access to data may seem normal, web-based delivery can be, or already is, much faster. This is all thanks to the slow and steady increase in download speeds that has been achieved around the world. In some cases, when I say slow and steady, I actually mean orders of magnitude faster almost overnight without much fanfare. Looking at a typical business use case:
• Download speed is 25 Mbps (megabits per second)
• The user wants to get a file from a single 3592 tape cartridge
• The cartridge contains up to 500 GB of data.
As the graph below shows, access to files of up to 310 GB on a 25 Mbps connection is as fast as or faster than the age-old approach of courier vans. It is also:
• Often free to download the data,
• Does not use vehicles (better for the environment),
• Allows the user to download only what they need (unlike tape, which is often an all-or-nothing approach),
• In many cases can include added support data (observer logs, processing reports, etc.) in a single transaction, and
• Does not require the tape to be returned to storage when you are done, so a return courier run is no longer required (again, a thumbs up for the environment).

[Figure: Download time versus data volume. Data up to 310 GB is faster via the Internet; data greater than 310 GB on a 25 Mbps connection is faster by courier van. The “32 Hour Barrier” marks the average courier turnaround. Axes: Data Volume in GB (0 to 600) against Hours to Download (0 to 48).]

Let’s face it, there is a lot of data we need access to that is far smaller than 310 GB (navigation data, processing reports,


well logs, etc.) that will be many hours faster than the stock-standard courier vehicle approach. On the note of facing reality, let’s also grapple with the fact that Mbps will be yesterday’s technology very soon in many cities. The leap up to Gbps will see the same download of 300 GB take 30 minutes, rather than 30 hours. Add in the use of cloud-deployed processing and interpretation technology, and the need to download at all may become a thing of the past.

In 2016 we consume streamed movies, rather than rent a tape. We consume streamed and packet-downloaded music, rather than buy a disc. We play video games online rather than install them on our machines. As a matter of course, the consumption of data in the oil & gas industry will follow this same trend; it’s only a matter of time. Funnily enough, even though the facts about where we are heading seem pretty obvious, adoption will be slow. In the meantime, I am sure I will still hear complaints that waiting 32 hours to download a file is way too long, even though waiting 32 hours for a courier van is quite acceptable.
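As a rough sanity check on the numbers above, the sketch below computes download times and the break-even file size. Only the 25 Mbps link speed and the 32-hour courier benchmark come from the article; the 85% effective-throughput factor is an assumption added here for protocol overhead, and it is what places the break-even near the article’s 310 GB rather than a theoretical 360 GB.

```python
# Back-of-the-envelope comparison of Internet delivery vs. a 32-hour courier
# turnaround. The efficiency factor is an illustrative assumption, not a figure
# from the article.

COURIER_HOURS = 32      # average order-to-use time for a tape
LINK_MBPS = 25          # download speed in megabits per second
EFFICIENCY = 0.85       # assumed effective throughput after protocol overhead

def download_hours(size_gb: float) -> float:
    """Hours to download `size_gb` gigabytes over the assumed link."""
    bits = size_gb * 1e9 * 8
    return bits / (LINK_MBPS * 1e6 * EFFICIENCY) / 3600

def break_even_gb() -> float:
    """Largest file size for which the Internet still beats the courier van."""
    return COURIER_HOURS * 3600 * LINK_MBPS * 1e6 * EFFICIENCY / 8 / 1e9

if __name__ == "__main__":
    for size in (50, 300, 500):
        print(f"{size} GB: {download_hours(size):.1f} h online vs {COURIER_HOURS} h by courier")
    print(f"Break-even size: ~{break_even_gb():.0f} GB")
```

On a gigabit link the same arithmetic gives roughly 45 minutes for 300 GB, which is the order of magnitude the author points to for the near future.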

About the Author Guy has enjoyed a successful career in the international oil and gas industry for 17 years, having held professional roles overseeing logistics, data management, information technology, data preservation and recovery.


“Necessary Evil Time” and Knowledge Workers
By Simon Bates, DDC Ltd.

Throwing out the baby with the bath water…is there an alternative to losing expertise through job cuts?

In 2015, hundreds of thousands of workers in the oil and gas industry were laid off to save money, yet in the same moment countless years’ worth of capability and knowledge were lost. An oilfield services (OFS) company is using a concept that may help stem the tide of job cuts and this loss of assets. Several years ago they thought about how their engineers and technical sales staff, when in the office, struggled to keep up

with an important task: populating the company’s database with information from daily drilling reports (DDRs). These were in full or summarized form and gleaned from clients who were using their drill bits — a quid pro quo of the sale, providing important feedback for the OFS company to drive further sales and operational decision-making. It was clear at that time that information capture from DDRs was deadline driven — rushed and selectively done — which could not be in the company’s interests: the integrity and quality of this key


dataset was being compromised. The OFS company therefore engaged a third party to provide a dedicated, centralized, Philippines-based capture team to handle the DDRs for them, so they could secure a much more complete and accurate dataset, and so that their staff’s time in offices around the world could be freed up for more creative tasks. Connectivity was made through the client’s secure, global intranet. The rationale behind this looks odd, since it adds cost, albeit little: there is still the salary of the employee and related overheads plus now the outsourcing fees. Actually, the arrangement increases productivity of the engineers and salespeople — the in-house experts — at reduced cost per unit ‘output’ from their core work. This concept is industry agnostic and labelled Knowledge Process Outsourcing (KPO), which is also an example of the economic principle of “comparative advantage”. It addresses inefficiency and supports growth. Imagine the eclectic working day of such an expert and remember that he or she both costs money and generates wealth for the company (see inset). Hour by hour, things change. At lunch, experts are all cost and no business value. Yes, lunch is important, but it does not directly result in a report, decision or outcome on which the company depends to generate wealth, only indirectly by fuelling the body and brain. The same with other breaks: the ones that nature demands, coffee machines entreat, and karma occasions. (We know that eureka moments can occur at these times, but they are primarily moments of distraction.) Experts produce outputs directly only when they are busy doing what they were employed especially to do, where they fully express their raisons d’être. Raison d’Être1 Time (or RET, another acronym in the workplace, if there’s room) is when, for example, engineers apply themselves to operational imperatives and salespeople sell. Most of such experts do not have the luxury of flat-out RET, however. Their day is compromised by tasks they may


Your Average Expert’s Day (inset)

Activity                              Cost    Output    Hours
Raison d’être                         40%     100%      4
Lunch                                 10%     0%        1
Comfort, karma and coffee breaks      10%     0%        1
Necessary evils                       40%     0%        4

well regard as necessary evils, during what we could therefore call Necessary Evil Time (NET) — time spent reporting on or preparing for RET. During NET, experts have no measurable value. Suppose NET and RET each comprise 40% of the day, and interludes form 20%. We can express the cost per unit output in this business-as-usual scenario as 100% (cost)/100% (output) = 1.0 (unity). Of course, your day may look nothing like this, but hold on to the principle.

Part of NET involves administration, where the expert satisfies the needs of bosses and team staff. The other part is where the expert does research, data capture and cleansing, data retrieval and organization, preliminary analysis and other tasks that form a necessary precursor to RET, but which do not themselves tax the expert to the full extent of his or her talent, nor do they contribute directly to business value. When non-administrative tasks within NET are outsourced to a third-party knowledge worker, as the operative in KPO is known, the expert gets back additional time to contribute more to business value, and the cost of his or her unit output becomes proportionately lower as well.

Suppose, for example, that just one quarter of an expert’s NET (10% of daily time in our case) fully supports RET. Once that portion of NET is outsourced, leaving the rest still for the expert, RET could then be correspondingly raised, by transfer of that 10%, from 40% to 50% of daily time — a rise of 25% in RET. As the expert is now producing 25% more, the knowledge worker would work for not just 10% of daily time, but 12.5% (also 25% more); otherwise the expert would not be fully supported. Between the expert and the knowledge worker, the worked time is now over one day (112.5%), but this still represents a cost benefit per unit output, because the cost per unit output has dropped to 112.5%/125%, or 0.9.

Actually, knowledge workers based at an offshore location have an hourly cost of at most half that of the expert. The additional cost to the business is then not simply 12.5% but 6.25%, bringing cost per unit output down further, to 0.85 in our case.

Your Average Expert’s Day…With KPO (inset)

Activity                              Internal Cost    KPO Cost    Output    Hours
Raison d’être                         50%              0%          125%      5
Lunch                                 10%              0%          0%        1
Comfort, karma and coffee breaks      10%              0%          0%        1
Necessary evils                       30%              6%          0%        3

[Charts: “Outsourcing Necessary Evil Time (NET) of Experts” — “Your Average Expert’s Day”, “Your Average Expert’s Day…With KPO”, and “KPO Benefits: Increased Output, Reduced Cost per Unit Output”. As the share of NET transferred to a knowledge worker grows from 0% to 40% of the day, output rises from 100% to 200%, total cost rises only from 100% to about 140%, and cost per unit output falls from 1.0 toward 0.7.]

A knowledge worker would not work only 12.5% of a day! There are any number of experts being supported, so, with proper co-ordination, an appropriate number of full-time knowledge workers can be employed to support them. The number of knowledge workers needed is worked out according to the number and duration of tasks being outsourced across a group of experts, coupled to required service levels. Everyone’s figures will vary but the principle remains the same. Overall, it’s true that costs increase a little but the company becomes able to do comparatively much more business. This would not be possible in such an economic way by outsourcing entire roles, as this does not address the inefficiency associated with NET; it is also a minefield to lay people off only to replace them under an external contract.

The knowledge worker does not want to be some clone of the expert, but aims to pick up clearly defined and documented tasks that are not the best use of the expert’s faculties, yet which do require intelligence and induction into those tasks through knowledge transfer. Knowledge workers expect to be stretched to the extent that they will become experts in the scope of their role. Documentation of outsourced tasks brings context and rigour, along with the opportunity to apply standards and data management best practices as part of a company’s overall strategy in this area. Importantly, it provides a clear scope to the competency required. In general, knowledge workers are people of the right caliber who can understand the context of the tasks they do while applying documented procedures and rules, and sound, industry-relevant judgements.

KPO does not involve heavy funding or adverse risk to a business, because an

outsourcing vendor will contribute its own infrastructure; from a client viewpoint it’s the commitment to the outsourcing relationship that is really the key. Business managers might pause before adopting KPO for various reasons, one of which is a concern over the ability of external knowledge workers to live up to the standards of in-house experts. The example with which I began shows its potential: DDRs are not simple, transparent documents as you may know, but are littered with acronyms, units of measurement and technical information that need unpicking by people who know what they are doing, to the extent of compliance checking, and engaging in professional dialogue to clarify details with experts directly.

This centralized service has run continuously since 2009, taking in well data from fifty countries worldwide — sometimes even in Spanish and Russian Cyrillic — and continues to be appreciated by the OFS client as a time and money saver as well as a dependable resource: the quality of the data now in the database is acknowledged. This is such a valuable asset in competitive times and with such pressure on good decision making. In time of need, a leap of faith proved its worth. Trust in the outsourced team led to their engagement in consultation during the most recent major system upgrade: it’s a team that’s now part of the corporate infrastructure and there for engineers globally.

Experts — in the sense herein, and by extension people at all levels whose work combines NET with RET — find that knowledge workers can take up the mantle and liberate them from NET. KPO adoption is a forward-thinking concept that opens up new prospects for a company, through increase in output volume, data integrity and quality assurance, service expansion, employee satisfaction, and resilience to the cyclical pressure to simply cut jobs for savings in the short-term.
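Returning to the arithmetic earlier in the article, the figures generalize neatly. The sketch below reproduces them under the article’s own assumptions (a 40/40/20 split of the expert’s day and an offshore knowledge worker at half the expert’s hourly cost); the function itself is only an illustration, not part of any DDC tooling.

```python
# Worked version of the article's KPO arithmetic. Baseline day: RET 40%,
# NET 40%, breaks 20%; offshore knowledge worker at half the expert's rate.

RET, NET, BREAKS = 0.40, 0.40, 0.20   # fractions of the expert's day
KW_RATE = 0.5                          # knowledge-worker hourly cost vs. expert

def kpo_metrics(transferred: float):
    """Metrics when `transferred` (fraction of the day, 0..NET) of NET is outsourced."""
    output = (RET + transferred) / RET        # expert output vs. baseline
    kw_time = transferred * output            # knowledge-worker time, scaled to keep pace
    total_cost = 1.0 + KW_RATE * kw_time      # expert salary unchanged + outsourcing fee
    return output, total_cost, total_cost / output

for pct in (0.0, 0.10, 0.20, 0.40):
    out, cost, unit = kpo_metrics(pct)
    print(f"NET transferred {pct:.0%}: output {out:.0%}, total cost {cost:.0%}, "
          f"cost per unit output {unit:.2f}")
# NET transferred 10%: output 125%, total cost 106%, cost per unit output 0.85
```

At the full 40% transfer the expert’s output doubles while total cost rises to only about 140%, which is exactly the trend the KPO benefits chart shows.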


About the Author Simon Bates has nearly 30 years’ management experience of offshore outsourcing projects with the Philippines. His specialties include: Knowledge Process Outsourcing (KPO) in various market verticals; service scoping; technical and procedural documentation; and, external and internal customer support.

FOOTNOTE
1. Raison d’être – reason for being; the most important reason or purpose for someone or something’s existence.

2016 Calgary Data Management Symposium, Tradeshow & AGM

October 25-26

Telus Spark Science Centre

www.ppdm.org/calgarydms2016


Are trust issues affecting your relationship with your data?

Poor-quality data costs the average company $14.2 MILLION* ANNUALLY

From business decisions based on bad data. From bad data replicated across multiple systems. From manual data compilation and massaging for reporting.

Bad data is bad business. EnerHub™ from Stonebridge Consulting is a cloud-based enterprise data management solution for oil & gas. EnerHub validates and syncs data across your organization, connects source systems, and delivers the information that matters to your people. EnerHub’s open architecture plugs into your existing systems and enables you to elevate data quality to a corporate competency.

Get your relationship with your data back on track. Contact us to schedule an EnerHub™ demo. Business advisory and technology solutions for next-gen oil and gas

www.sbti.com | info@sbti.com | 800.776.9755 * Gartner, The State of Data Quality: Current Practices and Evolving Trends, Dec. 2013


Beyond the Wellhead
By Emile-Otto Coetzer, Chevron

In this introductory article, a high-level overview of Facilities Data is given. Many details are necessarily not included and the perspectives shown will vary across organizations. The author is indebted to many colleagues and friends across companies, continents and years for these perspectives.

WHAT IS A FACILITY?
From the perspective of the traditional exploration data manager, the term “facility” may be defined as “any physical infrastructure downstream from the wellhead to the sales flange”. Depending on context, this may be anything from a single separator with control equipment, all the way to a full production system; from the sea floor to a refinery. Terms like “Plant”, “Asset” or

“Gathering System” are also common. Different companies, or even different engineering disciplines, vary in what they consider part of a facility, and how much detail about it is necessary. For example, not all will include utilities, civil infrastructure, significant software infrastructure and control systems in their definition of a facility. Rules about what is included (or not) are driven by regulation, corporate requirement, maintenance or contracting strategy, regional precedent and other variables. Normally, the conventions applied at a specific asset are described in a “Tagging Philosophy” or some similar title. Such a document is therefore foundational to managing facilities throughout the life cycle of each asset.

WHAT IS FACILITIES DATA?
The definition of facilities data is equally varied across the energy industry. We distinguish here between the Projects (pre-operational) and Operational domains. The distinction is made because these domains generally differ about what constitutes high-quality facilities data. The Projects domain, where the


majority of the facilities reference data (often called the Engineering Library) is populated, tends towards an integrated definition across drawings, documents, structured data in databases and engineering models, and unstructured data of various formats. Leading-edge projects generate these information types in an integrated system used for the design process itself. Progressive engineering managers in this domain will add the first sets of operational information, such as construction and commissioning procedures and baselines, and even the projected maintenance and inspection programs.

The Operational domain often takes a tactical perspective of the subject, driven by the urgency of day-to-day issues and continuing pressure to improve operational results. The majority of focus is on maintenance and inspection databases, control system optimization and safeguarding systems. There is often a high integration with operational processes like Hazops (Hazards and Operability Analyses), SIL (Safety Integrity Level) assessments and the like.

A HOLISTIC MODEL OF FACILITIES DATA
Increasingly, databases and systems are integrated across business functions and disciplines, project phases and information formats. It is therefore necessary to view the subject in a holistic manner. A fragmented view greatly increases the likelihood of discontinuous or even contradictory database content. In a large facility, there are many dozens of software systems containing parts of facilities data, which gives an idea of how many stakeholders and variables need to be considered simultaneously. When one considers in addition the safety-critical nature of some elements of facilities data, it is clear that at least some elements require a great deal of scrutiny and maintenance. For these reasons, it is useful to present a holistic view of facilities data. One such view is shown in Figure 1.


[Figure 1: A Conceptual Model of Facilities Data. The figure arranges facilities data into a Business Intelligence Layer, a Transactional Data Layer and a Static Data Layer, spanning elements such as KPIs, Stage Gate Data, As-Built Data, the Maintenance and Inspection Programs, Design Basis, Process Alarm History, Isolation List, Schedule, Baseline Data, Management of Change Data, the Procedures Library and the Engineering Information Library, mapped against the Project Phases and Asset Life Cycle from Feasibility through Operations to Decommission.]

[Figure 2: An Interface Matrix of Facilities Data. The matrix lists information elements (Maintenance Strategy, Maintenance Program, Inspection Program, IT Strategy) and their detail elements (List of Equipment, Equipment Criticality Ranking, SIL Rating, Maintenance Task, Maintenance Frequency, Maintenance Procedure, FMEA, Work Order Start Date, Inspection Task, Inspection Frequency, Minimum Wall Thickness Calculation, Data Security, Interface Strategy), assigns each an owner (Operations VP, Maintenance Manager, Chief Engineer, Discipline Engineer, I&C Engineer, Maintenance Supervisor, Technician, Reliability Engineer, Maintenance Planner, Chief Inspector, Inspection Contractor, Regulator, CIO, IT Manager, Chief Architect), and marks, for each system (CMMS, Document Management System, FMEA Database, Inspection Database, Condition Monitoring Database, Engineering Library), whether that system is the master or a replicate for the element.]

This view is highly simplified and is intended as one possible view of what is in fact a very large and complex subject. The Engineering Library is shown in red. In this view, it contains all as-designed and as-built data associated with the asset. In some contexts this may be called “Engineering Master Data”. Ideally, it is generated in an integrated format from the appraisal phases of a project and maintained in the same format for the entirety of the asset life, not least to avoid the risks associated with ETL. Ideally, quality is maintained by a strict regime of Management of Change control. Furthermore, all Project and Operational databases using any of its content do so by replication from it, rather than generating separate sources of engineering (master) data.

Project information is shown in blue. Most of this data is of a temporary

nature, used during the execution of the project; most of its value diminishes rapidly after start-up. Operational domain data is shown in green. This data is generally transactional and supports the many processes specified in a corporation’s Operations Excellence or similar framework. Typical data sets could include:
• Operations Control narratives with supporting alarm and Integrated Operating Window tables
• Maintenance and Inspection Plans
• Management of Change records.
Clearly, the interface lines between these domains and classes of information are opaque at best, and considerable interface management is required. One technique to facilitate such interaction is shown in Figure 2.
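Because dozens of systems each hold parts of facilities data, it can help to treat an interface matrix like Figure 2 as data in its own right. The sketch below is illustrative only: the element, owner and system names echo Figure 2, but the master/replicate assignments are invented here to show the idea of auditing for missing or competing master systems.

```python
from collections import defaultdict

# (detail element, owner, system, master_or_replicate) -- illustrative rows only
MATRIX = [
    ("List of Equipment",             "Chief Engineer",        "CMMS",                        "master"),
    ("List of Equipment",             "Chief Engineer",        "Engineering Library",         "replicate"),
    ("Equipment Criticality Ranking", "Discipline Engineer",   "CMMS",                        "master"),
    ("SIL Rating",                    "I&C Engineer",          "Engineering Library",         "master"),
    ("SIL Rating",                    "I&C Engineer",          "CMMS",                        "replicate"),
    ("Maintenance Procedure",         "Maintenance Supervisor","Document Management System",  "master"),
    ("Work Order Start Date",         "Maintenance Planner",   "CMMS",                        "master"),
    ("Inspection Frequency",          "Regulator",             "Inspection Database",         "replicate"),
]

def audit(matrix):
    """Flag detail elements that have no master system, or more than one."""
    masters = defaultdict(list)
    for element, _owner, system, role in matrix:
        if role == "master":
            masters[element].append(system)
    for element in sorted({row[0] for row in matrix}):
        systems = masters.get(element, [])
        if not systems:
            print(f"WARNING: no master system defined for '{element}'")
        elif len(systems) > 1:
            print(f"WARNING: multiple masters for '{element}': {systems}")

audit(MATRIX)   # flags 'Inspection Frequency' (no master in this illustrative matrix)
```

Even a simple audit like this makes contradictory ownership visible long before it surfaces as misaligned database content.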

OBSERVATIONS ON THE STATE OF FACILITIES DATA TODAY
The Engineering Library generally receives most attention during the final stages of the Projects domain, prior to handover to Operations. Leading projects will have agreed on an Engineering Information Handover Specification in the opening phases of the project. If this was not done, much frustration, delay and unexpected cost invariably result.

Over time in the Operations domain, the Engineering Library is regrettably often neglected. Engineering drawings are often unchallenged and presumed accurate. Data types are often limited to data in the operational databases. Procedures often exist only in the form of documents. In extreme cases, both database content and presumed ownership are misaligned between the maintenance system and the Engineering Library. In addition, some of the contents of the Engineering Library find their way into several quasi-technical systems. In the worst scenario, the exact definition of an asset’s technical inventory is lost in the variety of systems. This situation is sometimes exacerbated by well-intended but fragmented attempts at data quality enhancement.

These challenges do, unfortunately, very often lead to inefficiency, unnecessary cost, and safety and production risk in operating facilities. This reality does not help the morale of technical and operational personnel. The author would venture that every engineer and technician in industry has at least one war story about wrong engineering information, and a number of high-profile investigations have recently highlighted this challenge.

In periods where more exacting analysis and more surgical decision-making are required in the energy industry, deterioration of foundational engineering data quality will add to the pressure. Likewise, it would be very difficult indeed to justify the retro-population of a neglected Engineering Library in periods of cost pressure. To some extent, the advent of data analysis techniques that require less




exact data repositories has mitigated the loss of decision quality that results from neglected data systems. This author contends, nevertheless, that certain categories of engineering data, such as those related to Process Safety, must not be allowed to deteriorate.

Energy corporations frequently define in deep detail the minimum facilities information they require to be accurate. Some variation exists; however, consensus appears to support at least the following information types, in no specific order:
• Basis of Design
• Process Hazard Analyses of various forms
• Process and Instrumentation Diagrams (P&IDs)
• Process Flow Diagrams (PFDs)
• Material and Heat Balances
• Asset Register and associated Maintenance and Inspection Programs
• Electrical Single Line Diagrams
• Control Narratives
• Shutdown Logic documentation
• Integrated Operating Windows
• Production, Budgets, Operational Performance
• Safety Critical Equipment List
• Pressure Regulating Valve Settings
• Safety Integrity Level Assessments.

The value of a quality Engineering Library is instinctive and significant. In general, the value of good engineering reference data centres on decreased safety and operational risk, reduced human error, better decision quality and internal corporate efficiency. Yet, curiously, published business cases for investing in facilities data are rare. This presents an interesting challenge for a manager who is seeking the millions of dollars it may take to rebuild a neglected Engineering Library.

A PROPOSED APPROACH
Imagine, then, a situation where Engineering and Operational leadership is faced with a very large, integrated and complex challenge to manage foundational engineering information. As shown, the integrated nature of facilities data means that there is a real possibility that well-intentioned but fragmented efforts will work at odds and even contradict each other. Without an umbrella of oversight, role clarity and lines of accountability, these efforts lead to inefficient application of scarce resources and consequent reduction of benefit realization.

The starting point of an effort to maintain or improve facilities data should therefore be the establishment of integrated governance at the executive level, to ensure a coordinated, resourced and prioritised effort. The notion of data ownership is implicit in governance. Clear governance helps ensure effective data delivery and coordination between operational and engineering teams. Also implicit is clear guidance with respect to prioritization of effort. In principle, information quality should be prioritized using the same framework as other risks to the enterprise. For the energy industry, this framework is likely to be something like:
• Information related to Process and Personal Safety
• Information related to License-to-Operate / Regulations
• Information about Production Critical Equipment
• Remaining information.

SOME INITIATIVES WITHIN FACILITIES DATA
Over time, every engineering company, energy corporation and project modifies existing information standards and data attributes. This in turn yields a perpetual industry of data mapping, re-mapping and formatting, costing hundreds of millions of dollars per year across the energy industry. In the worst examples, software update projects “archive” historical data, leading to much inefficiency for the subsequent users, who now have access to super-functional software but with poor data content. This problem has been recognized


for decades and excellent work has been done by many industry organizations. Amongst many others, examples are an early EIHS by EPISTLE, the Pipeline Open Data Standard, ISO 15926 for the Engineering Reference Library, ISO 14224 for maintenance data and the well-known PPDM work in the well space. In recent years, organizations like Fiatech and MIMOSA have worked towards an integrated view of the subject. Several operating companies are recognizing the need for standardization and reduction of duplication of effort. The establishment of the Standards Leadership Council (SLC) in 2012 has provided an overdue platform to coordinate the very many stakeholders in this domain.

SOME THINGS FOR FACILITY MANAGERS TO CONSIDER RIGHT AWAY
If these perspectives ring true for your specific asset, here are some immediate actions to begin the journey towards control of this business risk:
• Determine the level of the immediate risk in the Process Safety area. This can be done by testing the asset register accuracy, formal or informal data ownership and awareness of data quality, or by testing compliance with the testing requirements of safety devices.
• Raise the subject onto the executive radar. One way is to test the alignment of a small sample of important data across P&IDs and the maintenance asset register.
• Gather the list of software systems deployed at your asset that use Engineering Data and map these systems against a matrix like the one shown in Figure 2.
From these early beginnings, an eventual roadmap to risk reduction could follow.

About the Author
Emile-Otto Coetzer P.Eng is a Senior Advisor for Asset Reliability and Integrity with Chevron Canada. He lives near Calgary, Alberta with his family and a fly fishing rod.


2016 Houston

Data Management Symposium & Tradeshow THANK YOU TO OUR SPONSORS TITANIUM SPONSOR

PLATINUM SPONSORS

www.sbti.com

www.geoLOGIC.com

GOLD SPONSORS

www.noah-consulting.com

www.energyiq.info

SILVER SPONSOR

www.theddcgroup.com

www.microsoft.com

BRONZE SPONSORS

www.sevenlakes.com

www.pdsenergy.com

www.informatica.com

www.sigmaflow.com

THANK YOU TO OUR EXHIBITORS

www.ppdm.org/houstondms2016


S&T
Standards & Technology

Creating a Recognized Professional Discipline
By Trudy Curtis, PPDM Association

“A recognized professional discipline is created by a governed body of ethics-driven professionals who have an intentional and common purpose to develop, deploy and support a body of knowledge and professional development for the practice of data management as a professional discipline. A Professional Society is the most common framework within which this occurs.”

SUMMARY
Every year, our industry spends hundreds of billions of dollars creating, collecting or buying data to support the Exploration and Production life cycle. Shared among many stakeholders and complex processes over lengthy periods of time, this valuable resource is best managed holistically as a strategic asset. Even organizations that have successfully established data management as a corporate discipline recognize that the majority of data used within any company comes from sources outside their area of control. It is only through collectively developed and adopted, industry-driven data management expectations that we will fully achieve our objectives.

Since 1991, the PPDM Association has developed standards and best practices for data management. In 2007, the PPDM membership expanded its remit to support the professionalization of data management as a recognized discipline for industry. Many components are necessary for this to happen, including the development of regional and local communities, the creation of training and certification programs, the emergence of professional development pathways, and the growth of a body of knowledge for petroleum data managers.

“The objective of data management is to create an environment where trusted data remains available and accessible to all stakeholders through the E&P life cycle without being lost or corrupted.”

INDUSTRY’S STRATEGIC VISION
Professional recognition is not achieved quickly. The illustration below shows the progressive development plan of the PPDM Association; our programs have been grounded in these strategies for a decade. The iterative PPDM approach is multifaceted, with each element in the strategy providing the framework within which the next elements are grounded.

[Illustration: PPDM’s progressive path to professional recognition. The Community layer comprises the Community of Practice, Local and Regional Events, Leadership Teams, and Publications & Communication. The Professional Discipline layer comprises the Body of Knowledge & Best Practices (disambiguation, best practices), Standards & Specifications, Professional Expectations, and Organizational Governance. The Professional Recognition layer comprises Professional Development, Training & Education, Certification & Maintenance, HR Support (careers, salaries), and Professional Governance.]

1. Create the Community of Practice: Every discipline starts with the growth of an intentional and purposeful community of data managers who build personal, professional and technical relationships with each other.
   a. Communities are built regionally, each with a strong and committed leadership team.
   b. Communications are fostered through technical publications, social media and professional journals.
   c. Relationships between service providers and service consumers are fostered in a neutral, collaborative environment.
   d. Leadership is provided by a strategic board of directors who guide the community toward success.

2. Inward Facing Initiatives Drive out the Body of Knowledge, Standards, and Best Practices: Subject Matter Experts (SMEs) in the community of practice are recruited to identify or create a family of products that support and sustain a foundation of practice for data management that is the core of a recognized professional discipline. The existence of a professional discipline is predicated on the existence and use of these materials as appropriate.
   a. Best practices: This is foundational knowledge for a data manager in the practice of their profession. What is a Well and the Business Rules Repository are good examples of foundational best practices.



   b. Standard Specifications: These specifications are useful tools that a data manager can use to fulfill certain functions and to ensure that data is interoperable, accessible and available to all stakeholders. The PPDM Data Model is an example of a Standard Specification. Members of the Standards Leadership Council also develop and promote useful standard specifications.
   c. Professional Expectations: In the practice of any profession, it is necessary to determine what constitutes appropriate expectations. These expectations support the necessary trusted relationship between data managers and their stakeholder customers.

3. Outward Verification of the Data Manager’s Portable Skills and Knowledge: Recognition of a trusted and useful professional discipline is grounded on a clear validation that practitioners understand and follow industry best practices, and that training, education and qualification opportunities are available to (and used by) practitioners.
   a. Training and Education: Industry training programs, along with post-secondary programs, include data management elements that are aligned with industry expectations in their curricula.
   b. Certification and Professional Development: Certification programs validate the data managers’ skills and knowledge and enforce the expectation for continuous professional development.
   c. HR support: Standardized job descriptions, salary surveys, career ladders and other support materials help Human Resources build and maintain a competitive environment for data managers.

“The Data Manager’s objectives must both support and supersede the individual needs of each stakeholder group or process, to ensure that trusted and appropriate data is stewarded for all stakeholders in support of corporate and industry goals.”

If you are interested in learning more about these programs, or would like to join the community of hard-working professionals who are driving this out, please contact us. The PPDM Association is your community for professional recognition.

About the Author
Trudy Curtis is the CEO of the PPDM Association.

Data Management in the Business Units
By Dave Fisher, PPDM Association

PROCESS
At the PPDM Calgary Data Management Symposium in 2014, everyone gathered in table groups to consider ten things about data types. Each table was given a subject area (well logs, seismic operations, etc.) and answered this question: “What 10 things should a data manager pass along to a person newly assigned to this subject area?” The assumption is that the incoming data manager knows the principles and practice of data management but is unfamiliar with the unique data types in this subject. A transcription of all the ideas is on the Past Events web page at www.ppdm.org under Events. These discussions seem to reveal two common themes.

KNOW THE DATA
To meet the needs of the stakeholders and maintain the quality of the data, the DM must have sufficient understanding of the data and how it is important to the business.
• Understand the data’s value to the business
• Maintain communication with the stakeholders (internal and external)
• Be familiar with the essential technical aspects of the data
• Understand the life cycle of the data
• Appreciate the variations related to time, geographic area, regulations, business unit needs.

HANDLE THE DATA
In addition to data management principles and skills, some important matters are specific to each data type.
• Workflows – source / load / QC / deliver / archive
• Compliance – data governance, internal/external regulations, customer expectations
• Data quality rules (see the sketch below)
• Systems of record and archive
• Responsibilities, accountabilities, metrics and reporting.
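As a small illustration of the “data quality rules” point above, rules can be expressed as named checks that any record either passes or fails. The rules and field names here are invented for the example and are not drawn from any PPDM specification.

```python
# Minimal sketch of data quality rules as named, testable checks.
# Field names and thresholds are illustrative only.

WELL_RULES = [
    ("UWI is populated",            lambda w: bool(w.get("uwi"))),
    ("Surface latitude in range",   lambda w: -90 <= w.get("surface_latitude", 999) <= 90),
    ("Spud date before completion", lambda w: w.get("spud_date") is None
                                              or w.get("completion_date") is None
                                              or w["spud_date"] <= w["completion_date"]),
]

def check_well(well: dict):
    """Return the names of the rules this well record fails."""
    return [name for name, rule in WELL_RULES if not rule(well)]

record = {"uwi": "100/01-01-001-01W1/0", "surface_latitude": 51.05,
          "spud_date": "2015-03-02", "completion_date": "2015-04-18"}
print(check_well(record))   # [] -> the record passes all three illustrative rules
```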



Photo contest

Foundations photo contest

“FATHER AND SON CANOE TRIP, SQUARETOP MOUNTAIN, UPPER GREEN RIVER LAKE, BRIDGER WILDERNESS, WYOMING” BY RAYMOND OBUCH 2nd Place in the Volume 3, Issue 1 Foundations Photo Contest “A few College friends and their sons got together for a 4 night trip to Upper Green River Lake, Bridger Wilderness Wyoming. Green River Lakes form the headwaters of Green River which eventually makes its way to the Colorado River.” – 2004




On the cover:

“WILD LIFE (LION)” BY APPLE & ERIC CHABOT 1st Place in the Volume 3, Issue 1 Foundations Photo Contest

“Wild Life and Nature Capture” – 2015

Enter your favourite photos online at photocontest.ppdm.org for a chance to be featured on the cover of our next issue of Foundations!



S&T
Standards & Technology

Ontologies and Data Models
By Mara Abel (UFRGS), Michel Perrin (Geosiris), Luan Fonseca Garcia and Joel Carbonera (UFRGS, PhD students)

ESSENTIAL PROPERTIES AND DATA MODELING FOR PETROLEUM EXPLORATION
Petroleum exploration and production rest on reservoir models that integrate a large set of data of various kinds. The common backbone of this data is the object of the modeling itself: the reservoir and the geological properties attached to it. Each category of professionals involved in the reservoir study views this reality according to some specific field of knowledge. These specialists thus generate various sets of data, each resting on a different conceptualization of one and the same object: the petroleum prospect. The resulting data models can be efficient in serving a particular application, but they are hardly interoperable and thus difficult to use in federated software environments. In view of this situation, petroleum exploration appears to be a domain rich in challenges related to conceptual modeling and data integration, in which ontologies can play a central role.

ONTOLOGY DEFINITION
Ontology is a branch of Philosophy that studies the meaning of existent beings. In Computer Science, the term ontology has been used to designate an artifact (a file, a description, a representation) that formally describes, in a computer language, a set of concepts whose meaning is shared by a community of practitioners. Significant progress was made in the field of ontology studies in the late 90’s, when Nicola Guarino analyzed the various meanings in which the word ontology was being used (Guarino 1998). He insisted on the idea that an ontology is, primarily, a logical theory accounting for the intended meaning of the formal vocabulary utilized by a community for naming representations in its domain. Guarino borrowed from Philosophy a few meta-properties such as identity, unity, rigidity, and dependence (Guarino & Welty 2000), which greatly help to clarify the meaning of the concepts that are currently expressed by means of domain ontologies in the various fields. We intend to demonstrate here, through a gentle introduction to two of these metaproperties – rigidity and dependence – that analyzing information through the lens of ontological metaproperties, as proposed by Guarino, can help reduce both the complexity and the ambiguity of data models.

THE USE OF ONTOLOGICAL METAPROPERTIES IN MODELING
The first useful ontological notion is essence. According to Guarino & Welty (2004), a property attached to an entity is essential to this entity if it must hold for it in every possible world. For example, being crystalline is an essential property for a mineral but it is not for a gemstone, since we can produce gemstones from non-crystalline material, like amber. When a property is essential for every instance that can exhibit it, we say that this property is rigid. The notion of property, in Philosophy, refers to every predicate


that can be applied to a given individual (or instance), like “being a horse”, “being a mineral” or “having a brain”. In our example, “being crystalline” is essential for minerals, but not for other substances, like glass, so it is not a rigid property. Considering another example, a human being is an instance of the concept person, and a human being is a person throughout his or her life (and even after). The quality of “being a person” is therefore rigid, since no instance of human being can stop being a person. Conversely, being a student is not a rigid property, since someone can stop being a student without ceasing to exist. A piece of mineral cannot stop being a mineral, but an entity that we consider to be a gemstone has not been a gemstone all along its existence, since it was not one before having been cut and polished in order to be used in jewelry. Student and gemstone are defined by anti-rigid properties that define roles for other entities, like persons or mineral pieces.

The notions of essence and rigidity help in identifying the concepts in the domain that provide identity to individuals and can be tracked in the models. They thus allow one to identify vocabulary practices that may cause ambiguity, like naming instances of a domain according to anti-rigid properties and building models over anti-rigid concepts. For example, naming a person a “client”, a geographic area a “prospect”, or a geological unit an “economic target” hardly helps in producing long-term integrable models.

In the field of data models, considering essential properties allows one to correctly identify entities and to produce a more precise representation, which facilitates further integration and interoperability. We will analyze here a simple example related to petroleum exploration: the modeling of the entity reservoir¹. In the context of petroleum exploration, a reservoir is a volume inside a prospect which may contain petroleum and water. For modeling it, we must examine whether the property of “being a reservoir” is rigid or not. In other words, we should decide whether some entity called “reservoir”



may stop being a reservoir and still exist. The answer strongly depends on the modeler’s conceptualization of a reservoir. Some geologists may simply define a reservoir as a portion of rock having high porosity/permeability. This definition is rooted in intrinsic properties of the entity (porosity and permeability) that cannot be lost². In this case, “being a reservoir” is a rigid property. This first conceptualization produces the model shown in Figure 1(a).

[Figure 1: Alternative models for the entity Reservoir based on (a) intrinsic essential properties (Reservoir with Porosity and Permeability) or (b) external dependence (Reservoir with Porosity and Permeability related to a Fluid with Chemical Composition).]

However, some other geologists will consider that a portion of rock with high porosity and permeability is not a reservoir until its voids actually contain petroleum or water. This second definition implies that an instance will stop being a reservoir if it stops having water or petroleum inside its empty voids. If a reservoir is exposed to air, it will lose its content of petroleum or water, but the volume of rock to which it corresponds will not disappear. According to our second model definition, however, it will stop being a reservoir. The property of being a reservoir in this second model is anti-rigid. It is just a role of some existent portion of rock, which should be considered an instance of another concept, such as Rock body, and modeled in this way in the data model. As shown in Figure 1(b), this second model requires the modeling of a second entity, petroleum or water, which specifies the relational dependence that affects the instance of the reservoir that we consider. Any instance of an anti-rigid concept has a relational dependence on some instance of another concept. It can exist only if the relationship exists. For example, a “student” cannot be a student if there does not exist some school or university in which he/she is registered.

In our second model, an instance of reservoir cannot exist if there does not exist a fluid (water or petroleum) inside its voids. Deciding what is the rigid entity that provides identity to the several roles that an instance can assume is a central task in producing precise and efficient data models. The taxonomic (or hierarchical) structures that are defined determine the subsumption relations that can be established between the various entities. Entities defined by anti-rigid properties cannot subsume entities (i.e., cannot be the super class of entities) defined by rigid ones (Guizzardi & Wagner 2005). Let us consider the schema shown in Figure 2, which intends to model the variety of reservoirs that are explored in a petroleum company.

[Figure 2: Wrong use of the subsume relation: a class Reservoir subsuming the sub-classes Sandstone and Fractured schist.]

The model shown in Figure 2 is wrong because the class Reservoir cannot subsume the sub-classes Sandstone and Fractured schist. The reason is that, according to the schema of Figure 2, the extensions (instances) of Sandstone and Fractured schist should also be extensions of Reservoir, but this is not right, since these rocks do not always constitute reservoirs. Following the design pattern proposed by Guizzardi in (Guizzardi & Wagner 2005) for dealing with such cases, we propose a better model in the schema of Figure 3. In this schema, the entities marked in grey are defined by rigid properties.

[Figure 3: Conceptual modeling based on ontology properties. A Reservoir role (Porosity, Permeability) has an existential dependence on a Fluid (Density, Viscosity; specialized into Petroleum and Water); the rigid kinds Sandstone and Schist (Composition, Texture) are specialized by the roles Sandstone Reservoir and Fractured Schist Reservoir; entities defined by rigid properties are marked in grey.]
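As a rough illustration of the pattern behind Figure 3 (the class and attribute names below are hypothetical and chosen for the example, not taken from the article), the rigid kind carries the intrinsic properties, while the reservoir role exists only while its existential dependence on a contained fluid is satisfied:

```python
# Sketch of the rigid-kind / anti-rigid-role pattern for the reservoir example.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Fluid:                       # e.g. petroleum or water
    kind: str
    density: float
    viscosity: float

@dataclass
class RockBody:                    # rigid kind: identity does not depend on contents
    lithology: str                 # "sandstone", "fractured schist", ...
    porosity: float
    permeability: float
    contained_fluid: Optional[Fluid] = None

class Reservoir:
    """Anti-rigid role: existentially dependent on a rock body holding a fluid."""
    def __init__(self, rock_body: RockBody):
        if rock_body.contained_fluid is None:
            raise ValueError("A rock body with empty voids does not play the reservoir role")
        self.rock_body = rock_body

sandstone = RockBody("sandstone", porosity=0.22, permeability=150.0,
                     contained_fluid=Fluid("petroleum", density=0.85, viscosity=2.0))
reservoir = Reservoir(sandstone)   # valid only while the fluid is present
```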

ADVANTAGES OF ONTOLOGICAL ANALYSIS
Ontological choices are not only an academic issue related to different modeling options. These choices have practical consequences for model usage and data consultation. In our example, the option of considering the reservoir as dependent on the entity fluid allows one to create instances of fluid types or occurrences and associate them with a particular instance of a reservoir. The first modeling option does not allow this usage. Moreover, model ambiguity can be reduced when the meaning of the represented entities is made explicit. This prevents the same vocabulary from being used to refer to two or more concepts that modelers or users consider to be distinct. We additionally claim that providing a common framework based on essential entities reduces the number of entities and the complexity of the resulting model.

Other ontological metaproperties deserve closer analysis in conceptual modeling. In Petroleum Geology especially, properties like identity and unity can help to define exactly which entities of reality are being modeled in the database, and they provide good support for integrating models at the several scales of analysis (microscopic, well, reservoir and basin scales) across the petroleum chain. These metaproperties will be the subject of a further discussion.

About the Authors
Mara Abel is a geologist and doctor in Computer Science. She is an Associate Professor at the Informatics Institute of the Federal University of Rio Grande do Sul (UFRGS) and co-founder of the ENDEEPER Knowledge System Company.
Michel Perrin is a retired Professor from Ecole des Mines de Paris and the Scientific Advisor of the Geosiris company.




Break free from Oracle Now is the time to move your Oracle databases to SQL Server with free licenses.* Get the one data solution that has everything you need—mission-critical performance, business intelligence, and advanced analytics—all built in. As the new industry leader in data, SQL Server has no equal. Learn more about the offer at www.microsoft.com/breakfree

*Software Assurance subscription required

(About the Authors, Cont’d) Luan Fonseca Garcia is a doctoral student in the Graduate Program in Computing at the Informatics Institute of the Federal University of Rio Grande do Sul (UFRGS), Brazil. Joel Luis Carbonera is a Ph.D. candidate in the same program at UFRGS, Brazil.

REFERENCES

Abel M., Perrin M. & Carbonera J. (2015). Ontological analysis for information integration in geomodeling. Earth Science Informatics, 8, 21-36. Springer.

Guarino, N., ed. (1998). Formal Ontology in Information Systems. Proceedings of FOIS’98, Trento, Italy, 6-8 June 1998, pp. 3-15. Amsterdam: IOS Press.

Guarino N. & Welty C. (2000). “A Formal Ontology of Properties”. In: The ECAI-2000 Workshop on Applications of Ontologies and Problem-Solving Methods. IOS Press.

Guarino N. & Welty C.A. (2004). “An overview of OntoClean”. In: Handbook on Ontologies (eds. Staab S. & Studer R.), pp. 151-171. Springer.

Guizzardi G. & Wagner G. (2005). Some applications of a unified foundational ontology in business modeling. In: Ontologies and Business Systems Analysis, pp. 345-367. IGI Global.

FOOTNOTES

1. An extensive analysis of the ontological properties of geological entities can be found in [Abel et al. 2015].

2. We are considering here the time of exploration, not transformations over geological time.


Hands On With The PPDM Association Board Of Directors


By David Hood, geoLOGIC systems ltd

My customers ask me why geoLOGIC has made such an intentional and strong commitment to industry societies and their standards. The answer is quite simple: behind the scenes, standards make business work better. This is true in our everyday lives, where standards allow us to use our cell phones, or even ATMs, anywhere in the world. Industry standards work the same way; they ensure that our processes are fit for purpose across industry. They keep corporate data and information assets accessible and useful for many different stakeholders through our long and complex E&P life cycles. geoLOGIC is a data-centric company. We provide our clients with oil and gas data and the tools they need to make that data useful, and almost every data element that we supply has been ‘value-added’. Our focus on data explains why the Professional Petroleum Data Management (PPDM) Association is important to us. Data is the lifeblood of any organization, and our corporate objective is to support that need for high quality, integrated data through the life cycle. We recognize that a lot of the data that industry uses comes from many and varied outside sources. The consequence is data that has different structures, variable content, dubious quality or completeness, and unclear terminology. With data standards, we can work to resolve these problems together. geoLOGIC joined the PPDM Association early on, and in 2005 I joined the PPDM Board of Directors. Over the last 11 years, I have served in several capacities, including Chairman of the Board. Our expert staff members have participated in every workgroup PPDM has conducted for a decade, and we have sponsored, participated in and exhibited at PPDM events, as a Titanium Sponsor since 2011. Because of this, we have seen firsthand that the spirit of cooperation and community in PPDM aligns well with our corporate philosophy. Our commitment to PPDM is based on the results that this powerful community has realized in a few short years. PPDM’s collaborative work group process has resulted in standards that are not just developed; they are put to practical use all over industry, including in geoLOGIC’s products and services. PPDM community building through events and the Foundations journal allows all of us to share ideas across the globe; we are proud to be part of this work.

The PPDM approach fits with our own corporate culture of working with others to ensure that customers get the best possible results. The successful transition of PPDM from a pure standards organization to a professional society committed to data stewardship and the professionalization of data managers is a logical step that geoLOGIC fully supports as an important strategic move for our industry. We believe that our industry has only just begun to benefit from the value of data standards and the recognition of data management as a professional discipline; we fully intend to continue to work at all levels to achieve these important goals. Today is a very difficult time to be working in the hydrocarbon industry. We are confronted by a seemingly endless series of problems that appear to compound daily into an existential crisis. In these circumstances it is easy to lose sight of the fact that the industry has a record of overcoming numerous supposedly ‘insurmountable’ challenges in the past. (Anyone believe in ‘Peak Oil’ anymore?) The hard work and innovation of the people in this industry will find new paths to success. The temptation to cut costs ‘across the board’ is great and entirely understandable, but we must continue to prepare for the future, for there will be one. The costs of standards are relatively small in the overall scheme of things, but short-term budget cuts can do long-term damage to the momentum that organizations like PPDM need to build to be successful. It is for that reason that geoLOGIC has made a commitment to maintain our contributions to professional and standards bodies in our industry at pre-crisis levels. They need our support at $30 per barrel much more than they did at $110. I hope you will consider doing the same.

About the Author
David Hood has been President and CEO of geoLOGIC systems since 1997 and a PPDM Board member since 2005.



Guest Editorial

Three Points of View about Big Data for Oil & Gas
By Jim Crompton, Reflections Data Consulting & Noah Consulting

Big Data is all the rage. Every company has to have a Big Data strategy (or be working on one) and every supplier has to have a Big Data line of products and services (or rebrand an existing product under this banner). You would be surprised how many times the word Hadoop is mentioned in a business conversation, but you probably would not be surprised that many people who use the term have no real idea what they are talking about. Turn the conversation to ‘structure on query’ versus ‘structure at load’ and you will get a room full of blank stares (unless you are in a room full of techies). I thought I would try to describe three different ‘points of view’ about Big Data that coexist in a large company. Each perspective is right and adds value to its audience, but each perspective has a different focus and the goals are not the same. Imagine, if you will, the story of the blind men and the elephant. If Big Data is the elephant, each blind man is describing a very different attribute. If we assumed that each audience is speaking about the same thing, we would be very confused and disappointed. It might help a vendor position their sales pitch if they understood that Big Data is being used for three different reasons (at least). It would also help them pitch a technology and service solution that fits the unique requirements of each audience.
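As a minimal, illustrative sketch of that ‘structure on query’ versus ‘structure at load’ distinction (the field names and records below are invented for the example, not drawn from any product): structure at load validates records into a fixed schema when they are written, while structure on query keeps the raw records and applies structure only when a question is asked.

```python
import json

raw_lines = [
    '{"well": "A-1", "oil_bbl": 120.5}',
    '{"well": "A-2", "gas_mcf": 900}',   # does not match the load-time schema
]

# Structure at load (schema on write): type and validate before storing.
table = []
for line in raw_lines:
    rec = json.loads(line)
    if "well" in rec and "oil_bbl" in rec:
        table.append({"well": str(rec["well"]), "oil_bbl": float(rec["oil_bbl"])})

# Structure on query (schema on read): store raw text, parse only when asked.
def query(lines, field):
    for line in lines:
        rec = json.loads(line)           # structure is applied only now
        if field in rec:
            yield rec["well"], rec[field]

print(table)                             # only the record that fit the predefined schema
print(list(query(raw_lines, "gas_mcf"))) # the raw store can still answer new questions
```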

THE IT FUNCTION

For your IT function, Big Data means large and growing volumes of data that have to be managed and processed, and the impact all of this has on computing and communications infrastructure. They look at the growing volumes of data and the different types of data, and translate the Big Data challenge into the number of storage devices they have to buy and the amount of bandwidth they have to provide to accommodate Big Data. When you add cloud computing to this story, for IT it means IaaS (Infrastructure-as-a-Service) and how they could provision for Big Data by renting servers and deploying data, storage, and computing resource virtualization instead of trying to keep up with procurement of physical devices in their data centers. Big Data is a budgeting and provisioning problem from this perspective. Try selling them data appliances and unified storage solutions. Don’t bother them with questions about data and decision quality. This is a bottom-up point of view. The blind man touching the leg of the elephant sees a tree.

SENIOR MANAGEMENT’S PERSPECTIVE

Senior management looks at Big Data and sees KPIs, performance metrics, scorecards and dashboards, with the purpose of gaining better insight into the performance of their investments and the health of cash flow. They are the captains of the financial ship looking at the control panels of their organizations. Big Data means the ability to get a more up-to-date view of how things are going, improving the velocity and responsiveness of their ability to control the ship. Moving from month-end close data to daily sales figures is a transformation. Seeing early results from new investments allows them to act and intervene much more quickly. Try selling them scorecards and analytics (but only the high-level type). Don’t bother them with the information plumbing of how to get the real-time performance metrics; this is a top-down point of view. Some analytically inclined managers (those that tend toward the micro-manager end of the scale) may inquire about drill-down capability, so be ready for that question with some kind of federated search tool. The blind man touching the elephant’s trunk sees a snake.

BIG DATA TECHNOLOGY: Hadoop distributed file management, MapReduce, Spark, NoSQL, federated search, semantic understanding of data, ontologies, graph databases, ELT, R, service oriented architecture, data appliances, real-time streaming analytics, in-memory analytics algorithms, statistics and physics based simulators, etc.

THE BUSINESS ANALYSTS & DIGITAL ENGINEERS

My favorite audience is the Business Analysts and Digital Engineers. They live in the world of operations IT, not traditional IT. While positioned in the middle of the organizational chart, they actually have a broader systems view of Big Data. They span from field instrumentation, to control systems, to historians, to modeling and simulation applications. They do tend to be more data consumers than good data managers, so their world has far too many overstuffed spreadsheets and Visual Basic routines that once took only a few minutes to run but now take hours to days due to the higher volumes involved. Their goal is to gain operational insight into assets in order to make more money and to operate more efficiently and safely. These folks are more technical, but they tend to like the detail, unlike their senior managers. They may have a naïve view of the IT infrastructure but a better view of the business process. You can sell these folks almost anything if they have the budget authority. They love new toys and have a higher tolerance for the rough edges of new technology. In tough budget times, you can have a very interesting conversation with them but end up with no sales. This partially sighted individual touches the body of the elephant and can almost make out the whole animal, but ends up missing a few critical details. He sees a living wall.

CONVENTIONAL TECHNOLOGY: Enterprise data warehouses, operational data warehouses, data marts, ETL, ERP transactional platforms, SQL, RDBMS, historians, process control systems, business intelligence reporting tools, taxonomies, OLAP, OLTP.

[Figure: Big Data and conventional technology mapped against business functions and data consumers, with labels for life cycle, money, simulation, data flow, budgeting, business process performance, modeling, data virtualization and higher tolerance.]

So it is often an external perspective that sees the total elephant and the full system architecture. There is a role for external consultants after all. Big Data is large volumes and variety, requiring a different kind of infrastructure and very different information plumbing (service oriented, data virtualization, etc.). Big Data is about velocity and the ability to see the organization moving and reacting to the market, but the integration of the parts must be designed to follow the anticipated business process and make sure that quality data drives quality decisions. Big Data is about understanding the life cycle of the data flow from sensor to model and back again, and it requires a careful, disciplined design of the total system and the full life cycle of the asset. Big Data is about all of the above.

In other words, this is really hard to get right. Buying new kit for any of the components doesn’t ensure a better system. Technology advances are poised to genuinely upgrade the capability of operations and business functions. Big Data technology is really cool and potentially transformational. But before you throw out the more conventional technology, make sure you see the full picture, the whole elephant, or your Big Data investments will miss the mark on achieving the full potential of this technology revolution.

About the Author
Jim Crompton retired from Chevron in 2013 after almost 37 years with a major international oil and gas company. After retiring, Jim established Reflections Data Consulting LLC to continue his work in the area of data management, standards and analytics for the exploration and production industry.



Standards & Technology
News

Regulatory Data Standards Work Group
By PPDM Members & Staff

For many years, the Professional Petroleum Data Management (PPDM) Association, the international not-for-profit professional society for data managers in the oil and gas industry, has worked collectively with industry to develop and publish standards that help industry and regulators work together more efficiently. Semantic standards, such as What is a Well?, mapping standards such as Well Status and Classification, identification standards including Well Identification Standards, and the PPDM Data Model have been referenced and used by operators, regulators and service companies for 25 years. Building on this powerful foundation, the PPDM Association is establishing a Regulatory Data Standards work group so that the PPDM collective approach can develop and deploy additional standards and best practices.

REGULATORS BENEFIT FROM INDUSTRY STANDARDS

In a mid-2015 letter to the PPDM Association, Jim Ellis, President and CEO of the Alberta Energy Regulator, said “Interaction and collaboration between regulators is increasing on an international scale. Common language and data standards are vital for consistent information reporting, analysis, decision-making, and interpretation. The initiative will

improve regulators’ ability to communicate with other regulators, operators, and third party vendors. Data standards will help enable transparent, consistent, and trusted information. Regulatory data standards will reduce complexity for operators and vendors both nationally and internationally.” Today’s legacy of legislative and technical regulatory architecture was developed specifically to fit the needs and purposes of each region; in many cases, rules of primacy determine where regulatory authority resides. This practice allows each regulator to develop legislation tuned to the specific social, economic and environmental needs of its constituency. Typically, a governmental authority distributes the necessary functions of approving and overseeing the complex E&P life cycle processes among one or more agencies, which develop a series of technical systems for receiving, processing and managing the information that moves between themselves and industry or the public. Over time, it becomes difficult and expensive for each agency to develop the necessary regulations, guides, forms, procedures and administrative infrastructure to manage these processes. At the same time, operators must address the complexities of managing applications and compliance processes for every agency in each government under whose authority they do business. Developing policies and procedures to conduct operations in each region, training staff and developing the systems to respond appropriately to each regulator is costly and time consuming for industry. Industry and Regulators around the world are facing the same issues. They must answer to their stakeholders in the areas of water quality, waste management, air quality, public safety and transparency. Every Operator and Regulator must increase their efficiency and effectiveness and develop mechanisms to enable transparent access to information.


PPDM Regulatory Data Standards Work Group

PPDM has launched a multiparty work group comprised of representatives from international Regulators and Operators. We’ll come to consensus on what it means for data to be measurably complete, consistent and cohesive. We’ll develop the glossary (vocabulary) to serve as a Rosetta Stone that will disambiguate key terms and phrases such as Well, Log, and Completion. We’ll develop a model for storing or mapping existing information stores for the purpose of sharing information.

PPDM PAVES THE WAY WITH A REGULATORY DATA STANDARDS WORK GROUP

In 2015, the PPDM Association conducted a series of interviews with regulators, operators and data vendors to identify opportunities where industry can benefit from collectively developed data standards. Several opportunities emerged.

Disambiguation Semantics: Variable vocabularies and dialects are used by all stakeholders, resulting in unclear communication, misinterpretation and errors. The impact on process efficiency and reliability can be high. “Rosetta Stone” vocabularies such as the PPDM What is a Well can help disambiguate these dialects.

Quality and Completeness Rules: Expectations about “good” or “complete” data are extremely variable, so data submitted to regulators is often not fit for purpose for regulatory agencies or industry users. Industry-agreed data quality and completeness expectations, such as the PPDM Data Rules initiative, will be expanded to develop a collective industry consensus about appropriate data creation and submission.


Standards allow industries like Automotive, Telecommunications and Airlines to function globally. Whether you are making a phone call, booking a flight or buying a part for your European car, you are able to do so courtesy of industry-adopted standards.

Data Persistence Standards: Since 1990, the PPDM Association has collaborated with industry to develop structures for data management that are widely used by operators and data vendors today. Expanding the scope of these structures to encompass a complete suite of regulatory data will enable simpler and more efficient transfers of information between stakeholders.

In combination with data standards, the PPDM Certified Petroleum Data Analyst (CPDA) certification program helps build trust between regulators and operators by verifying that both parties have the necessary foundational knowledge and skills to collaborate effectively.

A POWERFUL VALUE PROPOSITION

Regulators benefit from a common key vocabulary, industry-developed and accepted expectations for “good data” submissions, and ever-growing relationships, built on trust, that are grounded in knowledge and skills. Operators leverage industry standards to promote efficiency, safety

and profitability. Industry standards help service companies and consultants position their products and services for success. Whether operator or regulator, custom-built data and software systems are difficult, time consuming and expensive to develop and maintain. Our industry is building a strong community that supports data through the E&P life cycle. We do this with standards and best practices, such as those created by PPDM work groups, and by validating skills and knowledge through the Certified Petroleum Data Analyst certification program.

THE IMPORTANCE OF RULES

By Trudy Curtis, PPDM Association

What do super heroes and shiny new technology have in common? They both promise to save us. Our industry has seen a lot of shiny new technologies emerge, each promising to maximize productivity, find oil and optimize production. It’s all about implementing the right kind of database, or search engine, or document management tool, or decision support product, or visualization software. Like the red-headed stepchild, the data that goes into the engine is ‘neglected’; it is often loaded with minimal attention paid to data quality, completeness or even fitness for purpose. Instead, the focus is on the technology, leaving data issues “out of scope” or relegated to an imaginary “future phase” of the project.

[Figure 1: Data and the Tech Hype Cycle. Visibility rises and falls with maturity from the Technology Trigger, through the Peak of Inflated Expectations and the Trough of Disillusionment, to the Plateau of Productivity. Callouts: in most technology projects, data is the “red headed step child” that tags along with the project only as far as it is “in scope”; once a technology product is decommissioned, what happens to the data?; a “Data ends up here” marker. Modeled after Hype Cycle for Emerging Technologies, 2015, by Gartner, http://www.gartner.com/newsroom/id/3114217. Copyright 2016, PPDM Association. All rights reserved.]

Somehow, every new technology convinces us that we can dodge the fundamental precept “Garbage In, Garbage Out”. Ever hopeful, we believe that the right technology will overcome the consequences of years of neglectful data management. Gartner’s five-stage Technology Hype Cycle (Figure 1) does a great job of describing the exciting (and expensive) love affair we have with new technology as we ride the roller coaster to the top of the Peak of Inflated Expectations. From the summit, we discover that our shiny new toy is made of lead, not silver, and will require a lot of time and effort we did not anticipate. Fundamental problems with unclear semantics, poor data quality and completeness, and lack of systems integration frustrate implementation teams and bewilder executives. Again and again, we put garbage data in one end and expect gold to come out the other end. Discouraged, we plummet into the Trough of Disillusionment. Data stewardship is not about preparing data for a particular function or software product. It’s about stewarding data as a corporate asset, for the corporate good, through the E&P life cycle. That means that our data strategies have to be designed to support the tactical needs of each immediate user group, or technology implementation, in alignment with the strategic preparedness needed by the organization as a whole.


There’s an old proverb about a poor widow who lost her most precious possession. She found it, of course, by cleaning and organizing her house from top to bottom; she got out her broom and kept at it until she found what she wanted. The simple truth is that finding treasure is usually about hard work, and getting the basics in order.

Data stewardship takes hard, determined, consistent and sometimes thankless work. But once we have established good governance, cleaned up our data and put it away properly, it is ready to be the foundation for the best, shiniest new technology coming at us. Instead of a disastrous plummet down the Trough of Disillusionment, we may swiftly find the Slope of Enlightenment and ascend to the Plateau of Productivity, from which we may find at least some of the promised riches. That’s not easy. We have a history of project-based tactics that try to make data fit for the “immediate” purpose of a stakeholder group, usually as part of a software project. Superseding these tactical needs in favor of strategically positioning data to support the needs of all is not an approach we are used to. It’s hard, it takes time, and it requires vision and dedication. It can be done. Broadening our vision means ensuring that our data is strategically created and stewarded from inception, and sustained through many software products and stakeholder processes, sometimes for decades. Think about it. Even if every internal data process is optimized, all of your data superbly governed and perfectly mastered, with great executive buy-in for data management, you have actually only solved a small percentage of your data problems. The simple truth is that the vast majority of our data processes depend on interactions with outside sources, whose data management strategies we can’t control. Achieving the foundation of data management that allows industry to adapt to new drilling, production, construction and operations methods, or to new analytics technology, will require us to come to consensus on some fundamentals. We have to agree what “good” data looks like, for all stakeholders in the life cycle. We have to agree what our most common words mean: what is a well? A completion? A spud date? This is why industry standards are the foundation of effective data management for industry: they get all of us to a common footing with data. At the PPDM Association, we have been working towards these ideals for most of our 25-year history. The data model is a collection of terms and definitions that are currently expressed in relational form. The Rules Library provides the foundation for describing “good” data. Most data rules fit within a simple, common-sense framework that supports human understanding, training, unambiguous testing for compliance, and metrics that allow us to benchmark and measure our progress along the journey to getting our data in order, so that new technology, whether digital or field based, can be deployed and leveraged. Most of us know that every measured value obtains its contextual usefulness from an associated unit of measure. This is a fundamental concept of data management. Following on, it is logical to state a principle that depths, diameters and distances should all have units of measure in the length domain. Further, it is reasonable in our industry to adopt the data rule that well depths should be expressed in either feet or meters; other units of measure are unlikely and probably wrong. From these ideas, it becomes relatively straightforward to write technical expressions of the data rule that test every well depth in a data system for compliance.
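As an illustration only (the column names and the encoding below are assumptions for this sketch, not the PPDM Rules Library’s actual syntax), a technical expression of that well-depth rule can be as small as this:

```python
# Illustrative data rule: well depths must carry a unit of measure in the
# length domain, and in our industry that unit should be feet or meters.
ALLOWED_DEPTH_UNITS = {"ft", "m"}   # hypothetical canonical abbreviations

def check_well_depths(wells):
    """Return (well id, depth, unit) for every record that violates the rule."""
    violations = []
    for well in wells:
        unit = (well.get("depth_uom") or "").strip().lower()
        if well.get("total_depth") is None or unit not in ALLOWED_DEPTH_UNITS:
            violations.append((well.get("uwi"), well.get("total_depth"), well.get("depth_uom")))
    return violations

# Example: the second record fails because its depth is recorded outside the length domain.
wells = [
    {"uwi": "100/01-02-003-04W5/00", "total_depth": 2500.0, "depth_uom": "m"},
    {"uwi": "100/05-06-007-08W5/00", "total_depth": 8200.0, "depth_uom": "s"},
]
print(check_well_depths(wells))
```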


The PPDM Rules Library is a collection of statements about how data should behave. Far from being a competitive advantage to a company, these rules allow us to communicate our expectations about data quality and completeness clearly and effectively among all stakeholders at all stages of the life cycle. We can insist that our service companies satisfy our expectations, and that our software vendors support and enforce these rules. Preparedness means creating a solid foundation from which industry can take wing with new technology. Let’s not cling to outdated notions that treat data rules as proprietary; they hold everyone back. The more we share, the more we benefit. Get involved! More at rules.ppdm.org.

About the Author
Trudy Curtis is the CEO of the PPDM Association.



ORD EM E P RO GR

Data Managers Look To Geothermal Reservoirs E

S

S

O

By Jess Kozman, PPDM Association

D

ata managers had a front row seat as new drilling and completion techniques in shale created new data types. They could be directly involved in the changes in data definitions, volumes and procedures that came with the move to unconventional reservoirs and shale gas, especially when public interest drove transparency about the data used for and generated by hydraulic fracturing. Data managers have always been impacted by advances in exploration and production technology. For instance, seismic data loading

processes had to change when 3-D survey data became the norm. Therefore, they should be able to adapt to the changes that will come from the latest instability in the industry. As the petroleum industry prepares for an extended “lower for longer” commodity pricing scenario, other parts of the energy and resource sectors need the data manager’s expertise and knowledge of optimum accepted practices (“best practices”). Geothermal energy, gas storage, environmental and civil engineering, carbon sequestration


A data manager who has job anxiety may want to consider a less volatile sector or an industry that is less environmentally intensive. In any such shift, the data manager should consider the lessons from previous advances in technology. Many industries need management and analysis of the same technical data types as the oil and gas sector. Let’s consider, for example, the world of geothermal energy. As you drill into the earth, the rock temperature rises. We see the evidence especially in hot springs and geysers, which have been used for bathing and home heating for thousands of years. In recent decades, advances in technology have made it feasible to use this heat to generate electricity through a heat pump or steam turbine. The United States, the Philippines and Indonesia presently lead the way in terms of installed geothermal electric capacity. Geothermal energy provides more than a quarter of the total electricity in Iceland, the Philippines and El Salvador. Many other countries have geothermal generation or are exploring the potential. Geothermal energy is contained within the rocks and water in the earth’s crust. The economic potential for using this energy depends on many factors. The local thermal gradient determines how deep the wells should be. The amount and accessibility of water influence the efficiency of heat transfer into the wells. The technical data needed to evaluate a reservoir’s geothermal potential is very similar to that for petroleum reservoirs. For example, a feasibility study proposed in 2010 for Singapore called for log and core data from a pilot wellbore, reservoir modelling, gravity and geochemical data, seismic surveys, transmission electron microscopy of the reservoir rocks, and deep wellbore design. The target is a reservoir of naturally fractured granite. The well infrastructure would circulate water into the hot reservoir and return it to the surface as steam for power generation.
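As a back-of-the-envelope sketch of how the thermal gradient sets well depth (the numbers below are assumptions for illustration, not figures from the Singapore study):

```python
def required_depth_m(target_temp_c, surface_temp_c, gradient_c_per_km):
    """Depth in metres at which rock reaches the target temperature,
    assuming a simple linear geothermal gradient."""
    return (target_temp_c - surface_temp_c) / gradient_c_per_km * 1000.0

# A turbine-friendly 150 degC target, a 27 degC surface temperature and a
# 40 degC/km gradient imply a well a little over 3 km deep.
print(round(required_depth_m(150.0, 27.0, 40.0)))   # -> 3075
```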



[Photo: Fractured basement rock visible in outcrop at Dairy Farm Quarry in Singapore (left) and near Vung Tau, Vietnam (right). Both are interesting stops for geology field trips for data managers.]

Oil and gas data managers already have the tools for dealing with a large influx of data like the one envisioned in the Singapore project. They know how to adapt to the continuous and profound changes in technologies that generate oil and gas technical data. Consider, for example, the changes in data models and workflows that came with long-reach horizontal drilling and hydraulic fracturing in shales. Data managers had to devise, validate and collaborate on new data standards, extensions to data models, and quality control processes to manage these new data types. Many of these business rules are now embedded in content supplied or endorsed by PPDM. As data managers we are quite familiar with testing the robustness of our data systems for adaptability to new data types and uses. The opening of plays where the reservoir rock is not a typical carbonate or sandstone, but instead a fractured zone in granite bedrock, is only one of the frontiers that brings new data types. While the idea of basement rocks as a petroleum reservoir is not new, methods for acquiring, managing and analyzing the data on the extent and orientation of fractures have now been field tested and verified around the world. These data management processes can now be applied to fields outside of oil and gas, and in reservoirs that do not require hydraulic fracturing. The naturally fractured Bukit Timah granite is an excellent heat reservoir and a target for the geothermal energy pilot project in Singapore that was proposed in 2010. The pilot project includes a 3,000 meter well using petroleum drilling technology, and the proof of concept

will use oilfield horizontal drilling techniques. A large part of the success of such a project will be the quality of the data collected from this well. The target fractured basement rocks that underlie Singapore bear many similarities to the outcrops that give us hands-on knowledge of other fractured reservoirs. Data managers have used geology field trips to outcrop sites to understand the relationship between reservoir scale data and the heterogeneity of the rocks in situ. This is part of the process to develop data management strategies for the information that would come from the geothermal test wells. Singapore, with no oil and gas reserves of its own, will need professional management and analysis of the project data. The best place to look for that expertise is among the professional data managers who have worked in the Singapore area with oil and gas operators. Data management facets for the pilot project include providing metadata on geologic and measurement uncertainty, standardizing units of measure, combining data from multiple scale sources for data driven analytics, and formats for sharing data. This would also be an interesting test case for regulatory data submission, as it is not even clear which government agency would have jurisdiction over wells drilled for subsurface energy usage. As more countries evaluate the potential of subsurface resources besides hydrocarbons for their energy needs, the demand for professional geotechnical data managers should expand. And for a profession that is so highly impacted by radical swings in commodity pricing, that is good news.

About the Author
Jess Kozman is the Asia-Pacific Regional Representative for PPDM and the Managing Director of CLTech Consulting Pte Ltd in Singapore.

REFERENCES:
http://www.innovationmagazine.com/volumes/v10n2/coverstory1.html
http://esi.nus.edu.sg/docs/event/oliver-egs-hsaconcept-for-singapore-ppt-sg$-ppt-for-chevron.pdf

IMAGE CREDIT 1: National University of Singapore, Department of Civil & Environmental Engineering and Department of Geography, “Geothermal Power Concept for Singapore”, Palmer et al. profile.nus.edu.sg/fass/geogo/

IMAGE CREDIT 2: http://www.iagi.or.id/wp-content/uploads/2014/03/Flyer-field-trip-Vietnam-_IAGI2014.pdf



Standards & Technology

Thanks to our Volunteers

Jennifer Bell

February’s Volunteer of the Month was Jennifer Bell. Jennifer has worked in the oil and gas industry for over 15 years, with a good portion of her time, on and off the clock, spent leading industry efforts for better reliability engineering and asset integrity management practices. Jennifer holds a Bachelor of Science in Mechanical Engineering from Texas Tech University and in 2012 returned to school full-time to pursue both her MBA and Master of Engineering. Bell held a variety of positions at Chevron in the areas of operations assurance, asset integrity management and reliability engineering. Currently, she consults part-time at Maris Environmental in the disciplines of HSE Risk and Asset Integrity Management. Bell has received numerous awards, including the ASME Herbert Allen Award for outstanding technical achievement by an engineer 35 years of age or younger and the Young Engineer of the Year Award from the SPE. Since returning to graduate school, Jennifer has given back to the engineering community at the University of Houston (UH) and Houston Community College (HCC). She has held a variety of volunteer positions at UH, including Society of Women Engineers Student Section Industry Adviser. She currently supports the University of Houston Petroleum Technology Initiative (PTI) as the Technical Consultant to the Director of PTI. Jennifer is also an Adjunct Professor at Houston Community College. She joined HCC in the fall of 2013, around the time they launched their Global Oil and Gas Drilling Training Center. The RigOne Program was formed as part of the Global Oil and Gas Training Center for the purpose of training the next generation of entry-level drilling crewmembers to be promotable, retainable and safety conscious. Jennifer developed the first RigOne course, PTRT 1001 – Introduction to Petroleum Industry, and has taught several classes in the Department of Petroleum Engineering Technology at HCC. After attending a PPDM luncheon, Jennifer quickly saw that the visions and teachings of PPDM could benefit her students and, as a result, has encouraged her students to join the PPDM Association. She will use much of the PPDM programming to educate her students this spring. “We have named Jennifer as our PPDM Volunteer of the Month because of her dedication to preparing students for career paths in the oil and gas industry. She has opened many doors for us, helping us to reach out to students in the Houston area,” said Pam Koscinski, PPDM Association.

Alison Troup

The PPDM Association’s activities extend around the globe, as does our amazing volunteer base. March’s recipient, Alison Troup, is one of the founding members of our Eastern Australia Leadership Team. Alison is a geoscientist at the Geological Survey of Queensland, where she has been a part of the Energy and Petroleum and Gas teams since 2010. Prior to that, she was a student geologist at both Santos Ltd. and Gold Fields Ltd. Alison earned her Bachelor of Science (with Honours) in Geology at the University of Queensland in 2009. “Alison has been instrumental in building our Eastern Australia Leadership Team as we work to keep the Brisbane and area data management community together and strong through these challenging times. She has been generous with offering the facilities and resources of the local government agency and keeping a dialogue open as PPDM members and representatives move in and out of her location. She is wonderful to work with and has made several of our recent luncheon events possible. I am privileged to work with Alison and look forward to continuing to develop our Brisbane community with her,” said Jess Kozman, PPDM’s Asia Pacific Representative.

Thank You

From the Rules Team at PPDM, we would like to express our gratitude to Ted Wall of Calgary, Canada, for his contribution of Land Rules, and to Jason Medd of Informatica in Dublin, Ireland, for his assistance with the design and content of rules.

2,800+ Rules and Growing. Visit us at rules.ppdm.org

Volunteers Needed

The Rules Team is looking for Subject Matter Experts in any E&P topic, specifically: Well Logs, Directional Surveys, and Land. Interested? Please contact projects@ppdm.org


Upcoming Events


SYMPOSIA

APRIL 11-12, 2016: HOUSTON DATA MANAGEMENT SYMPOSIUM & TRADESHOW (Houston, TX, USA)
AUGUST 4, 2016: PERTH DATA MANAGEMENT SYMPOSIUM (Perth, WA, Australia)
OCTOBER 25-26, 2016: CALGARY DATA MANAGEMENT SYMPOSIUM, TRADESHOW & AGM (Calgary, AB, Canada)

LUNCHEONS


APRIL 19, 2016: FORT WORTH Q2 DATA MANAGEMENT LUNCHEON (Fort Worth, TX, USA)
APRIL 21, 2016: MIDLAND Q2 DATA MANAGEMENT LUNCHEON (Midland, TX, USA)
MAY 3, 2016: DENVER Q2 DATA MANAGEMENT LUNCHEON (Denver, CO, USA)
MAY 24, 2016: TULSA Q2 DATA MANAGEMENT LUNCHEON (Tulsa, OK, USA)
MAY 2016: PERTH Q2 DATA MANAGEMENT LUNCHEON (Perth, WA, Australia)
JUNE 7, 2016: OKC Q2 DATA MANAGEMENT LUNCHEON (Oklahoma City, OK, USA)
JUNE 14, 2016: HOUSTON Q2 DATA MANAGEMENT LUNCHEON (Houston, TX, USA)
JUNE 2016: CALGARY Q2 DATA MANAGEMENT LUNCHEON (Calgary, AB, Canada)
JUNE 2016: BRISBANE Q2 DATA MANAGEMENT LUNCHEON (Brisbane, QLD, Australia)

CERTIFICATION - CERTIFIED PETROLEUM DATA ANALYST

JUNE 15, 2016: CPDA EXAM (Application Deadline May 5, 2016)
NOVEMBER 2, 2016: CPDA EXAM (Application Deadline September 21, 2016)

ONLINE TRAINING OPPORTUNITIES

PPDM announced that it is supporting its membership by discounting online training courses! Online training courses are available year round and are ideal for individuals looking to learn at their own pace. For an in-class experience, private training is now booking for 2016. Public training classes are available on demand.

All dates subject to change.

VISIT PPDM.ORG FOR MORE INFORMATION
Find us on Facebook | Follow @PPDMAssociation on Twitter | Join PPDM on LinkedIn



“Without knowledge action is useless and knowledge without action is futile.” - Abu Bakr

Power your upstream decision-making with customer-driven data, integrated software and services from geoLOGIC. At geoLOGIC, we help turn raw data into actionable knowledge. That’s a powerful tool to leverage all your decision making, whether it’s at head office or out in the field. From comprehensive oil and gas data to mapping and analysis, we’ve got you covered. Get all the knowledge you need, all in one place with geoLOGIC. For more on our full suite of decision support tools, visit geoLOGIC.com

geoSCOUT | gDC | Upstream knowledge solutions

