Foundations, 2Q2015


Foundations Journal of the Professional Petroleum Data Management Association

Print: ISSN 2368-7533 - Online: ISSN 2368-7541

Volume 2 | Issue 2 | 2Q2015

1st Place Foundations Photo Contest Winner: John Heerema, “Morning near Sunwapta Pass”

Beauty and the iBeast: How Landmen Work to Cut the Knowledge Gap. Does your information work for you, or is it the other way around? (Page 7)

PLUS PHOTO CONTEST: This month’s winners and how to enter (Page 22)

Log on and check out our new EPIC website at ppdm.org


Success up here depends on what you know down here.

Subsurface intelligence is everything when producing unconventional assets. EnergyIQ delivers E&P Data Management Solutions that enable companies to exploit unconventional plays with less risk. Learn how our Trusted Data Manager (TDM) application suite can help you value, acquire, and produce assets with greater certainty, speed, and collaboration. Visit www.energyiq.info, or call us at (303) 790-0919.

2329 W. Main Street, Suite 300, Littleton, CO 80120 • (303) 790-0919


Foundations: The Journal of the Professional Petroleum Data Management Association is published four times per year.

Table of Contents Volume 2 | Issue 2 | 2Q2015

COVER FEATURE

7  Beauty and the iBeast: How Landmen Work to Cut the Knowledge Gap. Does your information work for you, or is it the other way around?
By Wade Brawley

CEO: Trudy Curtis
Senior Operations Coordinator: Amanda Phillips
Community Development Coordinator: Bryan Francisco
Article Contributors/Authors: Wade Brawley, Gord Cope, Jim Crompton, Guy Holmes, Jana Schey
Editorial Assistance: Dave Fisher
Illustrator: Jasleen Virdi
Graphic Design

BOARD OF DIRECTORS
Chair: Trevor Hicks
Vice Chair: Robert Best
Secretary: Janet Hicks
Treasurer: Peter MacDougall
Directors: Trudy Curtis, Rusty Foreman, Paul Haines, David Hood, Allan Huber, Adam Hutchinson, Christine Miesner, Joseph Seila, Paloma Urbano

FEATURES

18  What is Master Data Management: And the Top Five Reasons MDM Projects Fail
By Guy Holmes

24  No More ‘Crasternation’: Raster Log Calibration, a New Solution to an Old Well Log Problem
By Gord Cope, featuring David Baker, Trudy Curtis, and Jim Williams

27  Data Professionals Gain Many Kinds of Value from PPDM Training: PPDM takes an in-depth look at the value and experience of its in-class training program
By Gord Cope

DEPARTMENTS

11  PPDM Establishes Leadership Teams in Houston and Oklahoma City: A look into the diverse leadership teams that assist with the event planning, promotion and on-site support of local PPDM events

15  SLC Corner: SLC Update, Houston & Stavanger Public Forums

16  SLC Spotlight: A spotlight on Energistics, one of the Standards Leadership Council’s founding member organizations
By Jana Schey

22  Photo Contest: This quarter’s winners and how YOU can get your photo on the cover of Foundations

30  Upcoming Events: Find us at events and conferences around the world in 2015

30  Training Opportunities: Online, private and public training opportunities

GUEST EDITORIAL

4  Developing an Information Management Culture: Young professionals can get the “social” value out of “social media,” but how can we get business value out of these technologies and services?
By Jim Crompton

Head Office
Suite 860, 736 8th Ave SW
Calgary, AB T2P 1H4
Email: info@ppdm.org
Phone: 403-660-7817

ABOUT PPDM

The Professional Petroleum Data Management (PPDM) Association is a global, not-for-profit society within the petroleum industry that provides leadership for the professionalization of petroleum data management through the development and dissemination of best practices and standards, education programs, certification programs and professional development opportunities. PPDM represents and supports the needs of operating companies, regulators, software vendors, data vendors, consulting companies and management professionals around the globe.

Foundations | 2Q2015 | 3


Guest Editorial Developing an Information Management Culture By Jim Crompton, Data Management and Analytics Consultant

If you think it is hard to deploy a new technology in your company, try changing the culture of your company. There is a lot of emphasis on improving access to trusted information to drive high-quality, data-driven decisions, and rightly so. But this goal is more than just the adoption of new systems of record, new information visualization tools and new analytical capabilities. We also need to develop an information culture that can appreciate and take advantage of the new digital assets.

In other articles, I have talked about the digital literacy of the new digital engineer coming into the industry. Growing up with the Internet and being fully indoctrinated into social media is a good start, but it is not enough to reach the desired information culture. It is not enough to know how to use tools to find information if data producers don’t take the right steps to make that data available. Young professionals can get the social value out of social media, but how can we also get business value out of these technologies and services?

Companies have enterprise IM Principles, documented IM Strategies, data quality initiatives and many, many data management projects going on. Those are necessary to help build a data foundation, but by themselves are not enough to reach the desired information culture. Companies are developing training to

4 | Journal of the Professional Petroleum Data Management Association

help build their organizational capability around information management. These are critical steps, but again, alone they will not bring us to the promised land of an information culture. So what more is it going to take? For me, the critical list includes:
• The Data Foundation, which includes structured data, documents, transactions, models and field automation measurements
• The Data Storage infrastructure, which has to manage many petabytes of structured data and hundreds of millions of documents, with volumes growing rapidly
• A Data Access approach, which includes search, query and visualization
• Reporting and Analytical Tools, from advanced analytical and modeling tools to data labs and even Big Data approaches
• Increased Organizational Capability
But most importantly, it includes a change of mindset about the importance of data as a business asset, an understanding of the value of this effort and a practical plan to put all the necessary solutions in place. A scramble drill to react to the latest data crisis is not a plan. Not all data is equally valuable, not all data is needed in every decision, and not all data needs to be perfect, but we need to know what data is needed when, by what people, and in what format, to make better decisions. I recently read an interesting


article by Malcolm Frank and Geoffrey Moore about the *Future of Work. The authors suggest that we need to move from systems of record to systems of engagement. Their examples of systems of engagement included the following:
• An enterprise Facebook that allows colleagues across the enterprise (including partners and customers) to publish their expertise and allows the community to validate it
• An enterprise Google that lets people search secured databases in conjunction with accessing public ones
• An enterprise Wikipedia that lets corporations publish their core intellectual property to their own teams without exposing it to competitors
• An enterprise YouTube that lets technical managers explain complex topics via a medium that is compelling and effective
• An enterprise Twitter that helps people in crisis situations keep the enterprise abreast of late-breaking news
To be fair, some companies have a start at several of these systems of engagement. However, we still have a long way to go with the systems of record data foundation and efforts to get a handle on data quality. We still have a lot of catching up to do with staffing shortages, as some of our long-serving data management experts are considering retirement and the growth of data management requirements continues unchecked. Unbelievably, some companies

are reacting to current economic tightness with staff cuts. Will they never learn?

For most efforts that I have come across, the industry’s social media technology strategy is still too focused on technology and not on how to tie systems of engagement to our core business processes. Yes, thousands of people can share what they think of the latest news and give personal comments on just about anything, but most can’t yet reliably find what they need to do their jobs better. Yes, we have many effective global communities of practice, but many people are still not connected to the communities they need.

Our business, technical and IT leaders are still trying to figure out what it means to be an information leader. The new Operators Advisory Panel for the Standards Leadership Council is a good start. Our technical and operational staff are struggling to find the data they need, wondering whether they have found all the relevant data and what quality state it is in, while company legal staff worry whether searching is surfacing data it shouldn’t have access to. We focus too much on our needs as data consumers and not enough on our responsibilities as data producers. There are management responsibilities, technology responsibilities and individual responsibilities involved in the new information culture.

As I said at the beginning, it is hard to change a culture. There are many moving parts and the current culture has many

ways to reinforce the current state. But without changing the way people think about data, we can’t change the way they act with data. If we can’t change behaviors in the right direction, no new technology tool is going to magically do it for us. The desired goal of an information culture still eludes us. We are an industry that invests a lot of money in acquiring and processing data to help us make complex and expensive decisions. We should be doing better.

*Frank, Malcolm & Moore, Geoffrey. The Future of Work: A New Approach to Productivity and Competitive Advantage. Cognizant, Dec. 2010. Web. 4 May 2015. <http://www.cognizant.com/InsightsWhitepapers/FutureofWork-A-New-Approach.pdf>.

About the Author Jim retired from Chevron in 2013 after almost 37 years with a major international oil and gas company. After retiring, Jim established Reflections Data Consulting LLC to continue his work in the area of data management, standards and analytics for the exploration and production industry.



Business advisory and technology services for next-gen oil and gas

+Business Alignment +Process Optimization +Technology Enablement +Managed Services +ENERHUB™ +Peloton Services

In today’s dynamic energy market, oil and gas companies are looking to achieve the optimal balance: retaining the ability to move quickly and decisively on emerging opportunities while maintaining leaner, more efficient operations. Stonebridge Consulting provides business advisory and technology services that enable oil and gas companies to get lean and stay lean through improved operational performance across the entire enterprise.

www.sbti.com | 800.776.9755
SERVING CLIENTS THROUGHOUT NORTH AMERICA


Cover Feature

Beauty and the iBeast By Wade Brawley, President, Land Information Services, LLC

Does your information work for you, or is it the other way around? This may sound a little like the chicken-and-egg parable, but there is a right answer. We create information by compiling data. Left unbridled and without a plan, data holds little value. But when wisely channeled into logical collections and indexed with attributes that enable the data to fit together and make associations, data transforms into meaningful information. And information can be powerful.

Data is not information and information is not data. Data is simple, one-dimensional, isolated and unintuitive – not very flattering words to describe a good friend. Data represents the building blocks for information, so it is extremely important for data to have accuracy and integrity in order to make a successful transformation. Your “information beast” is only as healthy as the quality of data you feed it. Isolated bits of data are not very useful until they are woven together into meaningful information that helps the customer make a decision. Any incorrect data component may lead to an incorrect assessment.

THE iBEAST Information can be powerful. When data makes the transformation into information, it is like a creature in itself: the iBeast. Once the iBeast is born, it must be nurtured, cultivated and improved – much like raising a child. As the parent or steward of the iBeast, it is critical that you guide, direct and discipline your iBeast.

Foundations | 2Q2015 | 7


Feature

[Word-cloud illustration: data management themes such as consistency, transparency, roles, training, compliance, accountability, quality data, data standards, business rules, workflow, process, performance, documents, errors, risk, stress and turnover.]
You will need to correct it on occasion and modify the environment that caused the iBeast to behave inappropriately. If you’re a good steward, you will strive to find ways to improve and grow your iBeast. As your iBeast goes out into the world (your company and customers), it will be a reflection of the steward who raised it and trained it – YOU. Once you decide to give life to your iBeast, you must also make a commitment to monitor, tweak, expand and improve your iBeast. If you neglect your iBeast, it will certainly disappoint you and even sabotage your career as it tells lies and causes your customers to discredit your ability as a steward of information. I offer these bits of advice to anyone ready to take charge of his or her career by accepting stewardship of the iBeast and delivering a valued service. The examples are from the Land office, but can be applied throughout your organization.

THE CHANGING WORKPLACE Technology demands a different style of management in the Land back office.

When I see a stack of documents handed from one person to another for review and approval, I see a prolonged process disrupted by duplication, bottlenecks and misplaced documents. For many years, physical delivery was the only way for the manager to check for completeness, accuracy and employee productivity. These are important objectives, but achieved through an imperfect methodology. The manager would open each file and cross-check the source document with the system input, then mark up the report or datasheet. It would then be returned for correction or initialed for approval. Astute managers created spreadsheets to track these documents, review performance and report to management. However, the landman, analyst and payments clerk probably created their own spreadsheets. Each thought they were ahead of the curve, but their devotion to spreadsheets could have been better spent on their unique roles within the shared process. Moreover, each sequential role resulted in a slower overall process time. As the documents flowed through the supply chain, a bottleneck at one role led to an avalanche of documents for the next role.

What if you could (and we can) pinpoint the place where the document is initiated and create a workflow that everyone could use to mark the completion of their task and the location of the document? If we all play in the

PPDM Rules
833: Land area net size must not exceed gross size.
873: A land right must not be undeveloped if the lease has producing status.
These are two of many PPDM data rules, and we need even more! Visit the “Rules App” at rules.ppdm.org to see the rules and find out how to contribute.



same sandbox, everyone benefits:
• The manager has a tool for monitoring the performance of each person.
• Everyone in the supply chain can track documents to find bottlenecks.
• Transparency keeps everyone accountable, even the manager.
• Alerts are sent simultaneously so that more than one person can perform their task instead of waiting on someone else.
• Compliance with procedures is measured and reinforced.
• Training is easier.
• Everyone devotes more time to their role and even adds value in other areas.
This is all great, but we can do even better. We can define rules for what is expected so that faulty documents are trapped. These include such problems as missing or incongruent dates, unacceptable provisions, and acres and bonus amounts that do not reconcile. Now the manager is not spending time catching mistakes and our carelessness is not broadcast for all to see!

The scenario above works for any kind of land document processing: agreements, assignments, regulatory notices, proposals, etc. It is especially effective for leases. You can integrate this workflow with your lease records system so the basic information is uploaded upon the landman’s approval of terms and form and the payment analyst’s input of the payment information. This frees up your analyst to focus on analyzing provisions and reconciling acreage instead of keying names and dates. Let’s crank it up another notch by indexing each record in the workflow with a lease form that has preset provision mapping.

Once leases are in the system, there may still be errors due to record edits, or there may be exceptions to uploaded records. Your lease records manager can’t review every edit. Instead, the data steward should define quality control points so the manager can monitor exceptions to business rules for system use and data standards. You can’t stop errors from occurring; however, you can anticipate errors and find and fix them. You do this by creating quality control queries to find instances of, for example, net acres higher than gross acres, or gross acres higher than mapped acres (true acreage content derived by GIS). The first time you run the query, it may take a while to correct all anomalies and address any weaknesses in the process in order to prevent future errors. After the initial effort, and assuming the manager’s ongoing commitment, the list of exceptions will remain small and easily managed. Fix the errors before your customers find them and form a lasting judgment about you.
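Quality control queries like these are straightforward to script once lease records can be exported. Below is a minimal sketch in Python; the record layout and field names are illustrative assumptions, not those of any particular land system:

```python
# Exception report for two of the acreage control points described above:
# net acres exceeding gross acres, and gross acres exceeding mapped (GIS) acres.
# The record layout and field names are invented for illustration.

leases = [
    {"id": "L-001", "net_acres": 160.0, "gross_acres": 320.0, "mapped_acres": 320.0},
    {"id": "L-002", "net_acres": 400.0, "gross_acres": 320.0, "mapped_acres": 320.0},
    {"id": "L-003", "net_acres": 160.0, "gross_acres": 350.0, "mapped_acres": 320.0},
]

def quality_exceptions(records):
    """Yield (lease id, description) for each record that fails a control point."""
    for r in records:
        if r["net_acres"] > r["gross_acres"]:
            yield r["id"], "net acres exceed gross acres"
        if r["gross_acres"] > r["mapped_acres"]:
            yield r["id"], "gross acres exceed mapped acres"

for lease_id, problem in quality_exceptions(leases):
    print(f"{lease_id}: {problem}")
# L-002: net acres exceed gross acres
# L-003: gross acres exceed mapped acres
```

Run on a schedule, a report like this keeps the exception list small, as recommended above; in practice the same predicates would more likely live as SQL views or saved queries inside the lease records database.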

QUALITY CONTROL POINTS
It’s unreasonable to expect the manager to review every edit, but you should expect the manager to manage by exception. Development of, and commitment to, quality control points is the only way the data steward will earn customer confidence, reinforce compliance and promote job enrichment. I have seen as few as 10 quality control points in one company and as many as 150 in another. To find the right number for your company, consider the complexity of the system, the amount of validation built into the system and the extent to which your company serves up the data. Here are a few important quality control points I recommend every Land department monitor:
• Net acres more than gross acres
• Gross acres more than mapped (or logical) gross acres
• Lease Status = Producing, but acreage categorized as Undeveloped
• Lease Status = Producing, but no well related to the Lease
• Lease beyond its primary term, but no well related to any part of the Lease
• Well Billing Interest and Net Revenue Interest anomalies
• Record type and status contradictions
Beware of contradictory record types. Land systems are designed for flexibility in customer use and in the definition of record types and lease statuses. It is very likely that an analyst will inadvertently misuse record types and statuses, especially when there are multiple codes and statuses for any given record. When there is turnover in Land Administration middle and upper management, the new manager may introduce a new set of rules without considering the impact on data created under the old rules. We can find these contradictions by building a table that contains all possible permutations of the various codes and statuses, then marking which of these combinations are valid. For example, a database with 15 lease ownership types and 10 lease status codes yields 150 combinations. Some of these combinations are incorrect; e.g., a Lease Ownership type of “Right of Way” with a Lease Status of “Producing.” A query would find all records that match the invalid combinations.

Technology helps the manager reinforce business rules, ensure proper utilization of the system and identify training opportunities. For the system to provide this kind of governance, it must be carefully implemented. This is the perfect time for the company to define the terms and the granularity to which it will input and maintain data. Look ahead; turnover of managers, landmen and analysts creates vulnerability in data quality UNLESS careful thought has gone into the company’s data governance. In other words, allow the system to provide “guardrails” for practices and procedures so data quality is not vulnerable to natural shifts in personnel and management.
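The permutation table described above can be prototyped in a few lines. Here is a sketch in Python using invented ownership types and statuses; a real implementation would enumerate the company’s actual code sets, and only the step of marking combinations invalid requires human judgment:

```python
from itertools import product

# Build every combination of ownership type and lease status, mark the
# contradictory ones, then flag records matching an invalid combination.
# The code values below are invented for illustration.

ownership_types = ["Fee", "Mineral Lease", "Right of Way"]
lease_statuses = ["Producing", "Undeveloped", "Expired"]

# Start with all combinations valid, then mark known contradictions.
validity = {combo: True for combo in product(ownership_types, lease_statuses)}
validity[("Right of Way", "Producing")] = False  # a right of way cannot produce

records = [
    {"id": "R-1", "ownership": "Fee", "status": "Producing"},
    {"id": "R-2", "ownership": "Right of Way", "status": "Producing"},
]

# The "query" that finds records matching an invalid combination.
invalid = [r["id"] for r in records if not validity[(r["ownership"], r["status"])]]
print(invalid)  # ['R-2']
```

With 15 ownership types and 10 statuses, the same dictionary would hold the article’s 150 combinations.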

IN CLOSING
Your organization can improve efficiency, effectiveness, competitive performance and talent retention by leveraging technology to identify data quality issues and reinforce business rules. Consider tomorrow’s needs of your company, not just today’s, by acknowledging that the person responsible for your data today will likely not be the same person two or three years from now. Turnover of managers creates vulnerability in processes and data quality. In short, nurture and train your iBeast so that it remains faithful and loyal, despite unavoidable turnover. Think of your iBeast as your legacy that will live on after you have left the company.

About the Author
Wade Brawley, President of Land Information Services, LLC, is a management consultant specializing in land information and workflow solutions for oil and gas producers.

2015 PERTH DATA MANAGEMENT SYMPOSIUM AUGUST 5 - 6 Parmelia Hilton

ppdm.org

Join the action! Register to Attend Become a Sponsor Visit PPDM.ORG



Standards & Technology News

PPDM Establishes Leadership Teams in Houston and Oklahoma City

As part of its long-term strategy to develop the community of data managers, the PPDM Association has launched leadership teams in Houston and Oklahoma City. This leadership nucleus will work with regional communities to ensure that PPDM is serving the needs of each region.

The Houston Leadership Team was created during the fall of 2014, and members were instrumental in planning for the Data Management Symposium held in Houston in March 2015, as well as serving as moderators and facilitators for many sessions. Additionally, they have been key in securing locations, sponsors and speakers for the Data Management Luncheons to be held in Houston during 2015 on a quarterly basis.

The Oklahoma City Leadership Team was recently created and will be involved in event planning for the quarterly Data Management Luncheons as well as the Data Management Workshop, which will be held on August 11, 2015.

Please meet our team members:

HOUSTON PPDM LEADERSHIP TEAM

Madelyn Bell
Retiring after 36 years with ExxonMobil, Madelyn started Data Gal LLC, which focuses on E&P Data Management practices, processes and data. She is three years into this new career phase and has had the opportunity to work with 10 companies, most in conjunction with Noah Consulting. Free time is spent working on the PPDM Rules project (have you contributed yet?).

Cindy Cummings – Secretary As Technical Physical and Electronic Data Coordinator for Repsol USA E&P, Cindy is responsible for all physical and electronic data and its lifecycle management for the North American office. She is responsible for seismic, well, regional geological, land and corporate data, including retention schedules and policies and the implementation of Documentum and other data management applications.

Jeremy Eade As the Subsurface and Wells Data Management Lead at BP, Jeremy is accountable for management of the subsurface and wells data asset for BP Lower 48 Onshore. He has over 20 years of experience in E&P data management. Jeremy and members of his team have spoken at a number of PPDM events.

Heather Gordon
Heather is a Geological Technician with Freeport-McMoRan Oil & Gas LLC. She manages well data and databases for both exploration and development geology, as well as provides application support for geological applications.

Michael Higginbottom
Michael is a Senior Business Analyst with EOG Resources. His current role requires him to work directly with the G&G customer base to bring their content into a centralized data mart. This data, such as Microseismic, Geochemical Analyses, Geosteering and Core analysis types, is made available throughout the company via EOG proprietary and third-party applications. He is also focused on replacing the current G&G Data Mart with a comprehensive G&G Well Master and integrating it with the Enterprise MDM solution.

John Jacobs
John serves as the Well Data Systems Manager in Global Engineering and Operations and HSE Systems for Anadarko Petroleum Corporation. He and his team manage Anadarko’s well data systems, including internal and third-party data sources. John has been in the exploration and production industry for over 35 years.


Steve Jaques As the VP – Information, Steve is responsible for Data Management, Geographic Information Systems, Software Development and IT Infrastructure at Laredo Energy. Laredo Energy is a small E&P company that discovers new opportunities by drilling, completing and producing horizontal wells with multi-stage fracturing techniques. In addition to these high-tech operations, the company is engaged in all major business functions associated with upstream oil and gas, such as 3-D based geoscience, land acquisition, accounting and finance.

Kimberlyn Lucken Kimberlyn is the Manager of Well Data Resources at Sanchez Oil and Gas. She is responsible for the management of all well data generated/utilized throughout the entire well lifecycle for all assets handled across four business units within Sanchez. This is her 10th year with the company.

Ellen West Nodwell - Chair
Ellen opened the consultancy and services function of IntegraShare Solutioneering in mid-2013 and specializes in geoinformation governance and data management. She provides strategies, solutions development and strategic consulting, plus mentoring and bridging with educational institutions to help new graduates enter the industry.

Ali Sangster Ali currently serves as Director, Well and Production Information at DrillingInfo. In this role, she is responsible for managing well data quality and partnering with clients, the DrillingInfo Executive Leadership Team, Products Team and Development Team to ensure the DrillingInfo foundation and solution is on track and consistent with the industry’s constantly evolving needs, demands and standards.

Benny Singh
Benny has over 16 years of Information Technology experience in the oil and gas industry. He currently serves as Data Warehouse/BI Team Lead at Murphy Oil and is responsible for the development and operation of Murphy’s Enterprise Data Warehouse (EDW). Modelled using PPDM as its foundation, the EDW provides a single, integrated data repository of all business-critical well lifecycle information to facilitate better business decisions. Additionally, Mr. Singh’s team is responsible for promoting an environment where users utilize BI tools of their choice, facilitating faster delivery of solutions for the business.

Ray Wall Ray currently serves as IT Director, Global Subsurface Information Management at ConocoPhillips. In this role, his team is responsible for the execution of global information management projects, providing expert consulting across the global business units and building subsurface information management competency and community.

OKLAHOMA CITY PPDM LEADERSHIP TEAM

Mark Brannon – Secretary
As Managing Director, Mark oversees Stonebridge’s business operations in Oklahoma City. On a daily basis he manages local client relations in the upstream, midstream and downstream sectors, including business development, partner management and overall project delivery.

Wade Brawley Wade Brawley is President of Land Information Services, LLC, which provides consulting for land process management, application development and integration, as well as outsourced land administration services.

Michele Compton - Chair Michele leads the Information Management team at American Energy Partners, which provides business, content, data, geoinformatics and technical services to the organization. Responsibilities for this team include master data management used across systems; business process design; content and records governance; GIS services; software development, implementation and support; and business intelligence, analytics and reporting.

David McMahan As Manager of Strategic Technical Services, David’s role is to effectively manage exploration data procurement initiatives, provide multi-discipline special data project deliverables and expand interdepartmental and asset team Foundations | 2Q2015 | 13


S&T

Standards & Technology

NEWS

collaboration workflows where centralized and/or automated information is required. He manages third-party information for Exploration and as the corporate point of contact for most of these deliverables, David liaises with Information Technology to leverage the best possible performance/ value from these products.

Christine Miesner Christine serves as Manager of E&P Data Management at Devon Energy and has responsibility for data management across the Devon E&P organization. Her responsibilities include oversight of Devon’s enterprise data governance councils, data quality rule implementation and management, master data management, training on enterprise standards for data entry, and oversight of quality metrics.

Stuart Morrison
Stuart formerly led the Data Delivery and Data Governance Team. He currently looks after the Data Warehouse and Business Intelligence teams, as well as the teams supporting the ArcGIS servers and the Geoscience applications. Stuart is hands-on, as he likes to do technical and analytical work, and his passion lies in business intelligence.

Dave Rose Dave is the Database Administration Supervisor at Chaparral Energy. He and his team have been responsible for moving the PPDM data model into production as Chaparral’s data warehouse. Dave is specifically charged with loading the data warehouse and connecting the warehouse to all external systems. He also assists the business units in writing their own views of the warehouse, again by utilizing the PPDM model.

Jim Soos Information management has quickly become a business imperative that is necessary to meet the continued demand for maintaining high-quality data across the enterprise. As a Senior Principal at Noah Consulting, Jim partners with energy companies to establish best practices to improve and sustain high quality information. His focus has been on managing the design, development and deployment of information management solutions in the E&P industry.


Adria Sprigler Adria is the Manager of Data Process Management at Chesapeake Energy. She is responsible for overseeing a range of process and data management activities, which include data governance, data quality, best practice research, developing and recommending solutions for improvement, developing performance metrics, obtaining leadership or stakeholder agreement, implementing solutions and monitoring post-process improvement performance.

Trent Stoker Trent is a Product Manager at Oseberg, where he works with clients and developers to look for new ways to expand product offerings and deliver them successfully. He also works to make sure new initiatives are in line with the vision of the company.


SLC Corner

Standards Leadership Council (SLC) Update

In February 2015, the Standards Leadership Council (SLC) welcomed Alan Johnston, President of MIMOSA, as the SLC’s new Co-Chair. Alan will co-chair the SLC alongside current Co-Chair and PPDM Association CEO Trudy Curtis. Alan brings a wealth of experience with open standards for operations and maintenance, and the SLC looks forward to working with him on upcoming initiatives. Alan’s SLC predecessor, Gerald ‘Jerry’ Hubbard, sadly passed away in early 2015 and will be deeply missed.

SLC PUBLIC FORUM, HOUSTON, MARCH 19
The SLC held its North American Public Forum in Houston, Texas on Thursday, March 19, 2015, at the BP office in the Energy Corridor. The meeting was generously sponsored by BP and Schlumberger, allowing attendees to join at no cost. Those in attendance enjoyed a variety of presentations, kicking off with Jim Crompton, advisor to the SLC, asking “Is This the Right Time for Standards?” given the current economic climate. Grant Hartwright, from Noah Consulting, followed with his keynote address “Facilities Asset Lifecycle: the Design, Construction, Commissioning & Operation of Facilities.” Drawing on his many years of experience, Grant argued that establishing standards early on makes life much easier. Next on the agenda was the OGI Project update by Alan Johnston, MIMOSA; Paul Hunkar, OPC Foundation; and Nils Sandsmark, POSC Caesar Association, who provided a detailed status report on the project and indicated that a public pilot is planned for the near future. The next two sessions were panels on the industry perspective and the common data management issues that all SLC members face, and the day concluded with Trudy Curtis, Co-Chair of the SLC, discussing next steps for the Standards Leadership Council.

SLC PUBLIC FORUM, STAVANGER, NORWAY, APRIL 29
The SLC European Public Forum was held in Stavanger, Norway on Wednesday, April 29, 2015. The meeting, sponsored by the Norwegian Petroleum Directorate (NPD) and Schlumberger, had an action-packed lineup of speakers for the day. Robert Skaar from Statoil acted as keynote speaker with his topic “Putting the Focus on Facilities Asset Lifecycle: Design, Construction, Commissioning and Operations.” There was an update on the OGI Project, a panel on an industry perspective on standards, another panel on the common data management issues that all SLC members face, and a session on “Regulatory Challenges for Facility Management” by Ove Ryland, EPIM. The day wrapped up with the popular session on “Endorse, Implement or Embed” and ended with a discussion on the next steps for the Standards Leadership Council.

Presentations from both forums are available on standardsleadershipcouncil.org.

Foundations | 2Q2015 | 15



SLC Spotlight: Energistics By Jana Schey, Vice President of Operations, Energistics

Energistics® is a global, not-for-profit, membership organization that serves as facilitator, custodian and advocate for the development and adoption of open data exchange standards in the upstream oil and gas industry. Our members include all types of companies – integrated, independent and national oil companies, oilfield service companies, hardware and software vendors, system integrators, regulatory agencies – and the global standards user community. Serving the industry for 25 years, Energistics unites upstream industry professionals in a neutral and collaborative environment where they come together to develop these standards.

Energistics data exchange standards provide an industry-defined, common data format for key data used in exploration and production (E&P) workflows. Organizations that adopt the standards can read and write the common format, which not only facilitates data transfer but improves operational efficiency. For example, a US operator reports that use of Energistics standards allows the company to produce standardized hydraulic fracture treatment reports that support its production optimization workflows and help meet regulatory reporting requirements. An integrated oil company reports that standards for exchanging well completion data allow it to streamline the handover between operations and well services groups to improve safety and efficiency in well workovers. A national oil company is using data exchange standards in its subsurface workflows, which makes it possible to update only those parts of an earth model that have changed, making it easier for asset teams to iterate on multiple realizations to reduce uncertainty.

ENERGISTICS FLAGSHIP STANDARDS ARE WITSML™, PRODML™ AND RESQML™.

WITSML is for drilling-related data. It was developed to promote right-time, seamless flow of well data between operators, service companies and regulatory agencies to speed and enhance decision-making and reporting. The latest version, V1.4.1, enables robust product certification testing for servers, which will help vendors test their products and help assure users that these products are WITSML-compliant. The next generation of WITSML will support streaming real-time data with updated technology; the new version will be published in Q2 2015.


PRODML is for production-related data, characterized by the flow of fluids from the point they enter the wellbore to the point of custody transfer, together with the results of the field services and engineering analyses required in production operations workflows. New data objects are being developed to support specific PVT workflows and to transfer routine production data between joint-interest partners. Look for the next release in the second half of 2015.

RESQML is for transferring reservoir and earth models, enabling faster and more reliable exchanges between the many software packages used in subsurface workflows. The newest capabilities mean that E&P professionals can choose which data to exchange – from complex, comprehensive models to only the parts of a model that have changed – which dramatically improves workflow flexibility.

These flagship standards rest on a foundation of the Energistics Common Technical Architecture (CTA), which means the domain standards share components for common tasks. The CTA includes, among others, the Energistics Unit of Measure standard, the Energistics Packaging Convention, HDF5 for large data sets and standard naming conventions. The Energistics Transfer Protocol (ETP) is an important part of the CTA; it uses WebSocket, Avro and JSON to transfer real-time and static data from a server to a client. The protocol is a simple API consisting of messages passed between the client and server to initiate and close sessions, identify the data available on a server, initiate transfer of some subset of that data, and perform other functions. ETP will eventually replace the existing APIs in all the Energistics standards. To learn more about Energistics and our standards, or to get involved, contact info@energistics.org.
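The session-oriented message passing described above – open a session, discover what the server holds, request a subset of the data, close – can be sketched as follows. This is an illustrative sketch only: the message names and URI strings are invented for the example and are not the actual ETP schema, and real ETP messages are Avro-encoded rather than plain JSON.

```python
import json

def make_message(msg_type: str, body: dict, msg_id: int) -> str:
    """Wrap a payload in a minimal envelope, as an ETP-style protocol
    might do before sending it to the server over a WebSocket."""
    return json.dumps({"type": msg_type, "id": msg_id, "body": body})

# A hypothetical client session: open, discover, request, close.
open_session = make_message("OpenSession", {"application": "demo-client"}, 1)
list_resources = make_message("ListResources", {"uri": "eml://witsml/well"}, 2)
get_data = make_message("GetData", {"uris": ["eml://witsml/well/W-1/log/GR"]}, 3)
close_session = make_message("CloseSession", {}, 4)
```

In a real implementation each envelope would travel over an open WebSocket connection, and the server would answer with corresponding response messages.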

SLC

About the Author Jana has worked in the oil and gas industry for over 14 years and joined Energistics in 2008. In addition to providing leadership for Energistics, Jana is responsible for all Special Interest Group operations and oversees internal operations.

Gerald ‘Jerry’ Edwin Hubbard (1951–2015) As we rang in the New Year, we lost one of our own. Jerry Hubbard, President and CEO of Energistics, passed away in his home on January 1, 2015. Jerry played a pivotal role in the standards community: within Energistics, as founder and co-chair of the Standards Leadership Council, and as a passionate advocate of standards as practical and useful tools for industry.

Jerry will be missed by all of us.



What is Master Data Management? By Guy Holmes, Chairman, SpectrumData

I am hearing much about Master Data Management (MDM) from all industry sectors – not just oil and gas, but commercial organizations, software providers, miners and more. On my quest to talk intelligently about it over dinner with my nerdy database friends, I decided to create “MDM for Dummies.” What follows is not for dummies, but for you – the intelligent and worldly readers of Foundations.

Master Data Management is the set of policies, standards and tools for the consistent management of an organization’s data. The basic goal of MDM is to provide one version of the key business records – a single point of reference. It includes policies and processes for managing and accessing the data from or within that single source.

PPDM VS. MDM?
MDM is a concept or “way of life,” whereas the PPDM data model is far more tangible. PPDM provides the place to live the MDM dream. PPDM frequently exists where MDM does not. Conversely, MDM can never exist without a database built on PPDM or a similar specification. MDM seeks to manage all key business data across the enterprise in a single point of reference, whereas PPDM has restricted its model to areas relevant to the E&P industry.

TO WHAT INDUSTRIES DOES MDM APPLY?
The MDM concept is relevant to any company, whether it sells guitars or explores for oil, whereas PPDM is limited to one industry. Because MDM is a way of life, it can be applied to any business. Consider a company that sells guitars. Its core business information might consist of the models it sells, its customers and their addresses. Without a consistent system and process for managing this fundamental information, it won’t sell many guitars or be in business for very long. It is easy to see the wide reach that MDM might have.

A PPDM database performs the physical function of storing core business information. Let’s consider well data. PPDM enforces a unique well identifier, a set of coordinates, a total depth, and so on for each well. These foundation blocks are critical to the myriad data and metadata related to the well. If information is wrong or incorrectly duplicated elsewhere, the flow-on effects can be massive across the organization.
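The “single point of reference” idea can be made concrete with a small sketch. The field names below are hypothetical – this is not the actual PPDM WELL table – but it shows the core MDM move: collapse incoming records to one master copy per unique well identifier, and flag any copies whose facts disagree rather than silently keeping duplicates.

```python
# Hypothetical well records arriving from two source systems.
incoming = [
    {"uwi": "100/06-21-045-12W5", "surface_lat": 51.93, "total_depth_m": 3120.0},
    {"uwi": "100/06-21-045-12W5", "surface_lat": 51.93, "total_depth_m": 3120.0},
    {"uwi": "102/11-04-052-09W5", "surface_lat": 52.45, "total_depth_m": 2875.5},
]

def build_master(records):
    """Collapse duplicates by unique well identifier; flag conflicting copies."""
    master, conflicts = {}, []
    for rec in records:
        uwi = rec["uwi"]
        if uwi not in master:
            master[uwi] = rec
        elif master[uwi] != rec:   # same well, different facts
            conflicts.append(uwi)  # must be resolved, not silently duplicated
    return master, conflicts

master, conflicts = build_master(incoming)  # two unique wells, no conflicts
```

A real MDM implementation adds the governance around this step – who resolves the conflicts, and which source system wins – but the data structure is the same.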

WHAT DATA IS CONSIDERED “MASTER DATA”?
Much of the data we have in our businesses probably does not qualify as master data. Well production statistics and the location of a drilling rig in transit are more transactional in nature than master. The key difference is that master data elements tend to repeat in other data sets, so consistency is key to quality. Getting these building blocks right and tightly controlling their policies provides the backbone for many other data sets.

The use of the PPDM model itself has faced the same obstacles as getting an MDM system in place, and management buy-in for changes to fundamental systems is never easy to get. I believe this stems less from the cost of doing it and more from the distraction and disruption that management sees in implementation. PPDM has managed to convince software vendors in the oil sector to use the PPDM model as part of their underlying software solutions, which means companies end up using the PPDM model because they purchased a software product that uses the model under the hood. However, I do wonder how many companies then end up using the underlying database structure that PPDM provides as a sort of MDM central repository. How often do people using OpenWorks link that database to the Petrosys PPDM-compliant database so the two applications share data? In my view, the decision to create, maintain and manage that shared link is the start of the MDM style of thinking. In fact, the decision to integrate these two databases is fundamental to the whole MDM concept.

HOW DO YOU GET MDM BUY-IN?
It is probably safe to say that most medium to large oil and gas explorers would benefit from MDM. However, the problem most of us face is convincing management of the need for and value of an MDM solution and, more importantly, the MDM way of life. MDM is a great idea, and using PPDM as part of that system is even better. But, as we all know, getting approval for something as game-changing as an MDM policy and tool set is where the whole thing starts to get shaky. If you want this to succeed, biting off more than you can chew is not a good idea. This is definitely an “eat the elephant one bite at a time” moment. As with any project, start with a whiteboard, a few work colleagues who share the passion and a strong cup of coffee. Sketch out what a small-scale project might look like. Find something with a significant impact that management will see. If they don’t understand the value proposition, traction will be slow. It must make their life easier or solve a recognized business problem.

WHAT ARE SOME IDEAS FOR A GOOD MDM STARTING POINT?
Let’s say an oil company intends to explore in North Africa. Start the MDM process rolling while the data is being assembled. It is a time of few preconceived ideas about the data and how to assemble it; everyone just wants to work with the best available data. MDM with a PPDM backbone would help resolve the usual disparate data sets that start to evolve as drilling, regulatory compliance and production each create their own operational databases. Help them all show impact, and make your life easier in the long run, by getting them to use an MDM source for the key data sets in their projects. It is far harder to rope them back into a system you build later if they have already started their own.

HOW DO YOU ENSURE AN MDM PROJECT SUCCEEDS AND, MORE IMPORTANTLY, GETS NOTICED? Ensuring an MDM project’s success is no different than ensuring the success of any other project. Organize the project, including scope and stakeholders, plan the work and associated schedules, set achievable objectives, gain management buy-in and manage for success.

The one thing about MDM that can ruin your chance of recognizable success is that it can solve problems by stealth. No siren goes off when you get to the end, and the organization will likely not see an immediate game-changing impact. MDM is more likely to make long-lasting differences that benefit many, get noticed by few, and reward management and the business more than the individual who kicked off the initiative.

TOP 5 REASONS MDM PROJECTS FAIL (IN NO PARTICULAR ORDER)
1. Biting off more than you can chew. Set achievable project goals!
2. Lack of support from key stakeholders. Show them some early benefits – especially on longer projects. Otherwise, project support slides, and you go with it.
3. Scope creep. Keep it lean, keep ‘em keen.


4. Insufficient data governance. Start from a strong base. Garbage in, garbage out.
5. MDM flow not linked to business processes. MDM must be part of the normal business flow, not an extra sideways step.

About the Author Guy has enjoyed a successful career in the international oil and gas industry for 17 years, having held professional roles overseeing logistics, data management, information technology, data preservation and recovery. Today, Guy is the Chairman of SpectrumData and a Director of several other companies operating in the oil and gas and import sectors. Guy has been married to his wife Amanda for 25 years and they have five beautiful children.


MORE THAN MAPPING

CUT THE TIME FROM STRATEGY TO SOLUTION

Make the most of what you have with Petrosys systems on a PPDM foundation. SOFTWARE

SERVICES

CONNECTIVITY

DATA MANAGEMENT

If you’ve been planning a data management strategy and need to speed up your delivery of a practical solution, then talk to Petrosys about our flexible web and desktop systems for E&P data management. Based on the PPDM data model, Petrosys dbMap® looks after your data without locking you in to long term proprietary solutions, and lets you pace the culture change associated with improved master data management to meet the budget realities of 2015. Learn more at www.petrosys.com.au/products-services/data-management.



Photo contest

Foundations photo contest

“BOW RIVER MEMORY” BY ELLEN WEST NODWELL 2nd Place in the 2Q2015 Foundations Photo Contest The most wonderful warm day before the September 2014 surprise storm…fishing in the Bow River! Bow River, Calgary, Alberta, Canada -- September 2014




On the cover:

“EARLY MORNING MIST NEAR SUNWAPTA PASS” BY JOHN HEEREMA 1st Place in the 2Q2015 Foundations Photo Contest

My daughter and I were on our way to backpack the Skyline trail in Jasper Park. As we were approaching the ice fields at the top of Sunwapta Pass, I noticed that the early morning mist was clearing, looking south toward Lake Louise. --Lake Louise, Alberta, Canada This photo was created by “stitching” several lower resolution photos together into a single seamless image. It is available as a large-format gallery canvas. John has been an avid photographer since learning to develop film as a child, and has photographed people and landscapes around the world. John particularly enjoys self-propelled mountain sports such as skiing, climbing, and backpacking. You can reach him at: (403) 949-2028 / ppdm@andromeda.ab.ca or by visiting www.andromeda.ab.ca/photo

Enter your favourite photos online at photocontest.ppdm.org for a chance to be featured on the cover of our next issue of Foundations!

Foundations | 2Q2015 | 23


No More ‘Crasternation’
Raster Log Calibration: A New Solution to an Old Well Log Problem By Gord Cope, featuring (clockwise, top-right) David Baker, Jim Williams, and Trudy Curtis

Well logs are the foundation of geologic and engineering interpretation. There are a multitude of other data sources, from mud logs to seismic and core data, but maps and cross-sections are primarily based on well logs. Every facet of the E&P segment of the oil and gas industry relies on well data from logs.

The original well log was a physical document, printed on paper, vellum or Mylar and stored with the operator or public agencies. Data vendors such as IHS, TGS and MJ Systems obtain the well logs and create digital images, known as raster logs. Vendors perform depth calibration to provide a digital depth reference on the raster log image. This creates another file associated with the well log, commonly known as the depth calibration file, whose primary role is to match pixels on the well log image with the depths those pixels represent in the wellbore. This file acts as a vehicle for the storage, maintenance and continuity of the raster image and its calibration through the lifecycle. It also gives the user the ability to prepare well-to-well structural and stratigraphic cross-sections and to display annotations on, or adjacent to, the raster image. These annotations may include formation tops, perforations, the mechanical status of the well, core intervals and descriptions, drill stem test intervals, oil, gas and water shows, and cumulative production volumes.
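At its core, a depth calibration file encodes tie points that map pixel rows in the raster image to measured depths in the wellbore; software then interpolates between them. The sketch below illustrates the idea with hypothetical field names – it is not any vendor’s actual file format.

```python
# Tie points as (pixel_row, depth_in_metres): the top and bottom of the
# calibrated interval on the scanned log image.
tie_points = [(120, 500.0), (2120, 1500.0)]

def pixel_to_depth(pixel: float, ties) -> float:
    """Linearly interpolate wellbore depth for a pixel row on the image,
    using the first and last tie points as the calibration interval."""
    (p0, d0), (p1, d1) = ties[0], ties[-1]
    return d0 + (pixel - p0) * (d1 - d0) / (p1 - p0)

print(pixel_to_depth(1120, tie_points))  # midway pixel -> 1000.0 m
```

Real calibration files carry many tie points (to handle stretched or spliced paper logs) plus metadata about the image and curves, but every vendor format ultimately performs this pixel-to-depth mapping.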

THE PROBLEM
Each data vendor has their own depth calibration file format. IHS uses .XML (RasCal V2), for instance, and TGS uses .SIF. Several complications arise because different formats result in fundamentally different files. Without a standard file format, it becomes difficult to manage the images from an information technology perspective. The image calibrations become project-specific or user-specific, limiting the productivity of individual users, reuse and sharing. The issue is compounded when there are multiple software applications within an organization.

Jim Williams is a reservoir engineer with long experience using raster logs. “Although we would like import/export functions to work flawlessly, the fact is they do not, and the user is left with the task of either fixing the images in the ‘receiving’ software or importing the same images, recalibrating the faulty images and managing a duplicate copy of image calibrations in another software application.”

A raster log is the digitally-scanned image of a well log.

For Williams, the issue came to a head several years ago when he was a Senior Petroleum Engineer at Kaiser-Francis Oil Co. “I was importing raster logs from PETRA to a new release of Geographix, and it was creating a differently formatted file in which only about 90% of the data matched up,” he recalls. “In fact, if you used different steps to calibrate the raster image, you would end up with a different format in the calibration file. When it was exported, those changes were not accommodated in the receiving software application.”

Although the problem has been around for years, industry professionals have relied upon a number of quick fixes. One common work-around is to manually perform the depth calibration. “Geologists and engineers perform this as a matter of course,” says Williams, “but there are several reasons why this is not a good solution. First, it takes a lot of time to perform, and second, the changes aren’t captured in the corporate database.” Sometimes, when time is of the essence, geoscientists skip the work-around altogether. If the mistakes are glaring, they will show up in a cross-section, but if they are not, operators may not even be aware there is a discrepancy. “With drilling and completions costing several million dollars per well, you don’t want to be chasing a prospect built on faulty data,” says Williams.

Obviously, a permanent solution needed to be found, and Williams began discussions with other industry professionals. “When I talked to stakeholders, it became clear that every geologist and engineer understood the need for an open, standardized format for all raster log calibration, but nobody within the data vendor or software application communities seemed motivated to tackle it,” he recalls.

Williams’ concerns had resonance with others. “There have been various discussions and efforts over the last few years,” says David Baker, a Senior Geological Advisor at Chesapeake Energy Corporation, “but Jim’s approach had broad appeal.”

“Oil companies liked the idea because it had the potential to reduce the non-productive time that geologists and engineers spent checking the validity of the data,” says Williams. “Software vendors liked the idea because it would eliminate a lot of time on the phone doing support for their users when they ran into data problems. Data vendors could see that a standardized format would allow depth calibration errors to be permanently eliminated from their data sets.”

In late 2012, Williams, Baker and others launched the Raster Log Calibration Project. Several companies soon began to contribute resources. “We looked at the various formats and decided that the IHS version had many of the necessary pieces to support modern WITSML coding; it had the right fields and tags,” says Baker. “They very graciously donated their format to the project, and the project team set to work adding and modifying tags to come up with the final standard format to support the identified use-cases.”

Chesapeake Energy, which participates in open standards database initiatives, recommended that PPDM and Energistics formally lead the project. “PPDM and Energistics were asked to become involved in the project for a number of reasons,” says PPDM CEO Trudy Curtis. “PPDM is primarily focused on stewarding data at rest; that’s what we do with our data model. Energistics is focused on data in motion; the depth calibration file is essentially a transfer of data from one format to another. PPDM and Energistics had already done some work in the area of raster log calibration; PPDM built raster log calibration support into Version 3.7 of its data model, and Energistics’ WITSML standard included support for calibration.”

The project relied upon the volunteer participation of a wide range of subject matter experts to achieve its goal. “I would like to mention the efforts of Zane Reynolds – IHS, Robert Best – Neuralog, Joey Magsipok – Energistics and, of course, Trudy Curtis and Ingrid Kristel – PPDM, and Jana Schey – Energistics, for managing the project,” says Williams, “but there are so many amazing participants. I want to thank them all for their hard work in making this project a success.”

Over the course of two years, the IHS format was subjected to a rigorous set of modifications to meet universal application. “We devised use-cases to see if it satisfied vendor and software application needs,” says Baker. “From the use-cases, it didn’t quite meet everyone’s needs, so we added to it.”

Phase I of the project has been completed. Using the IHS RasCal V2 format as the building block, participants have mapped the data components between the various vendor and software application formats and published the format into the public domain. “We have a standardized raster registration format that is now WITSML compliant and can be accessed via a standard server communication protocol,” says Baker. “Let’s say you have digital information being generated on a rig. The information is commonly delivered in a WITSML format. We also have a WITSML format for raster log calibration that can be delivered using the same existing server specification.”

“Now that we have a standardized format, there is so much more data we can add to the raster log calibration file,” says Curtis. “The next phase is to present the work to industry and get feedback from them; they have to tell us what they want and what they are willing to support.”

THE FUTURE
The future holds much more potential. “Here’s the flaw in the system: raster registrations are stored locally, but the raster images are not,” says Baker. “So, if I go to a vendor and retrieve a well for one project and make additions or corrections, my edits to the raster calibration are not available to my cohorts who are working on a different project but using the same well data.”

According to Baker, the Holy Grail is to tie the master registration to the raster image in a manner that allows universal access to edited information. “I want to be able to point my application at a WITSML server and ask it to get the registration and raster images together so that I can fix or add information and put it back so that everyone can share it.” Baker believes that goal can be achieved. “The beauty of WITSML is that it is a web service that all applications can recognize. You don’t have to write special code to understand it or see what is behind it.”

The goal, says Jim Williams, is worth the effort. “I gladly participated in the Raster Log Calibration Project because it makes my job easier,” he notes. “That’s the value; it will make everybody’s job easier.”

About the Authors
Jim Williams is a reservoir engineer and member of the Raster Log Calibration Project. E: jimb.williams@hotmail.com / Ph: 918-409-6711
Trudy Curtis is CEO of PPDM.
David Baker is a Senior Geological Advisor at Chesapeake Energy Corporation.
Gord Cope is an international energy correspondent and author. He recently released a travel memoir, A Paris Moment, in eBook form. www.gordoncope.com

Our brand new website is LIVE!
> Log in and see our NEW EPIC website!
> Customizable User Profiles
> Register Online and Track your Participation
> Easier Navigation
Visit: ppdm.org



Data Professionals Gain Many Kinds of Value from PPDM Training PPDM takes an in-depth look at the value and experience of its in-class training program By Gord Cope

For the last 28 years, RigData has been supplying information on drilling permits and rig-related data to thousands of companies in the O&G sector. In 2014, the firm decided to expand its business to include completions and production, and installed PPDM’s data model to support the new service. “PPDM is the industry standard, and we decided to install it in order to make our data more consumable to our clients,” says Nicole Wilson, Data Acquisition Manager for RigData. “We soon realized that it was a huge bear of a model – it’s not something you can just plug-and-play.”

To get a better understanding of the functions and value of the PPDM Data Model, Wilson and other RigData employees enrolled in a series of courses taught by PPDM. “I took four courses, and it all made so much more sense,” she recalls. “The instructor, Pat Rhynes, is an exceptional trainer.”

“I think one of the most satisfying aspects of teaching is when someone gets that ‘Aha!’ moment, when what they are learning clarifies an issue that they have been struggling with,” says Rhynes. “You know you’ve really made a difference.”

Rhynes has been active in the oil patch for almost 40 years, working in many of the core disciplines for O&G companies around the world. Since 2011, the semi-retired Subject Matter Expert has been teaching courses on behalf of PPDM. “We have two types of classes: public classes that are open to anyone, and private sessions in which we tailor learning to the objectives of the client,” he notes. Most of the training has occurred throughout North America, but PPDM has also held sessions in the Middle East, South America and Australia. The courses cover a wide selection of topics.

PPDM Training is delivered as eLearning, online webinars, and private or public classroom sessions.

DATA MANAGEMENT CLASSES
• Business Lifecycle of the Well. An excellent general overview of the business lifecycle of the well within the petroleum asset lifecycle, including planning, drilling, completing, producing and disposing.
• Introduction to Master Data Management (MDM). Learn about master data management, effective data management strategies and the role that data management can play in an enterprise MDM solution.
• Introduction to PPDM Association. Introduces professional petroleum data management and standards and the role that standards play in helping address major challenges facing the oil and gas industry.
• What is a Well (WIAW). Takes you through the global differences in understanding what exactly is a well.
• Well Status and Classification. Arms you with a comprehensive understanding of well status and classification related to petroleum data management and the PPDM.

PPDM DATA MODEL CLASSES
• Data Model Overview. Using a workshop approach, the class learns how to apply the many subject areas in the PPDM Data Model in a sample integration exercise.
• Implementation Head Start. Helps you understand the architectural principles behind the PPDM Data Model. We review a practical approach to understanding and implementing the model.
• Meta Model Management and Reference Tables. Reviews the value of the Meta Model and how to use it effectively, and provides an overview of both standard and dynamic reference tables.
• Organic Geochemistry. Previews the PPDM Data Model subject areas for organic geochemistry and sample management.
• Seismic. Navigates through the complexities of seismic data and the nuances associated with storing that data within the PPDM Data Model.
• Well Data Management. Helps you understand the Well Model contained within the PPDM Data Model. Understanding the primary table, WELL, and its header information is key to implementing a well solution. We review all of the Well modules, emphasizing the main requirements for each.
• Well Logs. Learn how the PPDM Data Model can be used to manage information about well logs and curves and operational logging details, and how to catalogue what logging data is available.
• Well Test and Production Volumes. Covers two specific groups of tables within the Well Module: Well Tests (both formation and production) and Production Volumes.
• What's New Data Model PPDM 3.9. Provides an overview of the changes in this new release, including key changes from the previous version, new subject areas, code translation changes, deprecated areas, and architectural principles and compliance issues.

[Photo: Pat Rhynes instructs a class in Calgary, Alberta, Canada, Summer 2014.]

PPDM also offers online training, including courses in the US Land Survey System, US Well Spotting, What is a Well (WIAW), Business Lifecycle of the Well, and PPDM Data Model Design Principles. Most courses last half a day. Discounts are available for members and for groups, and online training courses offer additional discounts.

Corporate response to the courses has been largely positive. "We have had some encouraging feedback from companies, especially for the non-technical sessions, such as WIAW and Business Lifecycle of a Well," says Rhynes. "They tell us 'everyone in our company should take these courses.' Some of these companies have more than 500 employees, which raises logistical issues, but believe me, it's a nice problem to have."

But the most satisfying feedback comes from the students. "One of the most valuable parts of the training sessions isn't the information itself, but the fact that many different disciplines attend and learn more about what their co-workers do and how it affects them," says Rhynes. The instructor recalls teaching a

Business Lifecycle class in which one of the students was from accounting. "She received technical information that she needed for tax returns, but did not know its source or relevance," he notes. "She was quite excited to learn who was generating the information and its value. The end result was that the data technicians were able to provide better data that improved the accounting functions; it was a huge step forward for all concerned."

"Within our company, I recommend that anyone who is touching the data take the training," says Wilson. "As for other companies looking to install the PPDM Data Model, taking some training first will make your job easier, from initial mapping to final roll-out."

"One of the responses I hear is 'We wish we had taken this class last year, as it would have saved us a lot of problems,'" says Rhynes. "Even better is when they say 'We're just starting to set up a project and this is going to help us do a better job.'"

In the future, PPDM is looking to offer the classes on a more regular basis and in more international locations. In the meantime, Rhynes is happy to be making a positive difference in the professional lives of data professionals around the world.

"I have been blessed with the opportunity to work in the O&G sector, to meet wonderful people and learn so many skills," says Rhynes. "Conducting the training sessions has been a tremendous opportunity for me to give some of that back. If there are people reading this who feel the same way, then I recommend they contact PPDM to find out how they can contribute; I can guarantee you that it is greatly satisfying to pass on valuable information and experience to younger generations."

To learn more about PPDM courses and our 2015 public training schedule, please visit our new website at PPDM.org or contact the training coordinator at training@ppdm.org.

About the Author Gord Cope is an international energy correspondent and author. He recently released a travel memoir, A Paris Moment, in eBook form. www.gordoncope.com



Upcoming Events

WORKSHOPS

JULY 30, 2015
Brisbane Data Management Workshop
Brisbane, QLD, Australia

AUGUST 11, 2015
Oklahoma City Data Management Workshop
Oklahoma City, OK, USA

SYMPOSIUMS

AUGUST 5–6, 2015
Perth Data Management Symposium
Perth, WA, Australia

OCTOBER 19–21, 2015
Calgary Data Management Symposium, Tradeshow & AGM
Calgary, AB, Canada

LUNCHEONS

AUGUST 18, 2015
Tulsa Q3 Data Management Luncheon
Tulsa, OK, USA

AUGUST 20, 2015
Dallas/Fort Worth Q3 Data Management Luncheon
Dallas/Fort Worth, TX, USA

AUGUST 25, 2015
Midland Q3 Data Management Luncheon
Midland, TX, USA

SEPTEMBER 9, 2015
Calgary Q3 Data Management Luncheon
Calgary, AB, Canada

SEPTEMBER 15, 2015
Denver Q3 Data Management Luncheon
Denver, CO, USA

SEPTEMBER 17, 2015
Houston Q3 Data Management Luncheon
Houston, TX, USA

UPCOMING PUBLIC TRAINING SESSIONS
PPDM recently announced that it is supporting its membership by DISCOUNTING ONLINE TRAINING COURSES! Online training courses are available year-round and are ideal for individuals looking to learn at their own pace. For an in-class experience, private training is now booking for 2015-2016. Public training sessions will resume in the third quarter of 2015.

All dates subject to change.

VISIT PPDM.ORG FOR MORE INFORMATION
Find us on Facebook | Follow @PPDMAssociation on Twitter | Join PPDM on LinkedIn



Write the Articles. Be the Authors.
> Letters to the Editor
> Guest editorials
> Technical papers
and much more!

Interested in being published in Foundations? Visit ppdm.org/publications to find out how!

ARE YOU CERTIFIED? SHOW THE INDUSTRY THAT YOU POSSESS THE SKILLS TO NAVIGATE THE UNIQUE CHALLENGES OF DATA MANAGEMENT IN THE E&P COMMUNITY.

BECOME CERTIFIED NOW AS A

CERTIFIED PETROLEUM DATA ANALYST PPDM.ORG/CERTIFICATION

Join the action!


Warning: Our data has gone mobile

Now, get geoLOGIC's value-added data almost any place, any time, any way you want it. Available through gDCweb on your tablet, smartphone or computer. With 30 years of data experience behind it, gDC is the source for high-quality, value-added well and land data from across Western Canada and the Northern United States. Another plus: our data is accessible through an expanding range of industry software utilizing our own easy-to-use gDC GIS and our geoSCOUT software. View, search, import and export well, land and production data, documents, logs and more from almost anywhere. For more information, visit our website at www.geoLOGIC.com

Leading the way with customer-driven data, integrated software and services for your upstream decision-making needs. geoSCOUT | gDC | petroCUBE at www.geoLOGIC.com

