Abstract
During the design and construction phases of a pipeline project, there is a unique opportunity
to capture highly valuable information from contractors and vendors. If quality-controlled,
managed effectively and disseminated correctly, this information will deliver significant
efficiency, cost and integrity benefits, both during the project phase and into operations. Some
of the key aspects in effectively turning data into valuable information are presented, illustrated
by lessons learned from several large-scale onshore and offshore projects.
1. Introduction
A plethora of data is generated during the design and construction phases of any project. If
properly accessible, organized and reliable, this data can provide a wealth of information to all
the parties involved in design, construction, and after that, those involved in operations,
monitoring & maintenance, and eventual repair, modifications and decommissioning.
The presentation will discuss experience and lessons learnt on a number of pipeline
projects, both subsea and onshore. This knowledge sharing is intended to help future projects
plan for success, which in this context means improved safety and improved efficiency at a
lower cost. As in any other project-like activity, early planning and design lay the foundations
for successful outcomes.
The following sections of this extended abstract briefly outline the key topics to be
covered.
One key aspect of a successful process of turning data into valuable information is
engaging early with the client, contractors and vendors. The earlier this process commences,
the smaller the effort. Having the information quality-controlled and effectively managed as it
is generated is far less labour-intensive than doing it after it has been recorded in a non-ideal
way.
Having the client buy into the data management process, and imposing simple
requirements on contractors and vendors for the way they record the information they generate,
can bring significant cost savings. Client engagement should include project managers,
engineers, and personnel in procurement, information management, commissioning and, most
importantly, operations. The operations personnel will be the end-users for the longest time,
over the entire operating life of the asset.
The project team should also have champions to promote and support the process, and
users who will operate the information management system. These should also be identified
and engaged as early in the project as possible.
Contractors, vendors and suppliers should also be engaged early, so as to take any
additional information management requirements into account from the beginning of their processes too. Such
additional requirements are often very simple to implement if considered from the beginning.
Specifications for gathering and managing data should be clear and simple. They
should include the data dictionary, clear indications of which units are to be used, and examples
illustrating usage to facilitate quick and effective understanding. Simple data templates that
clearly define the information required from each contractor and supplier scope should be
generated and shared with the third parties involved to improve the quality of received data.
Table 1 illustrates one such template, for pipe coating data.
The requirements should be set up in such a way that the “what”, “when”, “who”,
“how” and “why” are all suitably recorded. They should also be as simple as possible, avoiding
unnecessary aspects and complications such as domains, data types and character limits.
Table 1. Example of clear and simple requirements for pipe coating data gathering
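Since the body of Table 1 is not reproduced in this extract, the sketch below gives a purely
illustrative impression of what such a template might contain: a minimal data dictionary for
pipe coating records, expressed in Python, with an explicit unit and an example value per field.
All field names, units and values are assumptions for illustration, not the actual project
specification.

# Hypothetical data dictionary for a pipe coating template.
# Field names, units and example values are illustrative assumptions,
# not the specification referenced in Table 1.
PIPE_COATING_TEMPLATE = {
    "pipe_number":       {"description": "Unique pipe joint identifier", "units": None,       "example": "PJ-000123"},
    "coating_type":      {"description": "Applied coating system",       "units": None,       "example": "3LPE"},
    "coating_thickness": {"description": "Measured coating thickness",   "units": "mm",       "example": 3.2},
    "application_date":  {"description": "Date coating was applied",     "units": "ISO 8601", "example": "2019-05-14"},
    "applicator":        {"description": "Coating plant / vendor name",  "units": None,       "example": "Vendor A"},
}

def describe_template(template: dict) -> None:
    """Print the template as a simple, human-readable specification."""
    for field, meta in template.items():
        units = meta["units"] or "-"
        print(f"{field:<20} units: {units:<10} e.g. {meta['example']}  ({meta['description']})")

if __name__ == "__main__":
    describe_template(PIPE_COATING_TEMPLATE)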
The presentation will also discuss the importance of robust data management tools that
enable data managers to focus on completeness and quality rather than on the underlying data
model. Such tools should provide visibility of received data and loading progress, and have
built-in checks to ensure the data is validated prior to loading. They should provide the ability
to update and unload data without compromising database integrity, and allow management
and synchronization of data between multiple environments, enabling quality checks to be
carried out prior to release of information. They should also produce automatic records making
the process auditable, from receipt through to release. Figure 2 shows the data management tool
developed by Wood, which is an integral part of its Nexus GIS system.
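As a minimal sketch of the validate-then-load pattern described above, the following Python
fragment checks a delivery file against a set of required fields and writes an auditable record
of the outcome, whether the file is accepted or rejected. The field names, file format and
logging approach are illustrative assumptions and are not taken from the Nexus GIS tooling.

import csv
import datetime

# Hypothetical required fields; a real tool would read these from the
# project data dictionary rather than hard-coding them.
REQUIRED_FIELDS = ["pipe_number", "coating_type", "coating_thickness"]

def validate_rows(rows):
    """Check each row before loading; collect errors instead of failing fast."""
    errors = []
    for i, row in enumerate(rows, start=1):
        for field in REQUIRED_FIELDS:
            if not row.get(field):
                errors.append(f"row {i}: missing '{field}'")
        try:
            if row.get("coating_thickness"):
                float(row["coating_thickness"])
        except ValueError:
            errors.append(f"row {i}: coating_thickness is not numeric")
    return errors

def load_with_audit(path, audit_log="audit.log"):
    """Validate a delivery file and record an auditable outcome either way."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    errors = validate_rows(rows)
    outcome = "REJECTED" if errors else "ACCEPTED"
    # Automatic record, making the process auditable from receipt to release.
    with open(audit_log, "a") as log:
        log.write(f"{datetime.datetime.now().isoformat()} {path} {outcome} "
                  f"({len(rows)} rows, {len(errors)} errors)\n")
    return errors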
Pipeline Open Data Standard (PODS, ref. www.pods.org) model extensions are used to
support and enable cross-validation of the large volumes of non-spatial data collected during
the project, whilst streamlining the process of developing the as-built model.
Large volumes of non-spatial data are captured during the manufacturing, fabrication
and installation phases. Typically, this data is loaded and related in PODS once the physical
location is known; however, it contains a great deal of high-value information for the project
even before that. In response, Wood has created a new sub-model, based on the sequence of
events during the project and the information requirements of the wider project team.
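A minimal, hypothetical sketch of such an event-based record is shown below: each
manufacturing, fabrication or installation event is keyed to the component identifier rather
than to a spatial location, so the data is usable before the as-built position is known. The
structure and field names are assumptions for illustration, not the actual PODS extension.

from dataclasses import dataclass, field
from typing import Optional
import datetime

@dataclass
class ComponentEvent:
    """One event in a component's life, usable before its location is known.

    Fields are illustrative assumptions, not the actual PODS sub-model.
    """
    component_id: str                  # e.g. pipe joint number ("what")
    event_type: str                    # e.g. "coated", "welded", "installed" ("how")
    event_date: datetime.date          # "when"
    recorded_by: str                   # contractor or vendor ("who")
    attributes: dict = field(default_factory=dict)  # event-specific data
    location_km: Optional[float] = None  # filled in once the as-built position is known

# Events accumulate during manufacturing and fabrication...
events = [
    ComponentEvent("PJ-000123", "coated", datetime.date(2019, 5, 14),
                   "Vendor A", {"coating_type": "3LPE", "thickness_mm": 3.2}),
]

# ...and are related to the spatial model later, once stationing is known.
events[0].location_km = 12.450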
4. Acknowledgements
The authors would like to thank Wood for permitting and supporting this presentation.