
Developing Industry-Recommended Practices for Effective Emissions Management


In a rapidly changing world where technology and regulations intersect, effective emissions management and reliable data remain at the forefront of industry concerns. Making significant strides in shaping the future of energy and emissions management are Cristina Lopez, Methane Emissions and Renewable Gases Process Research Engineer at GRTgaz in France, and Jay Thanki, Decarbonization Systems Engineer at TC Energy in Canada.

At a recent online event held by the Industrial Decarbonization Network, Jay and Cristina joined a panel discussion on ‘Developing Industry-Recommended Practices for Automating Emissions Measurement and Monitoring’, moderated by Liz Arthur, Senior Director of Growth at Project Canary, to share expert perspectives on the pursuit of effective emissions management. Read on as they reveal best practices and approaches to navigating the complex landscape of data, compliance, and quantification.

Metrics That Matter

Jay Thanki: The two metrics that I really consider when evaluating new types of technology, whether they are OT (operational technology) physical devices or IT software and platform applications, are scalability and compatibility.

The closer we can get to having a single metric to gauge our performance and assess the company as a whole, the more practical it becomes. So, when examining emissions data, I search for data pathways and data streams within the organization that can support scaling. This is especially crucial in the energy industry because, historically, oil and gas has been somewhat slow in adopting and utilizing technology compared to more advanced sectors.

I'm always on the lookout for opportunities to scale, and when I talk about compatibility, I'm referring to the vast array of available technologies. The key is to identify the right areas within our asset base or the digital landscape where these technologies can support us on our journey to consolidate our emissions data or any other data, for that matter.

READ: Investor Insights: Best Practice for Emissions Reporting and Mitigation

Cristina Lopez: About 10 years ago, our primary focus in our lab was accuracy. Our work involved testing detectors to determine which one provided the most accurate measurement of methane concentration. However, with the impending regulatory changes, our priorities have broadened from accuracy alone to scalability and accurate quantification.

This shift is driven by the fact that current regulations require us to integrate various technologies, including bottom-up approaches, top-down methodologies for site-level emissions, and continuous monitoring. It is challenging to obtain consistent numbers across these diverse technologies.

Each technology serves a different purpose... Handheld detectors aim for the highest accuracy, providing a snapshot of leak activity during specific moments because leak flow rates can vary throughout the day. On the other hand, top-down measurements offer a broader perspective in terms of time and scale, but their quantification might be less precise.

Therefore, it is crucial to incorporate scalability by deploying continuous monitoring devices across our assets, as they can help reconcile the diverse data sources. Plus, as we work towards harmonizing the various datasets to provide a comprehensive picture, it's important to address the issue of uncertainty.
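One common way to reconcile a bottom-up inventory with a top-down, site-level figure while carrying uncertainty through is inverse-variance weighting. The sketch below is only an illustration of that idea, not GRTgaz's methodology; the numbers and the `reconcile` helper are hypothetical.

```python
# Minimal sketch: combining a bottom-up and a top-down site-level methane
# estimate by inverse-variance weighting. All values are illustrative.

def reconcile(estimates):
    """Combine (value, one-sigma uncertainty) pairs, in kg/h, by inverse variance."""
    weights = [1.0 / sigma**2 for _, sigma in estimates]
    total_w = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total_w
    sigma = (1.0 / total_w) ** 0.5
    return value, sigma

# Hypothetical inputs: a bottom-up inventory of 12 +/- 2 kg/h and a
# drone-based top-down estimate of 18 +/- 6 kg/h for the same site.
combined, uncertainty = reconcile([(12.0, 2.0), (18.0, 6.0)])
print(f"reconciled estimate: {combined:.1f} +/- {uncertainty:.1f} kg/h")
```

The weighting naturally pulls the combined figure towards whichever estimate is more certain, which is the behaviour you want when the top-down number covers more of the site but is noisier.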

Approaching Standards and Regulations

Cristina Lopez: The upcoming regulation will require European operators to conduct a minimum of two campaigns using detectors, including handheld cameras or remote laser detectors. The idea is to detect the biggest leaks in your assets. This means you'll need devices with a detection limit of at least 17 grams per hour. Additionally, they are calling for more accurate campaigns, with at least one per year using sniffer devices.

This has sparked a debate in Europe, as the appropriate detection limit can vary depending on the nature of your assets. Some assets, particularly those with mainly underground pipelines, may have detection limits that are significantly lower in reality.

So, there's an ongoing discussion on how to reach a consensus, given that it can be more challenging for underground operators to gather accurate data for these campaigns. It's essential to note that, in Europe, this is still in the debate phase, and nothing is set in stone.

As for our approach at present, we are testing detectors and acoustic cameras in controlled lab conditions. Once we're confident that these detectors perform well under controlled release conditions, we move to real assets to conduct the campaigns. When testing a new detector, we also carry a known one with us for comparison. This practice helps us maintain accuracy and demonstrates our commitment to complying with the upcoming regulation.

Additionally, OGMP 2.0 is calling for the implementation of drones or vehicle-mounted devices to monitor emissions at the site level. So, within the Consortium Group of European Research for Gas, we've conducted a benchmark of all the technologies currently available in the market. Similar to our lab tests, we also conduct controlled release tests to assess accuracy. Once we have identified technologies that are better suited for specific types of assets, we can recommend them to our clients. For instance, if you have an underground gas storage facility, we might suggest using drones, as accessing underground areas with a vehicle can be more challenging.

Our current focus is on reconciling the various data sources. To achieve this, we are developing the best methodology for each type of asset, conducting uncertainty studies, and planning how to implement this data into our future strategies to monitor asset evolution.

LISTEN: Methane Talks, Episode 4: From Data to Action: Monitoring Methane Emissions with Project Canary

Jay Thanki: At TC Energy, we have a team of subject matter experts and specialists who are dedicated to studying the regulatory landscape for our various businesses. It's important to note that TC Energy is not just a single gas pipeline company; we encompass around 25 to 30 distinct businesses that all feed into one interconnected gas pipeline network. Some of these are physically connected, while others are not.

My primary focus is delving into the unique journey and data maturity of each of these businesses, examining their asset hierarchies, and understanding the similarities between their physical assets.

Our objective is to invest time and effort in evaluating the capabilities of our facilities, be it through process improvements or the implementation of technology, to ensure that data flows smoothly where it's needed. The more we understand and enhance the capacity of our facilities in this regard, the more valuable it becomes to our entire enterprise. This positions us well to meet any regulatory requirements, regardless of their level of stringency. Much of this work resembles a data engineering exercise, given the thousands of facilities we have generating data, which then flows through different channels and hands within our organization before being reported to governing bodies and regulators.

It is challenging to be an expert in every aspect of our industry, from operational technology and information technology to the rapidly evolving tech landscape and regulations. So, my approach is to break it down into its mathematical and scientific aspects. This way, regardless of where regulations may lead, the data will remain valuable and help inform our decision-making.

To me, it's a fact-finding exercise, working closely with subject matter experts who closely monitor the regulatory landscape. When they flag that requirements are changing, I offer insights into how close or far we are from meeting those requirements, particularly in the operational technology domain.

Quantifying Emissions

Cristina Lopez: Regarding quantification, site-level or top-down solutions are not the most accurate at the moment. Compared with other quantification methods, such as the bottom-up approach that relies on directly measuring the flow, top-down methodologies rely on algorithms like Gaussian plume dispersion or mass balance, which depend on multiple data sources, including GPS data. Integrating all of this data into a single dataset can be quite challenging. This is why, in Europe, these methodologies generally tend to overestimate emissions, at least on midstream assets.
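For context, the Gaussian plume model mentioned here relates a downwind concentration to a source emission rate, so quantification amounts to inverting that relationship. The following is a deliberately simplified sketch (single point source, steady wind, fixed dispersion widths rather than ones derived from downwind distance and atmospheric stability), not the algorithm of any particular vendor; all inputs are hypothetical.

```python
import math

def gaussian_plume_conc(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (kg/m^3) at crosswind offset y
    and height z, for a point source of strength Q (kg/s), wind speed u (m/s)
    and release height H (m). sigma_y and sigma_z (m) are dispersion widths,
    which in practice grow with downwind distance and stability class."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

def invert_source_rate(c_measured, **plume_kwargs):
    """Back out Q from a measured concentration: the model is linear in Q."""
    unit_conc = gaussian_plume_conc(Q=1.0, **plume_kwargs)
    return c_measured / unit_conc

# Hypothetical downwind measurement near a wellhead-like source.
Q = invert_source_rate(c_measured=5e-6, u=3.0, y=0.0, z=2.0,
                       H=3.0, sigma_y=8.0, sigma_z=5.0)
print(f"estimated source rate: {Q * 3600:.1f} kg/h")
```

Because the inferred rate scales directly with the assumed wind speed and dispersion widths, small errors in those inputs translate into large errors in the quantified emission, which is why these methods carry more uncertainty than direct measurement.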

Currently, due to OGMP 2.0, we are still using these quantification methodologies, but we are exploring new approaches. Some companies don't disclose the specific algorithms they use, but in one campaign we deployed at an underground gas storage site with a drone, we noticed that the quantification results were very close to those obtained using the bagging method. This was likely due to the use of the Gaussian plume dispersion algorithm, as the wellheads in these underground facilities resemble chimneys with similar geometry.

However, we noticed that in the main areas with a lot of equipment, the geometry was not well-suited for this algorithm, which resulted in inaccurate quantification. To address this, we performed direct measurements using a device we developed. The method involves setting the flow rate based on the concentration of the leak, which is measured beforehand. Then, after aspirating the leak, we measure the concentration of the sample and use correlation to determine the leak size. This approach has shown good agreement with results obtained from the bottom-up method.
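The direct-measurement principle described here is similar in spirit to high-flow sampling: capture the whole leak in a known sample flow and infer the leak rate from how much the captured stream is enriched in methane over background. A minimal sketch of that mass balance, with made-up numbers and not GRTgaz's exact device or correlation, might look like this:

```python
# Sketch of a high-flow-sampling-style mass balance. All values are illustrative.

METHANE_DENSITY_KG_M3 = 0.668  # approximate density of CH4 at 20 degC, 1 atm

def leak_rate_kg_h(sample_flow_m3_h, c_sample, c_background):
    """Leak rate from a captured sample: methane volume fraction in the sample
    (c_sample) minus background (c_background), times the sample flow, converted
    to mass. Assumes the sampler aspirates the entire leak."""
    ch4_volume_flow_m3_h = sample_flow_m3_h * (c_sample - c_background)
    return ch4_volume_flow_m3_h * METHANE_DENSITY_KG_M3

# Hypothetical reading: 10 m3/h of sample flow, 0.8% CH4 against a 2 ppm background.
print(f"{leak_rate_kg_h(10.0, 0.008, 2e-6):.3f} kg/h")
```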

Currently, we recommend the bottom-up method, like the bagging method or high-flow sampling devices, for midstream operators because of its accuracy. However, it's time-consuming. So, while we still advocate for its use, it's worth implementing other devices to capture different leak tendencies. This is because the measurements represent a specific moment in the existence of the leak, and it's uncertain if the leakage rate will be the same later in the day. Plus, for the main areas of your midstream asset, direct measurements are highly recommended for quantification.

READ: The Orphaned Wells Program: Understanding the Funding Allocation, Challenges & Environmental Impact

Jay Thanki: Quantification is incredibly important for accurately measuring emissions. The precision and accuracy of your calculated emissions are dependent on the quality of the inputs you provide. Whether you're working with algorithms, machine learning, or big data science, the information you feed into the system plays an important role.

In essence, we're performing large-scale experiments and need to ensure that the outputs match our intended results. Much of the necessary information about our assets can be found in the engineering drawings. The challenge is locating these drawings, whether they're stored in the cloud or on-site. There are options available, such as conducting surveys or laser scanning, to gain a deeper understanding of the site.

Another valuable data source is Supervisory Control and Data Acquisition (SCADA), which provides real-time telemetry data from our facilities, offering insights into their operations, including factors like pressure, temperature, and line conditions. It can be a complex task to uncover the specific questions we need to ask of our data systems.

When exploring these multidisciplinary and cross-domain questions related to input data, it's essential to engage with experts like controls engineers, individuals responsible for managing drawings and databases, or the drafting department. It's a process of extracting relevant information, and while it can be challenging, the required data is available on the input side; we just need to find it.

At the end of the day, we're solving for gas properties, following the ideal gas law (PV=nRT). The inputs we provide determine whether the outputs are realistic or not. Finding the right technologies and accessing the right information is key. This can be a frustrating endeavor, especially when dealing with diverse facilities due to acquisitions, each with its own data and engineering standards. The solution lies in developing universal terminologies and language that allow different teams to collaborate efficiently, ask the right questions, and obtain accurate algorithm outputs. 
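As a concrete illustration of inputs driving outputs, here is a toy calculation, not TC Energy's method, that uses PV = nRT to estimate the methane released when a pipeline segment is depressured; the volume, pressure, and temperature figures are hypothetical stand-ins for values that would come from engineering drawings and SCADA.

```python
# Toy blowdown estimate using the ideal gas law, PV = nRT. Inputs are
# hypothetical stand-ins for drawing and SCADA data; a real calculation
# would also correct for gas compressibility and composition.

R = 8.314        # J/(mol*K), universal gas constant
M_CH4 = 0.01604  # kg/mol, molar mass of methane

def vented_mass_kg(volume_m3, p_initial_pa, p_final_pa, temp_k):
    """Mass of methane released when a fixed volume is depressured from
    p_initial to p_final at an (assumed constant) temperature."""
    moles_released = (p_initial_pa - p_final_pa) * volume_m3 / (R * temp_k)
    return moles_released * M_CH4

# Example: a 50 m3 segment blown down from 60 bar(a) to atmospheric at 288 K.
mass = vented_mass_kg(volume_m3=50.0, p_initial_pa=60e5,
                      p_final_pa=1.013e5, temp_k=288.0)
print(f"approximately {mass:.0f} kg of CH4 released")
```

Garbage in, garbage out applies directly: an out-of-date line volume or a pressure tag read at the wrong time changes the answer far more than any refinement of the equation itself.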

The Learning Curve

Cristina Lopez: One significant lesson came from the fact that, three years ago, we had no knowledge of top-down methodologies. So, when we initially began testing them, we discovered that some of these methodologies didn't work for certain assets. We had initially read articles from the United States suggesting that a specific technology, let's call it 'drone A', was highly effective. However, when we tested it in Europe, it didn't perform as well due to differences in asset geometry and other factors. This experience taught me a crucial lesson: the accuracy of data for one asset does not necessarily guarantee the same accuracy for another asset or client. It underscores the importance of adapting your strategy to specific contexts.

When it comes to methane mitigation, collaboration is key, and I believe it's the path to success. Communicating and working closely with other operators in the market is vital, especially in the world of emissions management.

It is also important to stay updated on emerging technologies. New solutions are introduced regularly, and it's worthwhile to conduct ongoing benchmark assessments. Gradually integrating additional technologies into your strategy can significantly enhance the accuracy of your data in the long run.

LISTEN: Methane Talks, Episode 3: GHGSat on the Importance of Remote Methane Monitoring

Jay Thanki: When it comes to lessons learned, two come to mind, especially because these two have a way of resurfacing frequently. 

First is, don’t reinvent the wheel. Our facilities have been operating reliably for over 70 years. We excel in delivering energy, and our asset base has been meticulously engineered by dedicated professionals over the years. We are not the only ones responsible for building pipelines; it's a collaborative effort within the industry. So, when I say, 'don't reinvent the wheel,' it essentially means to not reengineer. 

There are a lot of technologies and efficient practices already ingrained in how our assets function, often hidden in industry standards. It's easy to overlook these because when things work, you almost become immune to their effectiveness. Recently, I've been fascinated by reading older engineering standards, like ISA-95. These standards provide a clear hierarchy for information flow within our components and assets, leading to our historian and, subsequently, the ERP system. When you're dealing with large companies with different business units that operate slightly differently, it's essential to follow this chain of information flow.

As exciting as it is to innovate and build something new, I've often found that my ideas are simply derivatives of established solutions. It's more practical to leverage proven methodologies, data exchange standards, or process improvements that have already demonstrated their effectiveness. In the world of engineering, it's essential to remember that most things have already been invented, which helps keep my enthusiasm in check, given that we have many talented people working within their own domains.

The second lesson I’ve learned is that more data isn't necessarily better. This ties back to the first lesson of not overengineering. There is a good chance that the data you need is already within your grasp; the key is finding it. Our assets have been engineered over multiple decades, and someone, at some point, established a data framework or process meant to last for generations. There is real value in unearthing these institutional truths, which can reveal a wealth of information to drive data-driven decisions.

It's also possible that we're still sitting on a piece of 1970s technology that hasn't been revisited in years. The industry is currently experiencing a significant wave of digitalization, making it an exciting time to be part of this transformative journey. 

READ: 5 Updates in This Week's Methane News
 
Interested in learning more? 
The ‘Developing Industry-Recommended Practices for Automating Emissions Measurement and Monitoring’ panel delves deeper into best-practice approaches to implementing, measuring, and maintaining automated emissions monitoring systems. Watch the full video to understand how you can better develop and implement robust detection, quantification, and verification processes within your operations.
 