Optimizing Hydraulic Fracturing: Discrete Fracture Network (DFN) vs. Statistical Analytics

Determining the Optimal Fracture Method & Formula

Optimizing hydraulic fracturing based upon the geology of your field, your area, and even your specific well is the goal of every oil & gas operating company. Our focus is on the optimization methods in use today and on how they can be improved. Improving fracture optimization enables operators to rapidly improve well efficiency and production.

 

The optimal method and formula used in hydraulic fracturing will vary from field to field, area to area, and even well to well. There simply is no universal optimal method or formula for fracturing a well. Many factors influence the optimal method: permeability, porosity, brittleness, the stress field, the natural fracture network, reservoir pressure and temperature, fluid saturations, proximity to water zones, and more.

 

The two most prevalent approaches to fracture optimization in use today can be summarized by the words “modeling” and “experiential.” We believe the experiential approach is the more effective of the two, and that it can be dramatically enhanced when guided by the statistical insights of a powerful tool designed for this specific purpose.

The Fracture Modeling Approach to Fracture Optimization

Most fracture modeling programs today attempt to fit conventional fracture theory and equations to unconventional fields. Best-in-class fracture modeling programs are based upon the assumption that the fracture process results in discrete fracture networks that can be mathematically modeled. Discrete Fracture Network (DFN) modeling combines fracture pattern analysis, derived by evaluating outcrops and cores, with mathematical formulas to model or visualize the impact of fracturing methods applied roughly 7,000 feet below the surface. This approach rests on theory and on the belief that fracturing is largely symmetric. In our experience, however, and in the experience of the experts we have consulted, the fracturing process is inherently chaotic, and a chaotic process simply cannot be modeled. While modeling makes us feel we can control the process, real-world results demonstrate otherwise. These models can, in some cases, do more harm than good. It is nearly impossible to determine accurate fracture density and aperture, both of which DFN modeling requires. Because these variables are soft, engineers often use them to adjust the model to fit their personal bias, effectively goal-seeking the result. This can be detrimental to the success of a field development.
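To illustrate how soft these inputs are, here is a minimal Python sketch (all numeric ranges are hypothetical, chosen only for illustration) that propagates plausible uncertainty in fracture density and aperture through the cubic law, under which a single fracture's transmissivity scales with the cube of its aperture:

```python
import random

random.seed(42)

def simulated_conductivity():
    """Sample one equally-defensible set of DFN inputs (illustrative ranges)."""
    density = random.uniform(0.05, 0.5)    # fracture density, 1/ft (assumed)
    aperture = random.uniform(1e-4, 1e-3)  # fracture aperture, ft (assumed)
    # Cubic law: a single fracture's transmissivity scales with aperture**3
    return density * aperture ** 3

samples = [simulated_conductivity() for _ in range(10_000)]
spread = max(samples) / min(samples)
print(f"max/min predicted conductivity across plausible inputs: {spread:,.0f}x")
```

Because the hard-to-measure aperture enters cubed, predicted conductivity varies by orders of magnitude across equally defensible input choices, and that is exactly the freedom that lets a model be tuned toward a preferred answer.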

The Experiential Approach to Fracture Optimization

“The difference between theory and practice: in theory there is no difference, but in practice there is.”

 

Fracture modeling has not borne real-world fruit because, as the quote above states, in practice there is a difference between theory and practice. The fracturing process drives fractures into heterogeneous rock formations, producing chaotic fracturing that cannot be modeled. In fact, an operator’s recognition that a reservoir has widespread heterogeneous rock properties was the original rationale behind horizontal drilling. The fact that a horizontal well is necessary for a field should, by itself, be sufficient evidence that software modeling is very unlikely to provide predictive value. For this reason, the more effective approach to fracture optimization has been based on operator experience: answering questions like “What has worked in this type of geology before?” and “What has worked in this formation for other wells?” has proven more effective than modeling.

Limits to the Experiential Approach to Fracture Optimization

Coupling the experiential approach with iterative trial-and-error has been the most effective real-world method of determining the optimal fracture method and formula. However, this approach has certain limitations.

  1. Variables: A human, unassisted by technology, can only process a limited number of variables at one time. The current state-of-the-art tool is the spreadsheet, which captures data but is sub-optimal at managing the number of variables and at extracting signal from the noise in the data.

  2. Volume: The volume of data created from the fracturing process is quite large. Adding data across wells or from historical wells in the area and then deriving knowledge from this extreme volume of information is very difficult.

  3. Velocity: The rate at which data is generated in the drilling and stimulation process is quite high. Being able to assimilate this data and make cogent decisions in real-time is quite challenging.

  4. Value: Some well data is captured and maintained in a hodgepodge of Excel spreadsheets by individual engineers. The lack of a single company- or industry-wide standard methodology and format results in silos of value trapped and owned by individual engineers. The ideal would be a solution that standardizes and shares the data across a company, enabling all engineers to extract value from the entire data set. This has the secondary benefit of turning that data into a corporate asset, not one that goes home every night.
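As a sketch of what such standardization might look like, the Python below (the column names, header mappings, and sample rows are all invented for illustration) maps two engineers' inconsistent spreadsheet exports onto one shared schema:

```python
import csv
import io

# Two hypothetical per-engineer CSV exports with inconsistent headers
eng_a = "well,stage,proppant_lbs\nW-101,1,250000\nW-101,2,240000\n"
eng_b = "Well Name,Stage No,Proppant (lbs)\nW-102,1,310000\n"

# A company-wide standard schema, plus a per-source header mapping
STANDARD = ["well", "stage", "proppant_lbs"]
MAPPINGS = {
    "a": {"well": "well", "stage": "stage", "proppant_lbs": "proppant_lbs"},
    "b": {"well": "Well Name", "stage": "Stage No",
          "proppant_lbs": "Proppant (lbs)"},
}

def normalize(raw_csv, mapping):
    """Rename each source's columns into the standard schema."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [{std: row[src] for std, src in mapping.items()} for row in rows]

merged = normalize(eng_a, MAPPINGS["a"]) + normalize(eng_b, MAPPINGS["b"])
print(len(merged), "rows now live in one shared, standardized data set")
```

Once every engineer's data flows through a mapping like this, the whole company can query one consistent table instead of reverse-engineering each individual's spreadsheet conventions.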

Statistical Enhancement of the Experiential Approach of Fracture Optimization

The fracturing process is the only time engineers get to interact with the reservoir. In this interaction, the reservoir speaks to us in the language of data. Unassisted, humans cannot process all of this data and extract its trends and correlations. This forces us to cherry-pick a few high-level Key Performance Indicators (KPIs) and use them as the sole variables in determining the optimal fracture method and formula.
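As a minimal sketch of the alternative, the Python below (the variable names, distributions, and the assumption that one variable drives production are all hypothetical) screens every available stage variable for correlation with production instead of pre-selecting a favorite KPI:

```python
import random
import statistics

random.seed(7)

# Synthetic stage data: by construction, production here is driven mostly by
# proppant intensity, while two "headline" KPIs vary independently.
n = 200
data = {
    "proppant_per_ft": [random.gauss(2000, 400) for _ in range(n)],
    "avg_rate_bpm":    [random.gauss(90, 10) for _ in range(n)],
    "max_pressure":    [random.gauss(9000, 500) for _ in range(n)],
}
noise = [random.gauss(0, 50) for _ in range(n)]
production = [0.5 * p + e for p, e in zip(data["proppant_per_ft"], noise)]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ranked = sorted(data, key=lambda k: abs(pearson(data[k], production)),
                reverse=True)
print("variables ranked by |correlation| with production:", ranked)
```

With data like this, the variable that actually drives production rises to the top of the ranking regardless of which KPIs an engineer would have picked out of habit; that is the shift from cherry-picking to letting the data speak.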

 

An excellent analogy is the book/movie Moneyball. Coaches were judging baseball talent based upon a few quantitative measurements like ERA, batting average and stolen bases in combination with qualitative observations like musculature and how good the player was with the ladies. However, rigorous statistical analysis uncovered the true determinants of success, primarily On Base Percentage (OBP) and Slugging Percentage (SLG).

 

Following the success of the A’s, who pioneered this statistically driven approach, baseball teams across the league recognized that their traditional methods of evaluating talent were subjective and flawed, and they have all moved to a rigorous statistical approach. DeepData now brings this approach to the process of fracture optimization.

 

The first step is acknowledging the failures of the past. Fracture modeling simply does not apply to a process that is inherently chaotic. Experience provides some guidance, but it is limited by a lack of effective tools to address the 4 V’s of fracture data: variables, volume, velocity and value. This is the genesis for DeepData; we hope you’ll try it.

 

 




© Copyright 2017 - DeepData, Inc.  All Rights Reserved.