The oil industry has installed fiber optic sensor cables in wells to better understand why oil trapped in shale deposits is not being released by hydraulic fracturing at the expected rates, but the massive data streams these cables produce are difficult to analyze. Now, a team of researchers from Texas A&M University and the Colorado School of Mines has developed an algorithm to clean up this underground data and give a clear view of how and where these fracturing processes succeed and fail.
“Our quantitative characterization provides more information about the fracture geometries within a deposit than a simple qualitative analysis,” said Dr. Kan Wu, Associate Professor and Chevron Corporation Faculty Fellow in the Harold Vance Department of Petroleum Engineering. “We have tested our algorithm and have already used it in the field.”
The results were published in the Society of Petroleum Engineers' journal SPE Production & Operations.
Traditional data interpretation methods, while incredibly helpful to engineers, rely solely on qualitative information or on probabilities based on data patterns. In contrast, the algorithm was developed to work with quantitative data that can be measured, such as changes in temperature, pressure or rock deformation within a reservoir. It identifies the events that produced those changes and models exactly how far and how quickly the fractures propagated, which directions they took, and how large they grew.
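To illustrate the kind of quantitative reduction described here, the following is a minimal, hypothetical Python sketch. It is not the team's published algorithm; it simply shows, under simplified assumptions, how low-frequency distributed-sensing strain-rate measurements along a monitor fiber might be reduced to per-channel arrival times and an apparent fracture-front speed. All names, thresholds, and the constant-speed assumption are illustrative.

```python
import numpy as np

# Hypothetical example: rows = time samples, columns = fiber channels (positions along the well).
# strain_rate[i, j] is the low-frequency strain-rate at time t[i] and channel position x[j].
def estimate_fracture_front(strain_rate, t, x, threshold=1e-7):
    """Pick the first time each channel exceeds a strain-rate threshold,
    then fit a line to (arrival time, position) to get an apparent
    propagation speed of the fracture front along the fiber."""
    arrivals = []
    for j in range(strain_rate.shape[1]):
        above = np.where(np.abs(strain_rate[:, j]) > threshold)[0]
        if above.size:
            arrivals.append((t[above[0]], x[j]))
    if len(arrivals) < 2:
        return None  # not enough channels responded to fit a front
    t_arr, x_arr = map(np.array, zip(*arrivals))
    speed, intercept = np.polyfit(t_arr, x_arr, 1)  # slope = apparent front speed
    return {"speed_m_per_s": speed, "origin_estimate_m": intercept}

# Synthetic usage: a front moving at ~0.05 m/s past 20 channels spaced 5 m apart.
t = np.linspace(0, 3600, 720)          # one hour of data, ~5 s samples
x = np.arange(20) * 5.0                # channel positions in metres
strain_rate = np.zeros((t.size, x.size))
for j, xj in enumerate(x):
    strain_rate[t > xj / 0.05, j] = 5e-7   # each channel responds after the front passes
print(estimate_fracture_front(strain_rate, t, x))
```

Real field data would of course require the denoising, calibration, and physics-based inversion the researchers describe; the point of the sketch is only the step from raw measured changes to fracture geometry numbers.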
Data acquisition with low-frequency distributed acoustic sensing (DAS) has only been around for about five years, so the fiber-optic information coming out of boreholes has not yet been fully deciphered. In addition, each borehole has its own characteristics because of the enormous variability of underground structures. This complexity is why Wu and her colleagues, Dr. George Moridis, Professor and Robert L. Whiting Chair, and Dr. Ge Jin, Assistant Professor of Geophysics at Mines, spent a great deal of time meticulously developing their algorithm.
First, the researchers tested the algorithm's ability to clean the data and interpret simple signals from known fracture processes. That way, they could backtrack, or invert, the information to find the point where a fracture began to grow. As the algorithm was expanded to handle more complex information, they improved its ability to look ahead and predict how new and complex fractures arise and grow.
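The forward-model-then-invert validation described above can be illustrated with another small, hypothetical Python sketch: simulate arrival times from a known fracture origin and growth rate, then recover those two parameters from the synthetic data by brute-force search. The functions, parameter ranges, and constant-growth model below are assumptions for illustration only, not the published method.

```python
import numpy as np

def forward_arrivals(x, origin, speed):
    """Forward model: arrival time at each channel for a fracture tip
    starting at `origin` and growing at constant `speed`."""
    return np.abs(x - origin) / speed

def invert(x, observed, origins, speeds):
    """Inverse step: grid search for the (origin, speed) pair whose
    predicted arrivals best match the observed ones (least squares)."""
    best, best_err = None, np.inf
    for o in origins:
        for s in speeds:
            err = np.sum((forward_arrivals(x, o, s) - observed) ** 2)
            if err < best_err:
                best, best_err = (o, s), err
    return best

x = np.arange(20) * 5.0                       # channel positions in metres
truth = (42.0, 0.04)                          # "known" origin (m) and speed (m/s)
observed = forward_arrivals(x, *truth) + np.random.normal(0, 2.0, x.size)
print(invert(x, observed, np.linspace(0, 95, 96), np.linspace(0.01, 0.1, 91)))
```

The same pattern, on a vastly more sophisticated physical model, is what lets the algorithm backtrack from measured signals to where and how a fracture started.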
Wu is an expert in rock mechanics, Jin is an expert in geophysics and DAS technology, and Moridis is an expert in advanced numerical methods and high-performance computing of coupled processes. Because of the project team's multidisciplinary background, the algorithm has remarkable flexibility to grow and adapt to the type of data it receives. Yongzan Liu, who worked on the project as a PhD student for over two years, is now a postdoctoral researcher at Lawrence Berkeley National Laboratory, applying similar methods and modeling to fiber-optic data from hydrate-bearing sediments to monitor natural gas production.
Liu, Wu, Moridis, and Jin are the first to develop this type of algorithm and publish results. The ultimate goal of their research is to automate the algorithm so that feedback on fracturing events arrives in near real time at a drilling site. This would allow engineers to quickly tailor the fracture design to the specific composition of each borehole.
“The industry needs this type of tool to understand fracture geometry and monitor fracture propagation,” said Wu. “The more efficient it gets, the better it will help optimize hydraulic fracture and completion designs and maximize well production.”
– This press release was originally posted on the Texas A&M University website