Assets everywhere. People everywhere. Logistics everywhere. The petroleum industry has a lot of moving parts, and pretty much every aspect of it is in constant flux. Like other industries, its infrastructure generates data of all kinds—sensor data from upstream, midstream, and downstream operations, geological and geophysical data, drilling and completions data, geolocation, text files, video and more. Hortonworks provides IoT and predictive big data analytics for oil and gas, delivering the predictive analysis and data insights needed to optimize performance and keep this industry humming.
Large, complex datasets and rigid data models limit the pace of innovation for exploration and production, because they force petrophysicists and geoscientists to work with siloed data that requires a manual quality control (QC) process. LAS log analytics with HDP big data analytics for oil and gas allows scientists to ingest and query their disparate LAS data for use in predictive models. They can do this while leveraging existing statistical tools such as SAS or R to build new models and then rapidly iterate them over billions of measurements. Combining LAS data with production, lease, and treatment data can increase production and margins. Dynamic well logs normalize and merge hundreds or thousands of LAS files, providing a single view of well log curves, presented as new LAS files or images. With HDP, those consolidated logs also include much of the sensor data that used to be discarded as “out of normal range” because of anomalous readings from power spikes, calibration errors, and other exceptions. With HDP, an automated QC process can ingest all the data (good and bad), then scrub it to eliminate the anomalous readings and present a clear, single view of the data.
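The automated QC step described above can be sketched in a few lines. This is a minimal illustration, not Hortonworks' implementation: the column names, depth values, and the plausible-range threshold for the gamma-ray curve are all hypothetical.

```python
import pandas as pd

# Hypothetical well-log measurements merged from several LAS files.
# The 9500.0 reading simulates a power-spike artifact in the gamma-ray curve.
readings = pd.DataFrame({
    "depth_ft":  [5000, 5001, 5002, 5003, 5004],
    "gamma_api": [85.0, 88.0, 9500.0, 90.0, 87.0],
})

# Simple automated QC: keep only readings inside a plausible physical range,
# producing a clean, single view of the curve alongside the raw data.
valid = readings["gamma_api"].between(0, 400)
clean = readings[valid]

print(len(readings), "raw rows ->", len(clean), "rows after QC")
```

In practice the "plausible range" would itself be derived statistically (for example, from the distribution of readings across offset wells) rather than hard-coded, but the ingest-then-scrub pattern is the same.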
After identifying the ideal operating parameters (e.g., pump rates or fluid temperatures) that produce oil and gas at the highest margins, that information can go into a set-point playbook. Maintaining the best set points for a well in real time is a job for Apache Storm's fault-tolerant, real-time oil and gas predictive analytics and alerts. Storm running in Hadoop can monitor variables like pump pressures, RPMs, flow rates, and temperatures, and then take corrective action if any of these set points deviate from predetermined ranges. This data-rich framework helps the well operator save money and adjust operations as conditions change.
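The core check a Storm bolt would perform on each incoming reading is simple range comparison against the playbook. The sketch below shows that logic in isolation; the metric names and ranges are invented for illustration, and a real deployment would run this inside a Storm topology rather than as standalone code.

```python
from typing import Optional

# Hypothetical set-point ranges from the playbook (low, high); values are illustrative.
SET_POINTS = {
    "pump_pressure_psi": (1800.0, 2200.0),
    "flow_rate_bpm": (8.0, 12.0),
}

def check_reading(metric: str, value: float) -> Optional[str]:
    """Return an alert message if a reading deviates from its set-point range."""
    low, high = SET_POINTS[metric]
    if not (low <= value <= high):
        # In a Storm topology, this would emit to an alerting/corrective-action stream.
        return f"ALERT: {metric}={value} outside [{low}, {high}]"
    return None  # within range; no corrective action needed

print(check_reading("pump_pressure_psi", 2500.0))
print(check_reading("flow_rate_bpm", 10.0))
```

Each reading is evaluated independently, which is what makes the check easy to parallelize across Storm workers as sensor volume grows.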
Oil and gas companies bid on multi-year leases for the rights to explore and drill on government and private land. The price paid for a lease is a cost incurred now for the chance of reaching hydrocarbon flows that may or may not exist. Lessees can beat their competitors by reducing uncertainty about future returns and predicting well yields more accurately. Apache Hadoop can deliver that competitive edge through efficient storage of image files, sensor data, and seismic readings. This adds context to the wells up for bid that third-party surveys alone cannot provide. The company with proprietary information, including predictive analytics, is the one that can walk away from a lease it had planned to pursue, or spot a "diamond in the rough" and lease it at a discount.
Traditionally, operators gathered data on the status of pumps and wells through physical inspections (often in remote locations). This meant that inspection data was sparse and difficult to access, particularly considering the high value of the equipment in question and the potential health and safety impacts of accidents. Now, oil and gas IoT sensor data can stream into Hadoop from pumps, wells and other equipment much more frequently—and at lower cost—than collecting the same data manually. This helps guide skilled workers to do what sensors cannot: repair or replace machines. The machine data can be enriched with other data streams on weather, seismic activity or social media sentiment, to paint a more complete picture of what’s happening in the field. Algorithms then parse that large, multifaceted data set in Hadoop to discover subtle patterns and compare expected with actual outcomes. Did a piece of equipment fail sooner than expected, and if so, what similar gear might be at risk of doing the same? Data-driven, preventative upkeep keeps equipment running with less risk of accident and lower maintenance costs.
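The expected-versus-actual comparison in the paragraph above can be illustrated with a toy example. Everything here is hypothetical: the pump model names, lifetimes, and the 85% flagging threshold are stand-ins for whatever a real maintenance model would learn from field data.

```python
from statistics import mean

# Hypothetical failure records: (pump_model, expected_life_hours, actual_life_hours).
failures = [
    ("P-100", 8000, 7900),
    ("P-100", 8000, 4100),  # failed far sooner than expected
    ("P-200", 6000, 6100),
]

# Flag models whose average actual life falls well below expectation;
# similar gear of those models is then prioritized for inspection.
at_risk = set()
for model in {m for m, _, _ in failures}:
    expected = mean(e for m, e, _ in failures if m == model)
    actual = mean(a for m, _, a in failures if m == model)
    if actual < 0.85 * expected:
        at_risk.add(model)

print(sorted(at_risk))  # models flagged for preventative inspection
```

At field scale this comparison runs over enriched data (weather, seismic activity, and so on) and uses learned models rather than a fixed threshold, but the question answered is the same: which gear is failing sooner than expected, and what similar gear shares that risk?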
Oil companies need to manage the decline in production from their existing wells, since new discoveries are harder and harder to come by. Decline Curve Analysis (DCA) uses past production from a well to estimate future output. However, historical data usually shows relatively steady production rates, whereas a well's decline towards the end of its life follows a non-linear pattern—it usually declines more quickly as it depletes. When it comes to a well near the end of its life, past is not prologue. Production parameter optimization is intelligent management of the parameters that maximize a well's useful life, such as pressures, flow rates, and thermal characteristics of injected fluid mixtures. Machine learning algorithms can analyze massive volumes of sensor data from multiple wells to determine the best combination of these controllable parameters. HDP's powerful capabilities for data discovery and subsequent big data analytics can help the well's owner or lessee make the most of that resource.
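A standard way to capture the non-linear decline described above is the Arps hyperbolic decline model, a classic DCA formula: q(t) = qi / (1 + b · Di · t)^(1/b). The sketch below implements it directly; the parameter values are illustrative, not taken from any real well.

```python
# Arps hyperbolic decline model, a standard formula in Decline Curve Analysis.
def arps_rate(qi: float, di: float, b: float, t: float) -> float:
    """Production rate at time t (years), given initial rate qi,
    initial nominal decline di (1/year), and hyperbolic exponent b."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Illustrative parameters: 1000 bbl/day initial rate, 50%/yr initial decline, b = 0.8.
qi, di, b = 1000.0, 0.5, 0.8
for t in (0, 1, 5, 10):
    print(f"year {t}: {arps_rate(qi, di, b, t):.1f} bbl/day")
```

Fitting qi, di, and b to observed production history gives the forecast; the machine-learning approach the paragraph describes goes further by relating those decline parameters to controllable inputs (pressures, flow rates, fluid properties) across many wells.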