Every company is talking about AI and ML technologies nowadays, and every IT company, whether a start-up or a supermajor, has specialized offerings around them. However, the extent and complexity these new-age technologies address varies with each company's outlook. Over the years, SAS's interactions with industry experts have shown that the inherent expectation from oil and gas business applications is an error-free application, which requires a good amount of feature engineering considering the multiple variable factors that affect the expected output.
This requires a large amount of interaction between the business domain and the technology domain. The questions remain: (a) do we have the right team from each domain, well versed with the boundaries within which the solutions need to be developed, and (b) is the variety and veracity of data required for error-free solutions available, or should we first focus on data availability?
We always say the oil and gas industry is deeply interconnected, right from the exploration, drilling, and production of oil and gas to transportation, refining, and distribution. Therefore, large data lake projects can help with data exchange, collection, and management across the value chain. The question to ponder is: has this been achieved yet, so that we can forecast, predict, and optimize the business variables required for analytics at each stage? We have been discussing this "cradle to grave" interconnection for quite some time, but the question remains to what granularity we have achieved it, or whether it is advisable to relax the boundaries and break the issues into smaller categories such as compliance, emissions, supply-demand fluctuations, price volatility, material availability, energy efficiency, production efficiency, and supply chain.
We all realize a wrong decision is very costly, while a correct decision is highly valuable. Therefore, it is imperative that emerging technologies become RESPONSIBLE by collaborating to improve operations: reducing costs and increasing the safety, efficiency, and speed of processes. Over the last two years, digitization within oil and gas has improved, bringing all related data onto a single platform that can detect anomalies in movements, equipment vibrations, leakages, corrosion, and more. But has confidence in data availability and analysis improved enough to convert field and market data into actionable insights within oil and gas companies?
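As one illustration of the anomaly detection mentioned above, a minimal sketch of flagging outliers in equipment vibration readings is shown below. The data values, function name, and z-score threshold are all hypothetical assumptions for illustration; real platforms use far more sophisticated models.

```python
import statistics

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations away from the mean (a simple z-score test)."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # all readings identical; nothing to flag
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Hypothetical vibration sensor values; the spike at index 5 is flagged.
vibration = [0.52, 0.49, 0.51, 0.50, 0.53, 2.40, 0.48, 0.50]
print(flag_anomalies(vibration))  # → [5]
```

Even this toy example hints at the article's point: the value of the flag depends entirely on the quality and bias of the data fed into it.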
Is it that we are overwhelmed with data, and biases in that data play an important role in arriving at complex business decisions? What is the next level of maturity we are talking about? We all see manufacturing processes challenging themselves and achieving the next levels of maturity, whether in global parameters such as environment and sustainability or at tactical levels in terms of performance efficiencies; but have we matured our solutions from the low-risk category to the trustworthy category yet? We all talk about root cause analysis, backcasting, and other analytical terminology, but very few companies have adopted these processes in the development of analytical solutions and learn from each project to realize benefits in their next engagements.
Why not develop a process that documents failures and the key failure parameters before the start of each engagement, so that we can confidently state: "With AI and ML, the action is not learned; it is pre-determined."
The key failure parameters would serve as a catalogue of critical checkpoints as well as a head start for similar projects or other first-of-a-kind (FOAK) efforts, helping them stay on track and avoid the failures and mistakes made in the referenced projects from which the key failure parameters were drawn.
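The article does not prescribe any particular implementation of such a catalogue, but one minimal sketch of how key failure parameters could be recorded and queried per project checkpoint is shown below. All class names, field names, and sample entries are hypothetical assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FailureParameter:
    name: str            # e.g. a recurring data-quality gap
    source_project: str  # referenced engagement where it was observed
    checkpoint: str      # project stage at which it must be reviewed
    mitigation: str      # agreed pre-emptive action

@dataclass
class FailureCatalogue:
    entries: list = field(default_factory=list)

    def add(self, param: FailureParameter):
        self.entries.append(param)

    def checklist(self, checkpoint: str):
        """Return the failure parameters to review at a given checkpoint."""
        return [p for p in self.entries if p.checkpoint == checkpoint]

# Hypothetical usage: two lessons from past projects, queried at kick-off.
catalogue = FailureCatalogue()
catalogue.add(FailureParameter("sensor data gaps", "Project A",
                               "kick-off", "audit historian coverage"))
catalogue.add(FailureParameter("model drift unmonitored", "Project B",
                               "deployment", "schedule retraining reviews"))
print([p.name for p in catalogue.checklist("kick-off")])  # → ['sensor data gaps']
```

The design choice here is deliberate: each entry carries its source project, so a new engagement inherits a traceable checklist rather than an unexplained rule.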