In Petro.ai 5, business workflows such as building decline models for a group of wells can be automated using the API. In this example we will demonstrate how to:
- Create a Petron
- Create a Group
- Add Wells to a group using wellIds
- Create Forecast Scenario
- Link Group to a Forecast Scenario
- Build Forecast Scenario
# Getting Started
This guide assumes the reader has gone through the Getting Started and Authentication guides and already has an environment set up for development with the Petro.ai API. Place the following import at the beginning of your script so the examples in this guide run:

```python
import requests
```
First, let's set up a requests `Session` object, which lets us persist certain parameters (e.g. request headers, authentication) across multiple requests:
```python
pai_appid = 'your-appid-here'
pai_apikey = 'your-apikey-here'
pai_base_url = 'https://yourcompany.petro.ai'
# DATA URL is used for posting and patching data
pai_data_url = pai_base_url + '/api/data'
# QUERY URL is used to query the data
pai_data_query_url = pai_base_url + '/api/data/query'
# TASKS URL is used to perform tasks
pai_tasks_url = pai_base_url + "/api/tasks"

pai_session = requests.Session()
pai_session.headers.update({"Content-type": "application/json", "Accept": "application/json"})
pai_session.auth = (pai_appid, pai_apikey)
```
We will use this `pai_session` object for the rest of this guide to make requests and interact with the Petro.ai API.
# Create a Petron
Following the examples discussed in Petro.ai Data API, the snippet below shows how to create a new petron by making a POST request to `/api/data`:

```python
petron_insert_body = {
    "type": "Petron",
    "data": [
        {
            "name": "Tutorial Pad",
        }
    ],
}
req_petron_post = pai_session.post(pai_data_url, json=petron_insert_body)
```
# Create a Group
Groups in Petro.ai are defined within the context of a Petron, so when creating a group we need to pass the `pid` of the petron in which the group lives. We can get the `pid` of a petron with a given name by querying for it, as shown below:
```python
# prepare query for petron with a given name
petron_query_body = {
    "type": "Petron",
    "query": {
        "name": "Tutorial Pad"
    }
}
# perform query for petron with a given name
req_petron_query = pai_session.post(pai_data_query_url, json=petron_query_body)
# extract the query response
resp_petron_query = req_petron_query.json()
# extract the petron's id (pid) from the response body
# beware of empty responses if no petron is found with the queried name
pid = resp_petron_query["data"][0]["id"]
```
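The empty-response caveat comes up for every query-by-name in this guide; a small helper (hypothetical, not part of any Petro.ai client library) can centralize that check:

```python
def extract_first_id(resp_json):
    """Return the id of the first matching record, or None when the
    query matched nothing (avoids an IndexError on empty responses)."""
    data = resp_json.get("data", [])
    return data[0]["id"] if data else None

# Usage against a query response:
#     pid = extract_first_id(req_petron_query.json())
#     if pid is None:
#         raise ValueError("no petron found with that name")
```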
The snippet below shows how to create a new group inside a given petron:

```python
group_insert_body = {
    "type": "Group",
    "data": [
        {
            "name": "Tarrant Wells Group",
            "pid": pid
        }
    ],
}
req_group_post = pai_session.post(pai_data_url, json=group_insert_body)
```
# Add Wells to a group using wellIds
In this section we will add a list of wells to the group we created earlier, using their UWIs (unique well identifiers). We assume these wells and their relevant production data already exist in your Petro.ai instance.

First we will query for the `groupId` of the group with a given name:
```python
# prepare query for group with a given name
group_query_body = {
    "type": "Group",
    "query": {
        "name": "Tarrant Wells Group"
    }
}
# perform query for group with a given name
req_group_query = pai_session.post(pai_data_query_url, json=group_query_body)
# extract the query response
resp_group_query = req_group_query.json()
# extract the group's id (groupId) from the response body
# beware of empty responses if no group is found with the queried name
groupId = resp_group_query["data"][0]["id"]
```
Next we will use the Tasks endpoint to execute a task that takes a `groupId` and a list of UWIs as options and links the given wells to the given group:
```python
well_ids = ["4250120130", "4250120131", "4250120132"]
# prepare task options to add given wells to a given group
task_add_wells_to_group_options = {
    "groupId": groupId,
    "wellids": well_ids
}
task_add_wells_to_group_body = {
    "pid": pid,
    "taskType": "AddWellsToGroup",
    "options": task_add_wells_to_group_options,
    "asJolt": False
}
req_wellid_to_group_task = pai_session.post(pai_tasks_url, json=task_add_wells_to_group_body)
```
Passing `asJolt` as `False` in the task body makes the task execution blocking: your code will wait for the task to finish before moving on to the next line.
The `wellids` passed in the task options can also be read from a text file. Suppose the text file is formatted with one well id per line, as shown below:
```
4250120130
4250120131
4250120132
...
```
The following snippet reads the well ids from such a text file, named `wellids.txt`:
```python
well_ids = []
with open("wellids.txt", mode="rt") as f:
    for x in f:
        _wellid = x.strip()
        well_ids.append(_wellid)
```
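The same read can be written as a comprehension that also skips blank lines, a defensive tweak in case the file ends with an empty line:

```python
def parse_well_ids(lines):
    """Strip whitespace from each line and drop any blank lines."""
    return [line.strip() for line in lines if line.strip()]

# Works on any iterable of lines, e.g. an open file handle:
#     with open("wellids.txt", mode="rt") as f:
#         well_ids = parse_well_ids(f)
sample = ["4250120130\n", "\n", "4250120131\n"]
well_ids = parse_well_ids(sample)
```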
# Create Forecast Scenario
Now let's create a Forecast Scenario to auto-forecast the group of wells we created earlier. Creating a forecast scenario again requires the context of a Petron, so we will use the `pid` we found earlier. When auto-forecasting wells, users often experiment with parameter ranges and other settings that affect the quality of the auto-generated decline-model fits, so when creating a forecast scenario we build a set of `batchForecastSettings` as shown below:
```python
batch_forecast_settings = {
    "forecast": {
        "years": 20,
        "minProductionCount": 3,
        "volumeAbandonmentDate": "01-01-2030",
        "inactiveCutoffDate": "01-01-2019",
        "limit": {
            "oil": [], "gas": [], "water": [],
            "oilTTD": [], "gasTTD": [], "waterTTD": []
        }
    },
    "normalize": {
        "peakFluid": "auto",
        "startMode": "peakRate",
        "gorThreshold": 10,
        "targetValue": 1,
        "typeCurveWeightColumn": "None",
        "startDate": "2020-01-01T06:00:00.000Z",
        "eol": {"yearsOn": 20, "yearsEnd": 3, "enabled": True}
    },
    "oil": {
        "method": "arps",
        "arps": {"qi": [1.0, 1.1], "de": [0.1, 0.9], "b": [0.1, 0.9], "dmin": [0.06, 0.06]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    },
    "gas": {
        "method": "arps",
        "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    },
    "water": {
        "method": "arps",
        "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    },
    "gor": {
        "method": "none",
        "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    },
    "ogr": {
        "method": "none",
        "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    },
    "wor": {
        "method": "none",
        "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
        "piecewise": {
            "interpolationScale": "log", "numSegments": 2,
            "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
            "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
        },
        "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
        "zeroThreshold": 0,
        "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3}
    }
}
```
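The six per-phase blocks (oil, gas, water, gor, ogr, wor) share the same `piecewise`, `powerLawExp`, `zeroThreshold`, and `modelFlagOptions` values and differ only in `method` and, for oil, the `arps` ranges. As an illustration (this is not an official Petro.ai helper), the phase blocks can be generated from a single template, which makes the settings easier to maintain:

```python
import copy

# Shared template; values mirror the gas/water/gor/ogr/wor blocks used above.
PHASE_TEMPLATE = {
    "method": "none",
    "arps": {"qi": [1.1, 1.3], "de": [0.1, 0.9], "b": [0.1, 1.5], "dmin": [0.05, 0.08]},
    "piecewise": {
        "interpolationScale": "log", "numSegments": 2,
        "yi": [1.2, 1.2], "t": [200, 2000], "y": [-0.2, -0.65],
        "outlierCutoffSD": 20, "initialPeriodDays": 60, "dmin": -0.06
    },
    "powerLawExp": {"qi": [1.1, 1.8], "alpha": [0.001, 0.1], "beta": [0.1, 0.8]},
    "zeroThreshold": 0,
    "modelFlagOptions": {"excellentThreshold": 0.9, "goodThreshold": 0.7, "fairThreshold": 0.3},
}

def phase_settings(method, **overrides):
    """Deep-copy the template so each phase stays independent, then apply overrides."""
    settings = copy.deepcopy(PHASE_TEMPLATE)
    settings["method"] = method
    settings.update(overrides)
    return settings

phase_blocks = {
    "oil": phase_settings("arps", arps={"qi": [1.0, 1.1], "de": [0.1, 0.9],
                                        "b": [0.1, 0.9], "dmin": [0.06, 0.06]}),
    "gas": phase_settings("arps"),
    "water": phase_settings("arps"),
    "gor": phase_settings("none"),
    "ogr": phase_settings("none"),
    "wor": phase_settings("none"),
}
```

These blocks can then be merged with the `forecast` and `normalize` sections to form `batch_forecast_settings`.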
The following snippet creates the forecast scenario:

```python
fs_insert_body = {
    "type": "ForecastScenario",
    "data": [
        {
            "name": "Tarrant Forecasts",
            "pid": pid,
            "options": {"batchParams": batch_forecast_settings},
        }
    ]
}
req_fs_post = pai_session.post(pai_data_url, json=fs_insert_body)
```
# Link Group to a Forecast Scenario
Next we will add a link between the group and the forecast scenario created earlier. Linking them lets us build the forecast scenario with all the wells present in the group. First we need to grab the id of the forecast scenario:
```python
# prepare query for forecastScenario with a given name
fs_query_body = {
    "type": "ForecastScenario",
    "query": {
        "name": "Tarrant Forecasts"
    }
}
# perform query for forecastScenario with a given name
req_fs_query = pai_session.post(pai_data_query_url, json=fs_query_body)
# extract the query response
resp_fs_query = req_fs_query.json()
# extract the forecastScenario's id (forecastScenarioId) from the response body
# beware of empty responses if no forecastScenario is found with the queried name
forecastScenarioId = resp_fs_query["data"][0]["id"]
```
The following snippet shows how to create the link:

```python
fs_group_link_insert_body = {
    "type": "Link",
    "data": [
        {
            "parentType": "ForecastScenario",
            "parentId": forecastScenarioId,
            "childType": "Group",
            "childId": groupId,
            "linkType": "input",
        }
    ]
}
req_fs_group_link_post = pai_session.post(pai_data_url, json=fs_group_link_insert_body)
```
Similarly, multiple groups can be attached as children of a single parent forecast scenario, either with a separate request for each group or with one request for all groups by adding elements to the `data` array in the body shown above.
One can even link another forecast scenario as a child; this brings the forecasts from the child forecast scenario into the parent, allowing them to be compared on the single-well forecast page.
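The one-request-for-all-groups approach can be sketched with a comprehension over the group ids (the ids below are placeholders for illustration):

```python
group_ids = ["groupId-1", "groupId-2", "groupId-3"]  # placeholder ids
forecastScenarioId = "forecastScenarioId-1"          # placeholder id

multi_link_insert_body = {
    "type": "Link",
    "data": [
        {
            "parentType": "ForecastScenario",
            "parentId": forecastScenarioId,
            "childType": "Group",
            "childId": gid,
            "linkType": "input",
        }
        for gid in group_ids
    ],
}
# The body is then posted exactly like the single-link case:
#     pai_session.post(pai_data_url, json=multi_link_insert_body)
```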
# Build Forecast Scenario
So far we have created a Petron, a Group, and a Forecast Scenario; we have added wells to the group and linked the group to the forecast scenario. Next we need to build the forecast scenario in order to fit individual decline models to all wells present in the group. The following snippet shows how to build a forecast scenario:
```python
# prepare task options to build the forecast scenario
task_buildfs_options = {
    "modelId": forecastScenarioId,
    "clean": True
}
task_buildfs_body = {
    "pid": pid,
    "taskType": "BuildForecastScenario",
    "options": task_buildfs_options,
    "asJolt": False
}
req_buildfs_task = pai_session.post(pai_tasks_url, json=task_buildfs_body)
```
Passing `clean` as `True` in the build-forecast-scenario task options clears all existing decline models for the wells found in the linked group.
As mentioned earlier, passing `asJolt` as `False` in the task body makes the task execution blocking, so your code will wait for the task to finish before moving on to the next line.
Having built the forecast scenario, you can now see the results at https://yourcompany.petro.ai/{pid}/apps/forecast-scenario/{forecastScenarioId}, where `pid` and `forecastScenarioId` are the ones we just created.
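The results URL can also be assembled in the script itself; with placeholder ids it looks like:

```python
pai_base_url = "https://yourcompany.petro.ai"
pid = "pid-1"                # placeholder; use the pid created earlier
forecastScenarioId = "fs-1"  # placeholder; use the id created earlier

results_url = f"{pai_base_url}/{pid}/apps/forecast-scenario/{forecastScenarioId}"
```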
This example can be extended to automate creating and managing multiple groups and forecast scenarios, organizing the work for the geos, engineers, managers, techs, and other members of the asset team.

Other business workflows, such as making a Type Curve or building an Economics Project or a Frac Hit Scenario, can also be automated using the Petro.ai API.