# Jobs
In the Petro.ai Application, users can execute tasks that load and process data at regular intervals or on an ad-hoc basis. These tasks are called Jobs. The Job Manager allows users to create, schedule, execute, edit, and monitor jobs, and each job is driven by a JSON mapping document called a job file, described under Job Creation below.
# Job Manager
The Job Manager allows users to view, execute, edit, delete, and monitor jobs. Selecting a job will bring the user to the job’s execution history where job logs can be viewed. The log is useful when debugging job execution.
# Job Creation
Jobs are run by creating a JSON mapping document called a job file. When the file is provided to the job, it maps Petro.ai fields to the user's data, including units. Other options can also be passed via the job file, such as the data connection type and extra field preferences.
Jobs can be run immediately, at a certain time, or on a schedule.
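The exact job file schema depends on your Petro.ai version and data source, so the snippet below is only an illustrative sketch; the keys shown (`jobType`, `dataConnection`, `schedule`, `fieldMappings`, and the values inside them) are hypothetical and not the authoritative Petro.ai schema. It shows the kind of information a job file carries: the mapping between source columns and Petro.ai fields with their units, the data connection type, and an optional schedule.

```json
{
  "jobType": "Load",
  "dataConnection": {
    "type": "csv",
    "path": "well_production.csv"
  },
  "schedule": "0 2 * * *",
  "fieldMappings": [
    { "source": "WELL_NAME", "target": "wellName" },
    { "source": "OIL_RATE",  "target": "oilRate", "unit": "bbl/d" },
    { "source": "GAS_RATE",  "target": "gasRate", "unit": "Mscf/d" }
  ]
}
```

A file like this would describe a load job that runs nightly; the same mapping structure could be reused for an immediate, ad-hoc run by omitting the schedule. Consult the job file reference for your deployment for the exact keys and accepted values.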
# Job Types
There are multiple types of jobs that Petro.ai can run.
| Type | Description |
|---|---|
| Load | Load jobs ingest data mapped via a job file into Petro.ai. |
| Process | Processing jobs depend on data already in the PetroDatabase, and some will derive their own collections for further analysis. |
# System
System jobs are internal tasks, such as stored procedures, that run in response to various events. For example, these jobs may run denormalization functions or create batch forecasts for users. System jobs are not viewable in the Job Manager.