The Serial Production App employs a robust and consistent method for uploading data, whether you’re using the API or the frontend interface. This article will break down the upload process into steps and explain the importance of its transactional nature.

The Upload Sequence

Step 1: Create a Process Entry

The first step in the upload process is creating a process entry. This is done using the createProcessEntry function, which sends a POST request to the /processes/entries endpoint. The function takes a process ID and a component instance identifier, plus an optional station ID and operator ID.
createProcessEntry({
  process_id: string;
  component_instance_identifier: string;
  station_id?: string;
  operator_id?: string;
})
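As a sketch, a call from a TypeScript client might look like the following. The field values are made up, and the assumption that the response includes the new entry's id (needed by the later steps) is illustrative, not a documented contract.

// Hypothetical example call; IDs and the response shape are illustrative only.
const entry = await createProcessEntry({
  process_id: 'proc_123',
  component_instance_identifier: 'SN-0001',
  station_id: 'station_7', // optional
  operator_id: 'op_42',    // optional
});
// Assumed: the response carries the new process entry's id for Steps 2 and 3.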

Step 2: Add Data to the Process Entry

Once a process entry is created, the next step is to add data to it. This is accomplished using the addProcessEntryData function, which sends a PUT request to the /processes/entries/:process_entry_id endpoint. Different types of data can be added:
  • Numerical data
  • Boolean data
  • Text data
  • File data
  • Image data
  • Link data
For example, adding numerical data might look like this:
addProcessEntryData({
  process_entry_id: string;
  type: 'NUMERICAL';
  dataset_id: string;
  value: number;
})
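Building on the hypothetical entry from the Step 1 sketch, adding a couple of values to the same entry could look like this. The dataset IDs and values are invented, and the boolean payload assumes the same value field as the numerical case, which may not match the real schema.

// Hypothetical calls; dataset IDs and values are illustrative only.
await addProcessEntryData({
  process_entry_id: entry.id,
  type: 'NUMERICAL',
  dataset_id: 'ds_torque',
  value: 12.4,
});

// Assumes boolean data reuses the same value field; the real payload may differ.
await addProcessEntryData({
  process_entry_id: entry.id,
  type: 'BOOLEAN',
  dataset_id: 'ds_leak_check',
  value: true,
});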

Step 3: Mark the Data as Submitted

The final step is to mark the process entry as submitted. This is done using the updateProcessEntry function, which sends a PATCH request to the /processes/entries/:id endpoint. This function can update the completion status, cycle time, and pass status of the process entry. When is_complete is set to true, the upload_error flag in the database is set to false. The mismatch between the API field name and the database column name is not ideal; just remember that is_complete maps to upload_error with the boolean value inverted.
updateProcessEntry({
  id: string;
  is_complete?: boolean;
  cycle_time?: number;
  is_pass?: boolean;
})
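Continuing the same hypothetical sequence, the closing call might look like this; the cycle time and pass values are illustrative only.

// Marking the entry complete; cycle_time and is_pass are optional.
await updateProcessEntry({
  id: entry.id,
  is_complete: true, // flips upload_error to false in the database
  cycle_time: 42.5,  // illustrative value
  is_pass: true,
});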

The Importance of Transactional Sequence

This three-step sequence is crucial for maintaining data integrity and ensuring that all necessary information is captured before finalizing the submission. The transactional nature of this process is reinforced by the upload_error flag in the database.

The Role of the upload_error Flag

The upload_error flag, which defaults to true in the process_entries table, plays a vital role in this transactional process:
  1. When a process entry is initially created, the upload_error flag is set to true.
  2. As data is added to the process entry, the flag remains true.
  3. Only when the entire sequence is completed successfully (creation, data addition, and submission) is the upload_error flag set to false.
This mechanism ensures that partially completed or interrupted uploads are not treated as valid entries. It allows the system to identify and potentially retry or clean up incomplete submissions.
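To make the flag's lifecycle concrete, here is a minimal TypeScript sketch of the inverted mapping and of how a cleanup or retry job could select incomplete entries. The ProcessEntryRow shape and the findIncompleteEntries helper are hypothetical and not part of the real schema or API.

// Hypothetical row shape; only the fields relevant to the flag are shown.
interface ProcessEntryRow {
  id: string;
  upload_error: boolean; // defaults to true on creation
}

// is_complete in the API maps to upload_error in the database, inverted.
function uploadErrorFromIsComplete(isComplete: boolean): boolean {
  return !isComplete;
}

// A cleanup or retry job would look for rows where the flag is still true.
function findIncompleteEntries(rows: ProcessEntryRow[]): ProcessEntryRow[] {
  return rows.filter((row) => row.upload_error);
}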

Consistency Across API and Frontend

One of the strengths of this upload process is its consistency. Whether you’re using the Python API or the frontend interface, the same sequence of endpoints is used:
  1. POST to /processes/entries for creation
  2. PUT to /processes/entries/:process_entry_id for adding data
  3. PATCH to /processes/entries/:id for marking as submitted
This consistency ensures that the data integrity rules are applied uniformly, regardless of the upload method. It also simplifies maintenance and debugging, as the core logic remains the same across different interfaces.
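For illustration, the same three-step sequence expressed as raw HTTP calls from a TypeScript client might look like the sketch below. The base URL, headers, and payload fields are assumptions for the sake of the example, not the documented API contract.

// Hypothetical end-to-end upload using fetch; base URL and payloads are placeholders.
const BASE = 'https://example.com/api';

// 1. POST /processes/entries — create the entry (upload_error starts as true).
const created = await fetch(`${BASE}/processes/entries`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ process_id: 'proc_123', component_instance_identifier: 'SN-0001' }),
}).then((r) => r.json());

// 2. PUT /processes/entries/:process_entry_id — attach data to the entry.
await fetch(`${BASE}/processes/entries/${created.id}`, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ type: 'NUMERICAL', dataset_id: 'ds_torque', value: 12.4 }),
});

// 3. PATCH /processes/entries/:id — mark as submitted (upload_error flips to false).
await fetch(`${BASE}/processes/entries/${created.id}`, {
  method: 'PATCH',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ is_complete: true }),
});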

Conclusion

The Serial Production App’s upload process is designed with data integrity and consistency in mind. By following a clear, three-step sequence and utilizing the upload_error flag, it ensures that all uploads, whether through the API or frontend, are complete and valid before being finalized in the system. This robust approach minimizes errors and provides a reliable method for capturing production data.