APIS Modules

APIS modules define what the system can understand, what it can produce, and what it can execute.

In practice, modules are the building blocks used to describe project automation. They tell APIS which artifacts exist, which jobs can transform them, and which runtime components are available to execute the flow.

Why modules exist

A real project is rarely one flat repository with one output. It is usually a graph of repositories, artifacts, and delivery steps.

APIS uses modules to model that graph in a structured way:

  • Item modules describe artifacts such as archives, documentation, images, or 3D models.
  • Job modules describe transformations between artifacts.
  • Environment modules describe where execution happens.
  • Executor modules describe how work is executed.

This design lets APIS generate dependency-aware flows instead of relying on hardcoded scripts for each project.

How modules fit into flow generation

The codebase resolves modules into two main flow elements:

  • ApisItemData represents an artifact in the dependency graph.
  • ApisJobData represents a job that consumes one set of artifacts and produces another.

When APIS loads modules, it creates a graph of inputs and outputs. That graph can then be used to generate project flows, visualize dependencies, and trigger automation in local runs or CI/CD environments.
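The names ApisItemData and ApisJobData come from the codebase, but their exact fields are not shown here; the simplified stand-ins below are assumptions. This sketch only illustrates how a set of job modules can be resolved into a dependency graph of artifact inputs and outputs:

```python
from dataclasses import dataclass

# Simplified stand-ins for the real APIS flow elements (field names
# are assumptions for illustration, not the actual API).
@dataclass(frozen=True)
class ApisItemData:
    name: str  # an artifact type in the dependency graph

@dataclass(frozen=True)
class ApisJobData:
    name: str
    inputs: tuple[str, ...]   # artifact types this job consumes
    outputs: tuple[str, ...]  # artifact types this job produces

def build_graph(jobs: list[ApisJobData]) -> dict[str, list[str]]:
    """Map each artifact type to the names of jobs that consume it."""
    consumers: dict[str, list[str]] = {}
    for job in jobs:
        for item in job.inputs:
            consumers.setdefault(item, []).append(job.name)
    return consumers

jobs = [
    ApisJobData("build", inputs=("source",), outputs=("executable",)),
    ApisJobData("package", inputs=("executable",), outputs=("archive",)),
]
print(build_graph(jobs))  # {'source': ['build'], 'executable': ['package']}
```

From a graph like this, APIS can derive execution order, visualize dependencies, or emit CI/CD pipeline definitions.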

Module types

Item modules

Item modules define named artifact types. The current codebase already includes examples such as:

  • archives
  • SCP publication outputs
  • HTTP post outputs
  • executables
  • 3D models
  • documentation
  • PCB schema, BOM, CPL, and gerber outputs
  • images

Each item also belongs to a broader category such as FILE, DIRECTORY, DOCUMENT, IMAGE, TABLE, or DATA.
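The category names above come from the docs, but how they are encoded is not shown; a minimal sketch, assuming an enum plus a small dataclass, could look like this:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Category names are taken from the documentation; the enum encoding
# and the ItemModule shape are assumptions for illustration.
class ItemCategory(Enum):
    FILE = auto()
    DIRECTORY = auto()
    DOCUMENT = auto()
    IMAGE = auto()
    TABLE = auto()
    DATA = auto()

@dataclass(frozen=True)
class ItemModule:
    name: str                # named artifact type, e.g. "archive"
    category: ItemCategory   # broader category it belongs to

archive = ItemModule("archive", ItemCategory.FILE)
docs = ItemModule("documentation", ItemCategory.DOCUMENT)
```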

Job modules

Job modules define the actual transformations in the graph.

A job module declares:

  • a module name
  • a runtime image
  • its input artifact types
  • its output artifact types
  • an optional configuration dataclass
  • a stage in the execution pipeline

Stages are ordered in the current codebase as:

  1. INITIALING
  2. BUILDING
  3. TESTING
  4. PACKAGING
  5. PUBLISHING

This stage model gives APIS a predictable execution structure while keeping the concrete job logic modular.
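The stage names are taken from the list above; encoding them as an ordered enum (an assumption, not the actual APIS implementation) shows how the ordering yields a predictable pipeline:

```python
from enum import IntEnum

# Stage names come from the docs; the IntEnum values and this sorting
# approach are assumptions for illustration.
class Stage(IntEnum):
    INITIALING = 1
    BUILDING = 2
    TESTING = 3
    PACKAGING = 4
    PUBLISHING = 5

# Sorting jobs by stage reproduces the pipeline ordering regardless
# of the order in which modules were declared.
jobs = [("publish", Stage.PUBLISHING), ("build", Stage.BUILDING)]
ordered = sorted(jobs, key=lambda job: job[1])
print([name for name, _ in ordered])  # ['build', 'publish']
```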

Environment modules

Environment modules represent the execution context and expose a file tree plus execution hooks.

They are intended to model where APIS is running and what project data is available there.

Executor modules

Executor modules represent the component that actually performs execution.

This abstraction keeps job definitions separate from the mechanism that runs them.
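The environment/executor split can be sketched as two small interfaces; the method names here are assumptions, not the real APIS API:

```python
from abc import ABC, abstractmethod

class EnvironmentModule(ABC):
    """Models where APIS runs and what project data is visible there."""

    @abstractmethod
    def file_tree(self) -> list[str]:
        """Expose the file tree available in this execution context."""

class ExecutorModule(ABC):
    """Models how work is actually executed."""

    @abstractmethod
    def run(self, command: str) -> int:
        """Perform the execution and return an exit code."""

class LocalEnvironment(EnvironmentModule):
    def file_tree(self) -> list[str]:
        return ["src/", "docs/"]  # stub: a real module would scan disk

class StubExecutor(ExecutorModule):
    def run(self, command: str) -> int:
        return 0  # stub: a real executor would invoke the command
```

Because jobs only depend on these interfaces, the same job definition can run locally or in CI by swapping the environment and executor modules.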

Configuration model

Job modules can attach configuration objects derived from ApisJobConf.

Those configuration objects are made of typed parameters such as:

  • strings
  • passwords
  • IP addresses
  • URLs

That makes modules configurable without turning every job into a one-off script.
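ApisJobConf is named in the docs, but its definition is not shown; the subclass and field names below are hypothetical, illustrating how typed parameters could be declared for a publication job:

```python
from dataclasses import dataclass

# Minimal stand-in for the real base class (assumption).
@dataclass
class ApisJobConf:
    pass

# Hypothetical configuration for an SCP publication job; the field
# names and types are illustrative only.
@dataclass
class ScpPublishConf(ApisJobConf):
    host: str        # IP address or hostname of the target
    user: str
    password: str    # would be treated as a secret parameter
    target_url: str

conf = ScpPublishConf(
    host="192.0.2.10",
    user="ci",
    password="***",
    target_url="https://example.org/upload",
)
```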

Custom modules

APIS is designed to load modules both from installed packages and from module directories listed in the project configuration.

This means a project can extend the built-in behavior with domain-specific modules for its own infrastructure, artifact types, and delivery rules.
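The actual loading mechanism is not documented here; a minimal sketch of directory-based module discovery, assuming plain Python files in a configured directory, might look like this:

```python
import importlib.util
from pathlib import Path

def load_modules(directory: str) -> list:
    """Import every .py file in a module directory (illustrative only;
    the real APIS loader may differ)."""
    modules = []
    for path in sorted(Path(directory).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules.append(module)
    return modules
```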

The built-in modules currently present in the repository are documented on the next page.