
Pull Pipeline intermediate files

The Step downloads Pipeline intermediate files to a local folder.

These intermediate files are generated by Workflows in a Pipeline and are intended to be shared with subsequent Workflows.

Make sure to add this Step after you have uploaded the intermediate files. You can upload the intermediate files using the Deploy to Bitrise.io Step's Files to share between Pipeline Workflows input. The directories you specify will be archived and uploaded as a single file.

When uploading the Pipeline intermediate files, you must assign environment variable keys to them in the Files to share between Pipeline Workflows input. After downloading the files, the environment variable key will point to the file's local path.
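As a sketch, an upload in one Workflow followed by a download in a later Workflow of the same Pipeline might look like this in bitrise.yml. The `path:ENV_KEY` value format, the `pipeline_intermediate_files` input ID, and the `INSTALLER_DIR` key are assumptions for illustration; verify them against the Deploy to Bitrise.io Step's documentation:

```yaml
workflows:
  build:
    steps:
    - deploy-to-bitrise-io:
        inputs:
        # Archive ./build/installer and expose it to later Workflows
        # under the INSTALLER_DIR environment variable key.
        # (path:ENV_KEY format as best recalled -- verify against
        # the Step's documentation.)
        - pipeline_intermediate_files: ./build/installer:INSTALLER_DIR
  test:
    steps:
    # Downloads the shared files; INSTALLER_DIR now points to the
    # extracted local directory.
    - pull-intermediate-files: {}
    - script:
        inputs:
        - content: |
            ls "$INSTALLER_DIR"
```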

When a downloaded file was originally a directory, it is extracted, and the specified environment variable points to the directory's local path.

By default, all files shared by any Workflow of the Pipeline are downloaded. This can be limited by setting the Intermediate file source input.

Please note that this Step is designed to run in CI environments only.

Configuring the Step

To configure the Step:

  1. Specify which Workflows' intermediate files to download in the Intermediate file source input. By default, all Workflows' intermediate files are downloaded.

NOTE: You can list multiple Workflows by separating them using a comma. For example: {workflow1},{workflow2}

  2. (Optional) Set the Enable verbose logging input to true if you want to log additional information for debugging purposes.
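Put together, a minimal bitrise.yml sketch of the Step's configuration. The input IDs `artifact_sources` and `verbose` are assumptions based on the input titles above; confirm them in the Step's step.yml:

```yaml
- pull-intermediate-files:
    inputs:
    # Only download intermediate files shared by these Workflows
    # (comma-separated list; default is all Workflows).
    # Input ID assumed -- verify against the Step's step.yml.
    - artifact_sources: workflow1,workflow2
    # Log additional information for debugging.
    - verbose: "true"
```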

Similar steps

Deploys build artifacts to make them available to the user on the build's Artifacts tab, sends test results to Test Reports (the build's Tests tab), and uploads Pipeline intermediate files to make them available in subsequent Workflows, as well as Bitrise- and user-generated HTML reports.

Creates a pull request on GitHub with the specified details.

A Step to retrieve your cache from an S3 bucket using custom keys with fallback. It should be used with the s3-cache-push Step, which stores the cache. If you want to retrieve multiple items, you need to run this Step multiple times. Bucket access: for this Step to work, you need an AWS user with programmatic access to the bucket and permissions to list, get, and put objects in it. You can set the credentials as Bitrise Secrets with the keys specified in the inputs, or set them directly in the inputs.

Share environment variables between Pipeline Workflows.