How to publish and download artifacts from multiple jobs to one location on a pipeline?

Problem description

Original question (see the update in the next section)

I want to download files generated by several jobs into a single folder on an Azure pipeline. This is the schema I would like to achieve:

jobs:
  - job: job1
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.1

  - job: job2
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.2

  - job: check_prev_jobs
    dependsOn: "all other jobs"
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          mkdir -p $(Pipeline.Workspace)/previous_artifacts
      - task: DownloadPipelineArtifact@2
        inputs:
          source: current
          path: $(Pipeline.Workspace)/previous_artifacts

where the directory $(Pipeline.Workspace)/previous_artifacts contains only file.1 and file.2, rather than directories job1 and job2 containing /file.1 and /file.2 respectively.

Thanks!

Update

Using @Yujun Ding-MSFT's answer, I created the following azure-pipelines.yml file:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job1\n" > $JOB_NAME.time
          printf "Hash from job1\n" > $JOB_NAME.hash
          printf "Raw from job1\n" > $JOB_NAME.raw
          printf "Nonsense from job1\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1
      
    - job: Job_2
      displayName: job2
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job2\n" > $JOB_NAME.time
          printf "Hash from job2\n" > $JOB_NAME.hash
          printf "Raw from job2\n" > $JOB_NAME.raw
          printf "Nonsense from job2\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2

- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
          DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          path: $(DIR)
          patterns: '**/*.time'
      - bash: |
          ls -lR $DIR
          cd $DIR
        displayName: Check dir content

However, as shown in the screenshot below, I still get each .time file in a separate directory named after its job:

[screenshot: each .time file sits in its own job1/job2 subdirectory]

Unfortunately, it seems to me that what I want may not be possible with Pipeline Artifacts, as explained in the Microsoft discussion. That would be a pity, given that Build Artifacts are deprecated at this point.
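One possible workaround is to flatten the per-artifact subfolders with an extra bash step after the download. A minimal sketch, assuming file names are unique across artifacts and a GNU coreutils `mv` (as on the Ubuntu agents); the temp-dir setup only simulates the layout DownloadPipelineArtifact@2 produces:

```shell
# Simulate the layout DownloadPipelineArtifact@2 creates when downloading
# all artifacts: one subfolder per artifact (job1/, job2/).
DEST="$(mktemp -d)/previous_artifacts"
mkdir -p "$DEST/job1" "$DEST/job2"
printf "Hello from job1\n" > "$DEST/job1/file.1"
printf "Hello from job2\n" > "$DEST/job2/file.2"

# Flatten: move every file up into $DEST, then delete the empty subfolders.
find "$DEST" -mindepth 2 -type f -exec mv -t "$DEST" {} +
find "$DEST" -mindepth 1 -type d -empty -delete

ls "$DEST"
```

On a real pipeline the same two `find` commands would run in a `bash` step right after the download task, with `$DEST` pointing at the download path.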

Tags: azure-devops, azure-pipelines, azure-pipelines-yaml

Solution


In your current situation, we suggest adding the artifactName keyword to your PublishPipelineArtifact task. I modified your script and tested it on my side. Hope it helps:

trigger: none

# pool:
#   vmImage: ubuntu-latest

jobs:
- job: Job_1
  displayName: job 1
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
    persistCredentials: True
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: 'printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1'
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact
    inputs:
      path: $(Pipeline.Workspace)/file.1
      artifactName: job1
  
- job: Job_2
  displayName: job2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script copy
    inputs:
      targetType: inline
      script: 'printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2'
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact copy
    inputs:
      path: $(Pipeline.Workspace)/file.2
      artifactName: job2
      
- job: Job_3
  displayName: Agent job
  dependsOn:
  - Job_1
  - Job_2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: ' mkdir -p  $(Pipeline.Workspace)/previous_artifacts'
  - task: DownloadPipelineArtifact@2
    displayName: Download Pipeline Artifact
    inputs:
      path: '$(Pipeline.Workspace)/previous_artifacts'

My test results are attached: [screenshots of the published artifacts and the downloaded folder]

Update: Because the jobs run in different sessions, we cannot simply copy files or rely on publishing artifacts alone to merge the two jobs' artifacts. I modified your yaml file; this may help you:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job1\n" > $JOB_NAME.time
          printf "Hash from job1\n" > $JOB_NAME.hash
          printf "Raw from job1\n" > $JOB_NAME.raw
          printf "Nonsense from job1\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1

    - job: Job_2
      displayName: job2
      dependsOn: 
      - Job_1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job2\n" > $JOB_NAME.time
          printf "Hash from job2\n" > $JOB_NAME.hash
          printf "Raw from job2\n" > $JOB_NAME.raw
          printf "Nonsense from job2\n" > $JOB_NAME.nonesense
        displayName: Generate files
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job1'
          targetPath: '$(DIR)'
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2
 
- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
          DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job2'
          itemPattern: '**/*.time'
          targetPath: '$(DIR)'
          
      - bash: |
          ls -lR $DIR
          cd $DIR
          cd $(System.ArtifactsDirectory)
        displayName: Check dir content
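A side note on DownloadPipelineArtifact@2 behavior that gets close to the flat layout the question asks for: when the task is given a specific artifact name, that artifact's contents are extracted directly into targetPath with no per-artifact subfolder (the subfolders appear only when downloading all artifacts at once). So downloading each artifact by name into the same folder places the files side by side. A sketch, not from the original answer:

```yaml
# Sketch: one download step per named artifact, all sharing one targetPath.
# With an explicit artifactName the task does not create a subfolder,
# so file.1 and file.2 land next to each other.
steps:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'
    artifactName: 'job1'
    targetPath: '$(Pipeline.Workspace)/previous_artifacts'
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'
    artifactName: 'job2'
    targetPath: '$(Pipeline.Workspace)/previous_artifacts'
```

The trade-off is one download step per artifact, so this scales poorly if the set of publishing jobs is open-ended.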

The build results are attached: [screenshot of the downloaded .time files]

