In ADF V2 - how to append a date ("yyyyMMdd") to a filename dynamically for an S3 dataset

Problem description

I'm currently working to automate a pipeline in ADFv2 where the source data sits in S3. A new file is created daily and follows the naming pattern "data_20180829.csv".

I have tried to use dynamic content in the fileName field of the Copy Data activity to accomplish this. However, even when I try something as simple as @{concat('data_','20180829.csv')} (which should resolve to the correct value), the source fails.

Is there any way to see what the dynamic content will resolve to?

Tags: azure-pipelines, azure-data-factory-2

Solution


This should just be a matter of setting the fileName expression on the dataset, e.g.

[Screenshot: Azure Data Factory dataset settings]

Note that the setting is made on the dataset, not at the Copy activity level. Also note that you can make the expression more dynamic by combining the utcnow function with formatDateTime, e.g. something like this:

@concat('data_',formatDateTime(utcnow(),'yyyyMMdd_HHmmss'),'.csv')

Note carefully the case of the format strings: capital MM is the two-digit month (lowercase mm means minutes), and HH is the hour in 24-hour format. The full list of formatting strings is here.
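
As a quick illustration (assuming a hypothetical pipeline run at 2018-08-29 14:30:15 UTC), these expressions would resolve roughly as follows; addDays is handy when the daily file carries the previous day's date:

@formatDateTime(utcnow(),'yyyyMMdd')                 -> 20180829
@formatDateTime(utcnow(),'yyyyMMdd_HHmmss')          -> 20180829_143015
@formatDateTime(addDays(utcnow(),-1),'yyyyMMdd')     -> 20180828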

The JSON for the dataset looks like this:

{
    "name": "DelimitedText3",
    "properties": {
        "linkedServiceName": {
            "referenceName": "linkedService2",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "fileName": {
                    "value": "@concat('data_',formatDateTime(utcnow(),'yyyMMdd_HHmmss'),'.csv')",
                    "type": "Expression"
                },
                "container": "ang"
            },
            "columnDelimiter": ",",
            "escapeChar": "\\",
            "quoteChar": "\""
        },
        "schema": [
            {
                "type": "String"
            },
            {
                "type": "String"
            },
            {
                "type": "String"
            },
            {
                "type": "String"
            }
        ]
    }
}
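
Since the question concerns a source in S3, here is a minimal sketch of the same technique on an Amazon S3 dataset. The linked service name and bucket name are placeholders, and it assumes the delimited-text S3 connector, whose location type is AmazonS3Location:

{
    "name": "S3DailyFile",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AmazonS3LinkedService",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AmazonS3Location",
                "bucketName": "my-bucket",
                "fileName": {
                    "value": "@concat('data_',formatDateTime(utcnow(),'yyyyMMdd'),'.csv')",
                    "type": "Expression"
                }
            },
            "columnDelimiter": ",",
            "escapeChar": "\\",
            "quoteChar": "\""
        }
    }
}

Because the expression lives on the dataset's fileName property, any activity that references this dataset will pick up the current day's file without further configuration.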
