This document shows the ADF steps that implement the following requirements:

- A source Azure Blob Storage account `adfsource05` with container `sourcecontainer`.
- A target Azure Blob Storage account `adftarget06` with container `targetcontainer`.
- A MySQL table `filetable` that defines two string columns, `sourcefile` and `targetname`. Column `sourcefile` indicates which files (with path) need to be copied from the source blob storage container (`adfsource05/sourcecontainer` in step 1 above) to the target blob storage container (`adftarget06/targetcontainer` in step 2 above); column `targetname` indicates the renamed file name once the file is copied to the target container.
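As a sketch of the data contract, `filetable` rows might look like the following (the paths and file names here are illustrative, not from the source):

```python
# Illustrative `filetable` rows: `sourcefile` is the blob path (with folders)
# inside sourcecontainer; `targetname` is the new file name after the copy.
filetable_rows = [
    {"sourcefile": "reports/2024/jan.csv", "targetname": "jan_final.csv"},
    {"sourcefile": "reports/2024/feb.csv", "targetname": "feb_final.csv"},
]

# Each row means: copy
#   adfsource05/sourcecontainer/<sourcefile>
# to
#   adftarget06/targetcontainer/<same folder>/<targetname>
assert all({"sourcefile", "targetname"} <= row.keys() for row in filetable_rows)
```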
**The following shows how to implement it in Azure Data Factory.**

- Go to ADF and create a pipeline named `DynamicCopyAndRename`, drag a Lookup activity onto the canvas, and name it `LookupMySQLTable`.
- Click the Settings tab of the Lookup activity and add a new Azure MySQL dataset with the corresponding MySQL linked service, as in the following capture.
- Set Use query to Query and give the query statement `select * from filetable`; remember to clear the First row only check box, then click Preview data to check whether the query result is as expected.
- Add a ForEach activity and connect it after the Lookup activity, naming it `ForEachQueryRow`. On the Settings tab, check Sequential, click Add dynamic content in the Items field, and add the dynamic content shown in the following capture (typically `@activity('LookupMySQLTable').output.value`).
- Click the "+" Activities icon inside the ForEach activity and select the Copy Data activity to add it into the ForEach.
- Click this Copy Data activity and go to the Source tab, add an Azure Blob Storage dataset with Binary file format (NOTE: you can choose another file format that aligns with your scenario), create an associated blob linked service pointing to the source blob storage account `adfsource05`, and save.
- Click the Open button to reopen the dataset settings page.
- Click the Parameters tab, add one string-type parameter named `ParamSourceFileWithPath`, and click the Connection tab.
- On the Connection tab, put your source container name `sourcecontainer` in the Container field, then click Add dynamic content in the Directory field and give the dynamic string `@substring(dataset().ParamSourceFileWithPath,0,lastIndexOf(dataset().ParamSourceFileWithPath,'/'))`.
- Similarly, give the following dynamic content in the File name field: `@substring(dataset().ParamSourceFileWithPath,add(lastIndexOf(dataset().ParamSourceFileWithPath,'/'),1),sub(length(dataset().ParamSourceFileWithPath),add(lastIndexOf(dataset().ParamSourceFileWithPath,'/'),1)))`.
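To sanity-check the splitting logic, here are Python equivalents of the two dataset expressions (`lastIndexOf`, `substring`, `add`, `sub`, and `length` map to the operations below; the sample path is illustrative):

```python
# Python equivalents of the ADF dynamic expressions used for Directory and
# File name. Paths are expected to contain at least one '/', as in the doc.
def directory_of(path: str) -> str:
    # @substring(p, 0, lastIndexOf(p, '/'))
    return path[0:path.rfind("/")]

def file_name_of(path: str) -> str:
    # @substring(p, add(lastIndexOf(p,'/'),1),
    #            sub(length(p), add(lastIndexOf(p,'/'),1)))
    start = path.rfind("/") + 1
    return path[start:start + (len(path) - start)]

assert directory_of("in/2024/report.csv") == "in/2024"
assert file_name_of("in/2024/report.csv") == "report.csv"
```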
- Click the pipeline tab to go back to the Copy Data activity settings page.
- Click the Sink tab and create a new Azure Blob Storage dataset with Binary file format (NOTE: you can choose another file format that aligns with your scenario), create an associated blob linked service pointing to the target blob storage account `adftarget06`, and save.
- Click the Open button to reopen the dataset settings page.
- Click the Parameters tab and add two string-type parameters, `ParamNewFileName` and `ParamNewFolderName`.
- Click the Connection tab and put your target container name `targetcontainer` in the Container field. Set the dynamic content `@dataset().ParamNewFolderName` in the Directory field and `@dataset().ParamNewFileName` in the File name field.
- Switch back to the pipeline tab and click the Source tab of the Copy Data activity. Set the dynamic value `@item().sourcefile` for `ParamSourceFileWithPath` under Dataset properties (NOTE: the `sourcefile` in `@item().sourcefile` maps to the `sourcefile` column name of the Lookup activity output).
- Similarly, click the Sink tab of the Copy Data activity. Set the dynamic value `@item().targetname` for `ParamNewFileName` under Dataset properties (NOTE: the `targetname` in `@item().targetname` maps to the `targetname` column name of the Lookup activity output), and set the dynamic value `@substring(item().sourcefile,0,lastIndexOf(item().sourcefile,'/'))` for the `ParamNewFolderName` property.
- Click Publish all to save the pipeline configuration, and click Debug to test whether the copy is performed successfully as expected.
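The overall behavior of the pipeline can be sketched in plain Python (blob copies are simulated as dictionary operations; the rows and paths are illustrative):

```python
# End-to-end sketch of what DynamicCopyAndRename does: Lookup -> ForEach ->
# Copy Data with a dynamic rename. Blobs are simulated as a dict of bytes.
def run_pipeline(filetable_rows, source_blobs):
    """Simulate the pipeline: one copy per filetable row."""
    target_blobs = {}
    for item in filetable_rows:                 # ForEachQueryRow (Sequential)
        src = item["sourcefile"]                # @item().sourcefile
        folder = src[0:src.rfind("/")]          # ParamNewFolderName expression
        dst = f"{folder}/{item['targetname']}"  # folder kept, file renamed
        target_blobs[dst] = source_blobs[src]   # Copy Data activity
    return target_blobs

source = {"in/a/f1.txt": b"one", "in/b/f2.txt": b"two"}
rows = [{"sourcefile": "in/a/f1.txt", "targetname": "f1_new.txt"},
        {"sourcefile": "in/b/f2.txt", "targetname": "f2_new.txt"}]
result = run_pipeline(rows, source)
```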