Cross-site collection file copy with SharePoint Online REST APIs in Microsoft Flow

About

This is a follow up to my previous post: Moving Files with SharePoint Online REST APIs in Microsoft Flow, which explains how to move/copy files with Microsoft Flow using the SharePoint - Send an HTTP request to SharePoint action and the SharePoint Online SP.MoveCopyUtil REST API.

This post demonstrates using a similar approach, but using the CreateCopyJobs and GetCopyJobProgress APIs to copy files from one site collection to another.

I suggest reading the previous post first for context if you haven't already.

Update: Flow now includes both 'Copy file' and 'Move file' actions which implement the functionality described in this post for you. Much easier!

Why not use 'out of the box' Flow actions for this?

While Flow already has a SharePoint - Copy file action, this is limited to copying within the same site (source and destination URLs are web-relative).

At the time of writing there isn't a SharePoint - Move file action.

How does CreateCopyJobs compare to SP.MoveCopyUtil?

Feature | CreateCopyJobs | SP.MoveCopyUtil
Cross-site collection move/copy | Yes | Yes
Copy list item field values | Yes | Yes
Copy version history | Yes | No
Overwrite destination file | No | Yes

While SP.MoveCopyUtil can overwrite an existing destination file, CreateCopyJobs can't. If you try to overwrite an existing destination file with CreateCopyJobs, the subsequent request to GetCopyJobProgress will return an error (within the Logs array), which will cause the Flow demonstrated in this post to terminate with status: Failed.

Creating the Flow

This Flow can be triggered from a single selected file in a document library to create a copy of that file in a document library in another site collection.

In reality this functionality would probably be more useful as part of a larger process (e.g. an approval workflow) and could be made reusable (triggered from multiple other Flows) by replacing the SharePoint - For a selected item trigger with the Request - When a HTTP request is received trigger - see the nested flow pattern.

For a selected item

  • Create a new blank Flow and add the SharePoint - For a selected item trigger:

For a selected item

  • Enter the source SharePoint site URL in the Site Address field via Enter custom value or by selecting your site from the drop-down if your site is listed there.
  • Enter your library GUID in the List Name field.
    • This is required because the List Name control can't currently resolve library names.
    • You can retrieve a library GUID from the URL of the library's settings page, which will look something like _layouts/15/listedit.aspx?List=%7B62f3e702-edd9-4a87-a469-60bb76c0784c%7D. The URL-encoded List query string parameter decodes to {62f3e702-edd9-4a87-a469-60bb76c0784c}; you only need the value inside the curly braces (between %7B and %7D in the encoded URL).
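As an aside, the GUID extraction described above can be sketched in Python (the tenant and site names in the URL are placeholders):

```python
import re
from urllib.parse import unquote

def library_guid(settings_url: str) -> str:
    """Extract the library GUID from a listedit.aspx settings-page URL."""
    decoded = unquote(settings_url)  # turns %7B/%7D back into { and }
    match = re.search(r"List=\{([0-9a-fA-F-]+)\}", decoded)
    if not match:
        raise ValueError("No List GUID found in URL")
    return match.group(1)

url = ("https://tenant.sharepoint.com/sites/siteName"
       "/_layouts/15/listedit.aspx?List=%7B62f3e702-edd9-4a87-a469-60bb76c0784c%7D")
print(library_guid(url))  # 62f3e702-edd9-4a87-a469-60bb76c0784c
```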

Get file properties

This action allows you to retrieve more information about the selected item, such as the file URL.

  • Add the SharePoint - Get file properties action:

Get file properties

  • Enter the source SharePoint site URL in the Site Address field via Enter custom value or by selecting your site from the drop-down if your site is listed there.
  • Select your document library for the Library Name field.
  • Select the ID value from the For a selected item trigger output for the Id field.

Compose CreateCopyJobs body

  • Add a Data Operations - Compose action and rename to Compose CreateCopyJobs body:

Compose CreateCopyJobs body

  • Paste the following JSON as the Inputs value:
{
  "exportObjectUris": [
    "@{body('Get_file_properties')?['{Link}']}"
  ],
  "destinationUri": "https://tenant.sharepoint.com/sites/siteName/Shared%20Documents",
  "options": {
    "IgnoreVersionHistory": false,
    "IsMoveMode": false,
    "AllowSchemaMismatch": true
  }
}
  • The exportObjectUris array item here is the Link to item output from the Get file properties action. If your Get file properties action has a different name, you'll need to adjust accordingly, or just select Link to item using the Flow UI.
  • Set the appropriate destinationUri absolute library URL value. Here the destination library is hard-coded, but you could set this dynamically based on some logic in the Flow.
  • IgnoreVersionHistory: false means that version history will be copied.
  • IsMoveMode: false means that the source file will not be deleted after the copy. Set this to true if you want the source file to be deleted once the file has been copied.
  • AllowSchemaMismatch: true means that the file will be copied (rather than the API throwing an error) in the event that some fields from the source item are missing in the destination library. Set this to false if you want the copy operation to throw an error instead in this scenario.
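For clarity, the same request body can be built with a small Python sketch (the helper function and file URLs below are illustrative placeholders, not part of any SharePoint SDK):

```python
import json

def create_copy_jobs_body(source_file_url, destination_library_url,
                          ignore_version_history=False, is_move=False,
                          allow_schema_mismatch=True):
    """Build the JSON body for a CreateCopyJobs request."""
    return {
        "exportObjectUris": [source_file_url],
        "destinationUri": destination_library_url,
        "options": {
            "IgnoreVersionHistory": ignore_version_history,
            "IsMoveMode": is_move,
            "AllowSchemaMismatch": allow_schema_mismatch,
        },
    }

body = create_copy_jobs_body(
    "https://tenant.sharepoint.com/sites/source/Shared%20Documents/report.docx",
    "https://tenant.sharepoint.com/sites/siteName/Shared%20Documents",
)
print(json.dumps(body, indent=2))
```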

CreateCopyJobs

This action will add the copy operation to a queue.

  • Add a SharePoint - Send an HTTP request to SharePoint action and rename to CreateCopyJobs:

CreateCopyJobs

  • Configure the action as follows:

Properties

Property | Value
Site Address | The source SharePoint site URL
Method | POST
Uri | _api/site/CreateCopyJobs
Headers | See Headers table below
Body | The Output of the previous Data Operations - Compose action

Headers

Name | Value
Accept | application/json; odata=nometadata
Content-Type | application/json; odata=verbose
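Outside of Flow, the same request could be assembled as follows (a sketch only: the site and file URLs are placeholders, the request is built but not sent, and a real call would also need authentication, which the Flow connector supplies for you):

```python
import json
import urllib.request

site_url = "https://tenant.sharepoint.com/sites/source"  # placeholder source site

payload = json.dumps({
    "exportObjectUris": [
        "https://tenant.sharepoint.com/sites/source/Shared%20Documents/report.docx"
    ],
    "destinationUri": "https://tenant.sharepoint.com/sites/siteName/Shared%20Documents",
    "options": {
        "IgnoreVersionHistory": False,
        "IsMoveMode": False,
        "AllowSchemaMismatch": True,
    },
}).encode("utf-8")

# Build (but do not send) the POST request with the headers from the tables above.
request = urllib.request.Request(
    url=f"{site_url}/_api/site/CreateCopyJobs",
    data=payload,
    method="POST",
    headers={
        "Accept": "application/json; odata=nometadata",
        "Content-Type": "application/json; odata=verbose",
    },
)
print(request.get_method(), request.full_url)
```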

Parse CreateCopyJobs response

  • Add a Data Operations - Parse JSON action and rename to Parse CreateCopyJobs response (this name will be referenced later in an expression):

Parse CreateCopyJobs response

  • Select the Body output from the previous CreateCopyJobs action as the Content field value.
  • Paste the following JSON as the Schema field value:
{
    "type": "object",
    "properties": {
        "value": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "EncryptionKey": {
                        "type": "string"
                    },
                    "JobId": {
                        "type": "string"
                    },
                    "JobQueueUri": {
                        "type": "string"
                    }
                },
                "required": [
                    "EncryptionKey",
                    "JobId",
                    "JobQueueUri"
                ]
            }
        }
    }
}

Set CopyJobComplete, Compose GetCopyJobProgress body and Set GetCopyJobProgressResults

These actions set up the variables and data needed for the following do until loop.


Set CopyJobComplete

This variable is used in a do until loop to exit once the copy operation has completed.

  • Add a Variables - Initialize variable action and rename to Set CopyJobComplete.
  • Set the action Type to Boolean.
  • Set the action Value to false.

Compose GetCopyJobProgress body

This action creates the JSON body needed for the GetCopyJobProgress request within the do until loop.

  • Add a Data Operations - Compose action and rename to Compose GetCopyJobProgress body.
  • Paste the following (incomplete) JSON into the Inputs field:
{
  "copyJobInfo": 
}
  • To set the copyJobInfo property value, position the cursor to the right of the : character (as if you were about to type the value), then enter the following Flow expression in the expression dialog and click Update:

first(body('Parse_CreateCopyJobs_response')?['value'])

  • This sets the value of copyJobInfo to the first item in the CreateCopyJobs response's value array, which is an object containing the EncryptionKey, JobId and JobQueueUri properties required for making requests against the GetCopyJobProgress API.
  • The Compose GetCopyJobProgress body action should look like the image above.
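In Python terms, the expression simply picks out the first element of the value array (the response values below are made-up placeholders):

```python
# Hypothetical parsed CreateCopyJobs response
create_copy_jobs_response = {
    "value": [
        {
            "EncryptionKey": "dummyKey==",
            "JobId": "e7c46e55-1111-2222-3333-444455556666",
            "JobQueueUri": "https://example.queue.core.windows.net/spo-copy-queue",
        }
    ]
}

# Equivalent of first(body('Parse_CreateCopyJobs_response')?['value'])
copy_job_info = create_copy_jobs_response["value"][0]

get_copy_job_progress_body = {"copyJobInfo": copy_job_info}
print(get_copy_job_progress_body["copyJobInfo"]["JobId"])
```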

Set GetCopyJobProgressResults

This array variable is used to store the results of the GetCopyJobProgress requests that are made with the do until loop.

  • Add a Variables - Initialize variable action and rename to Set GetCopyJobProgressResults.
  • Set the action Type to Array.
  • Set the action Value to [].

Wait for copy job to complete

This section of the Flow uses a do until loop and the GetCopyJobProgress API to check if the copy operation has completed and retrieve related logs. If the copy hasn't completed, the loop will wait for 5 seconds before checking again, until the Count number of iterations has been reached, or the Timeout time exceeded.

  • Add a do until loop action and rename to Wait for copy job to complete.
  • For the condition, select the Copy job complete variable, choose is equal to, and enter the value true.
  • Expand the Change limits section.
  • Specify a Count value such as 10. This is how many times the GetCopyJobProgress request will be sent before giving up, with a 5 second delay between requests, meaning the loop times out after roughly 50 seconds. You can increase the delay and/or the Count value if dealing with larger files.
  • Optionally adjust the Timeout value to something appropriate, such as PT15M (15 minutes). The specific value isn't critical, provided it's longer than Count * 5 seconds, as the loop ends on whichever condition is met first (number of iterations or total time).
  • See https://en.wikipedia.org/wiki/ISO_8601#Durations for more details on valid values for Timeout.
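The loop's behaviour can be sketched in Python as follows (a simulation: get_progress stands in for the GetCopyJobProgress request, and the non-zero in-progress JobState value is made up; completion is signalled by JobState 0):

```python
import time

def wait_for_copy_job(get_progress, count=10, delay_seconds=5):
    """Poll until JobState == 0 (complete) or the iteration limit is
    reached, mirroring the Flow 'do until' loop. Returns a tuple of
    (completed, collected responses)."""
    results = []
    for _ in range(count):
        response = get_progress()
        results.append(response)
        if response.get("JobState") == 0:  # 0 = copy job complete
            return True, results
        time.sleep(delay_seconds)
    return False, results

# Simulate an API that reports "in progress" twice before completing.
states = iter([
    {"JobState": 4, "Logs": []},
    {"JobState": 4, "Logs": []},
    {"JobState": 0, "Logs": ['{"Event": "JobEnd"}']},
])
done, results = wait_for_copy_job(lambda: next(states), delay_seconds=0)
print(done, len(results))  # True 3
```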

Wait for copy job to complete limits

Add actions within the do until loop

  • The next step is to add actions within the loop, so that it looks like the following image:

Wait for copy job to complete

GetCopyJobProgress

  • Add a SharePoint - Send an HTTP request to SharePoint action and rename to GetCopyJobProgress:

GetCopyJobProgress

  • Configure the action as follows:

Properties

Property | Value
Site Address | The source SharePoint site URL
Method | POST
Uri | _api/site/GetCopyJobProgress
Headers | See Headers table below
Body | The Output of the Compose GetCopyJobProgress body action

Headers

Name | Value
Accept | application/json; odata=nometadata
Content-Type | application/json; odata=verbose

Parse GetCopyJobProgress response

  • Add a Data Operations - Parse JSON action and rename to Parse GetCopyJobProgress response:

Parse GetCopyJobProgress response

  • Select the Body output from the GetCopyJobProgress action as the Content field value.
  • Paste the following JSON as the Schema field value:
{
    "type": "object",
    "properties": {
        "JobState": {
            "type": "integer"
        },
        "Logs": {
            "type": "array"
        }
    }
}

Append to GetCopyJobProgressResults

  • Add a Variables - Append to array variable action and rename to Append to GetCopyJobProgressResults:

Append to GetCopyJobProgressResults

  • For Name specify GetCopyJobProgress results.
  • For Value specify the Body output of the Parse GetCopyJobProgress response action.

If CopyJobComplete

This condition checks to see if the copy operation has completed and either updates the Copy job complete variable to true (which exits the loop), or waits for 5 seconds before the next iteration.

  • Add a Condition and rename to If CopyJobComplete:

If CopyJobComplete

  • For the condition, select JobState from the Parse GetCopyJobProgress response action, choose is equal to, and enter the value 0 (a JobState of 0 indicates the copy job has completed).
  • In the Yes branch, add a Variables - Set variable action which updates the Copy job complete variable value to true.
  • In the No branch, add a Schedule - Delay action, with Count: 5 and Unit: Second.

The finished Wait for copy job to complete loop should look like this (with child actions collapsed):

Wait for copy job to complete

Log CopyJobComplete, Log GetCopyJobProgressResults and Set AllLogs

The two 'Log' actions shown here are Data Operations - Compose actions which are used as a way to inspect the value of variables at a particular point in the Flow when looking at a Flow run (e.g. when debugging). This is a neat trick I picked up from John Liu - cheers John!

The All logs array variable will store reformatted Logs returned by the GetCopyJobProgress API, where the Logs array JSON string values are converted to JSON objects (making them easier to work with).

Log CopyJobComplete, Log GetCopyJobProgressResults and Set AllLogs

Log CopyJobComplete

  • After the Wait for copy job to complete loop, add a Data Operations - Compose action and rename to Log CopyJobComplete.
  • Set the Inputs value to the Copy job complete variable.
  • This makes it easy to tell for a particular Flow run whether the copy completed within the specified timeout or not.

Log GetCopyJobProgressResults

  • Add a Data Operations - Compose action and rename to Log GetCopyJobProgressResults.
  • Set the Inputs value to the GetCopyJobProgress results variable.
  • This gives a view of the GetCopyJobProgress API results (before doing some additional processing, in case there are any errors with this).

Set AllLogs

  • Add a Variables - Initialize variable action and rename to Set AllLogs.
  • Set the Name to All logs.
  • Set the Type to Array.
  • Set the Value to [].

Process GetCopyJobProgress results

  • Add an apply to each loop and rename to Process GetCopyJobProgress results:

Process GetCopyJobProgress results

  • For the Select an output from previous steps field, select the GetCopyJobProgress results variable - as the name suggests, this is looping over the responses of the GetCopyJobProgress API requests.
  • Add another apply to each loop within the Process GetCopyJobProgress results loop and rename to Process Logs.
  • For the Select an output from previous steps field of the Process Logs loop, enter the expression items('Process_GetCopyJobProgress_results')['Logs'] - here we are looping over the Logs in each GetCopyJobProgress response.

Process Logs

Here we want to convert each Logs entry from a JSON string into an object, view the object and add it to the All logs array:

Process Logs

  • Add a Data Operations - Compose action and rename to Log LogItem.
  • Set the Inputs value to the expression json(items('Process_Logs')) - this converts the string to an object.
  • Add a Variables - Append to array variable action.
    • For Name select All logs.
    • For Value select the Output of the Log LogItem action.
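Put together, the two nested loops amount to flattening the responses and parsing each log string (the responses below are placeholders):

```python
import json

# Hypothetical collected GetCopyJobProgress responses; note that each
# Logs entry is a JSON *string*, not an object.
progress_results = [
    {"JobState": 4, "Logs": []},
    {"JobState": 0, "Logs": [
        '{"Event": "JobStart", "ObjectType": "File"}',
        '{"Event": "JobEnd", "ObjectType": "File"}',
    ]},
]

all_logs = []
for result in progress_results:                 # Process GetCopyJobProgress results
    for log_entry in result["Logs"]:            # Process Logs
        all_logs.append(json.loads(log_entry))  # json(items('Process_Logs'))

print([log["Event"] for log in all_logs])  # ['JobStart', 'JobEnd']
```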

Log AllLogs and Filter Errors

As the heading suggests, we want to use the Data Operations - Compose action to display the value of the All logs variable (for debugging when looking at a Flow run), and create another array that only contains errors - this is useful both for inspection during debugging and subsequent logic based on whether or not there were any errors:

Log AllLogs and Filter Errors

Log AllLogs

  • Add a Data Operations - Compose action and rename to Log AllLogs.
  • Set the Inputs value to the All logs variable.

Filter Errors

  • Add a Data Operations - Filter array action.
  • For the From field, select the All logs variable.
  • For the condition, enter the expression item()['Event'] with is equal to and a value of JobError.

If the move or copy failed

The final step of the Flow is a condition which allows handling copy success or failure. In this example we're terminating the Flow as a Failure if there was an issue, so that it's obvious from the Flow history when something has gone wrong. You may want to expand on this, depending on your requirements:

If the move or copy failed

  • Add a condition and rename to If the move or copy failed.
  • Click on Edit in advanced mode.
  • Paste the following expression:

@or(equals(variables('Copy job complete'), false), greater(length(body('Filter_Errors')), 0))

  • The above expression says: If Copy job complete is equal to false (probably indicating a timeout) OR if there were some errors returned in the GetCopyJobProgress Logs, enter the Yes branch, otherwise enter the No branch.
  • Add a Control - Terminate action to the Yes branch.
  • Optionally specify a Code and Message, but these aren't really needed, as you can look at the Flow run (particularly the output of the Log AllLogs and Filter Errors actions) for details if the Flow fails.
  • Any actions that should take place after a successful copy should ideally go after the If the move or copy failed condition (rather than in the No branch) to keep things as clean as possible.
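The final condition and the preceding Filter array step can be expressed together in Python (a sketch; the log entries are placeholders):

```python
def copy_failed(copy_job_complete, all_logs):
    """Mirror the Flow condition: the copy failed if the do-until loop
    timed out (copy_job_complete is False) or any log entry is a
    JobError (the equivalent of the Filter Errors action)."""
    errors = [log for log in all_logs if log.get("Event") == "JobError"]
    return (not copy_job_complete) or len(errors) > 0

print(copy_failed(False, []))                       # True: timed out
print(copy_failed(True, [{"Event": "JobError"}]))   # True: API reported an error
print(copy_failed(True, [{"Event": "JobEnd"}]))     # False: success
```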

Hopefully someone finds the above useful 🙂

@ingrid-schiele-fnt

Hi @zplume,
I've found an article that might interest you: https://techcommunity.microsoft.com/t5/power-apps-power-automate/power-automate-sharepoint-file-copy-not-keeping-the-created-by/m-p/1626026.
It describes some more parameters for configuring CreateCopyJobs:
"IsMoveMode": true,
"MoveButKeepSource": true,
"NameConflictBehavior": 1
With these it is possible to overwrite an existing file in the target and control the version history in different ways.
Cheers, Ingrid

@a-t-u-l-18

Hello,

I have tried to copy/move a document set from one site collection to another, but I'm facing an issue with a multi-line text column that has append changes enabled: only the last appended value is copied to the destination.

Has anyone else faced this issue?

@jalcala11

Hi,

I'm trying to use this API, but always get the BadGateway error.

I saw this note in the Microsoft portal.

Does anyone know about this new API, or how I can get this one to work?

@joymon

joymon commented Sep 19, 2023

@jalcala11 The deprecation applies to CreateCopyJob. It is replaced by CreateCopyJobs - note the trailing 's'.

@joymon

joymon commented Sep 19, 2023

hi @zplume ,
I can see you are checking whether the status is 0 to decide that the job completed, and logging the Logs property. But will this be a success or failure? In case of failure, what number are you expecting? Let me know if I missed something.

@mnoah66

mnoah66 commented Nov 13, 2023

Where do the two connectors "Log AllLogs and Filter Errors" go? Into the "Process logs" loop, or the parent "Process GetCopyJobProgress results" loop?
