Manipulating SharePoint content using Flow

One of the great benefits of Flow is to help you manage content in SharePoint.  There are 10 triggers and over 40 Flow actions available out of the box that allow you to access and update your SharePoint content.

Retrieve content

  • Get attachments
  • Get entity values
  • Get file properties
  • Get files (properties only)
  • Get items
  • List folder
  • Extract Folder
  • Get all lists and libraries
  • Get attachment content
  • Get file content
  • Get file content using path
  • Get file metadata
  • Get folder metadata
  • Get folder metadata using path
  • Get item
  • Get list views
  • Get lists
  • List root folder

Create content

  • Copy file
  • Copy folder
  • Create file
  • Create item
  • Create sharing link for a file or folder
  • Move file
  • Move folder
  • Copy file (deprecated)

Update content

  • Add attachment
  • Delete attachment
  • Delete file
  • Delete item
  • Update file
  • Update file properties
  • Update item

Other actions

  • Approve hub site join request
  • Cancel hub site join request
  • Join hub site
  • Resolve person
  • Send an HTTP request to SharePoint
  • Set content approval status
  • Set hub site join status to pending


In many cases, the actions available to retrieve content will suffice.  However, there are times where you may need to go beyond these actions to access your data.  One such example is updating the name and URL of a link in a list or library item.

Send HTTP request to SharePoint Action

One of the actions mentioned above is the Send HTTP request to SharePoint action.  This action enables you to interact with most aspects of your SharePoint Online environment (assuming your Flow account has the right permissions).  The other actions are more specific in their functionality.

The Send HTTP request to SharePoint action leverages the SharePoint REST service to achieve this using an extensive set of Create, Read, Update, and Delete (CRUD) operations.

Example 1: Updating Links

Let's look at a simple example of how to use the Send HTTP request to SharePoint action.  Suppose you have a list where you want to store your favourite links.  For each link, you want to capture the following information:


  • Title - a description of the link
  • URL
  • Link name - a short name

Using the OOTB Create Item action, you are not able to set the Link name property on the link.




As a result, when you create the item, both the link URL and the link name will be set to the URL.  Clearly, this is not what you were hoping for.


Now, if you use the Send HTTP request to SharePoint action instead of the Create Item action, you can provide the additional information needed to properly set the link name.


There are a number of parameters required for this action that are standard in making REST calls.

  • Site Address - the site collection you are working in (the Flow account needs access to it)
  • Method - what type of RESTful operation will you be performing?
    • GET - retrieve content
    • POST - create some content
    • PUT - update content
    • DELETE - delete content
    • PATCH - make partial updates to content (e.g. only a few fields)
  • Uri - the URL to the entity on which you will be performing the action
  • Headers - carry metadata about the request and response.  Common examples include:
    • Authorization - carries the credentials that authenticate the client for the resource being requested.
    • WWW-Authenticate - sent by the server when it requires authentication before it can respond with the requested resource, often along with a 401 (Unauthorized) response code.
    • Accept-Charset - set with the request to tell the server which character sets the client accepts.
    • Content-Type - indicates the media type (e.g. text/html or application/json) of the request or response body, which helps the receiver process it correctly.
    • Cache-Control - the cache policy defined by the server for the response; a cached response can be stored by the client and re-used until the time defined by this header.
  • Body - contains the content to be used for the REST call.  Each metadata field is assigned a single value or an array of values.
In the example above, the Body is divided into 3 different metadata fields:


  • '__metadata': { 'type': 'SP.Data.SP_x0020_HyperlinksListItem' } - defines the item as a list item in the SP Hyperlinks list.  (NOTE:  the space in SP Hyperlinks is replaced in SharePoint by _x0020_)
  • 'Title': 'Title' - the title of the list item
  • 'Link': { '__metadata': { 'type': 'SP.FieldUrlValue' }, 'Description': 'Link name', 'Url': 'URL' } - the Link field value.  As you can see, the Description and Url are provided separately and assigned values from the Flow trigger.
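Putting it together, the Body of the action might look like the sketch below.  The list and type names come from this example (your internal names will differ), and the Title, Description, and Url values are placeholders for the dynamic content coming from the trigger:

```json
{
  "__metadata": { "type": "SP.Data.SP_x0020_HyperlinksListItem" },
  "Title": "My favourite Flow blog",
  "Link": {
    "__metadata": { "type": "SP.FieldUrlValue" },
    "Description": "Flow blog",
    "Url": "https://flow.microsoft.com/blog"
  }
}
```

A POST to a Uri such as _api/web/lists/getbytitle('SP Hyperlinks')/items with this body creates the item.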

As you can see, the link name and URL are properly set in the second item.


Example 2:  Retrieving items

When working with the Send HTTP request to SharePoint to retrieve items, you use similar settings on the action.  Rather than using the POST method, you'll be using the GET method.  As well, since you are only retrieving information, no body is required to complete the action.  In the example below, the action will retrieve all the items in the SP Hyperlinks list.
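As a sketch, the GET version of the action for this example might use a Uri along these lines (the list name comes from this example):

```
_api/web/lists/getbytitle('SP Hyperlinks')/items?$select=Title,Link
```

You can append standard OData query options such as $filter or $top to the Uri to narrow down the results.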

Caveat

Whether you use the Get Items or Send HTTP request to SharePoint action to retrieve the items in a list, the items returned will be the same; however, there are slight differences in the JSON format returned, which you need to consider when iterating through the items to retrieve specific values.
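For example, the Get items action returns the item array under a top-level value key, while the Send HTTP request to SharePoint action (with the verbose OData format) nests it under d.results.  A small Python sketch of the difference, using illustrative payloads:

```python
# Illustrative payloads: the exact shape depends on the action and Accept header you use.
get_items_response = {"value": [{"Title": "My favourite blog"}]}         # Get items action
http_response = {"d": {"results": [{"Title": "My favourite blog"}]}}     # Send HTTP request (odata=verbose)

def titles_from_get_items(body):
    # Get items returns the item array under the top-level "value" key.
    return [item["Title"] for item in body["value"]]

def titles_from_http_request(body):
    # The verbose OData format nests the item array under d.results.
    return [item["Title"] for item in body["d"]["results"]]

print(titles_from_get_items(get_items_response))    # ['My favourite blog']
print(titles_from_http_request(http_response))      # ['My favourite blog']
```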

Strategy for enabling quick Flow Trigger or Action replacement

Every flow begins with a trigger.  In some cases, the business needs may change, which will require you to change the type of trigger being used.  If there are any references being made to dynamic content coming from the trigger, then you will notice, as shown in Figure 1, that the Delete option for the trigger has been disabled.  This is a mechanism used by Flow to ensure that you are not referencing dynamic content that is not available.  (NOTE:  it's possible to use expressions that refer to some dynamic data, which would not be checked by Flow but would result in run-time errors.  Those examples are beyond the scope of this article.)

Figure 1:  Trigger Delete option is disabled when dynamic content is referenced in actions

The way to overcome this limitation is by storing the dynamic content you are referencing from the trigger in variables or other actions downstream.  Figure 2 shows an example, where the SharePoint list item ID is stored in a variable called Item ID.  Actions that need to refer to the trigger's Item ID can simply refer to the variable instead.

Figure 2:  Variable being used to store dynamic content from the trigger

Now, if you need to delete the trigger, simply delete its reference used in the Item ID variable.  All other actions downstream of the variable definition that refer to it will not be affected by this change.  As you see in Figure 3, the Delete option will now be enabled again on the trigger.  


Figure 3:  Removing reference to trigger dynamic content enables the deletion of the trigger


Once you delete the trigger, you will be able to select a new one as shown in Figure 4.

Figure 4:  Selection of a new trigger

In some cases, you may want to reference several dynamic content elements from the trigger.  Though it's possible to use one variable for each, it can become overwhelming to create so many variables.  Instead, you can use the Parse action to store the output of the trigger.  To properly reference the fields within the Parse action, you need to define the schema.  One simple way to do this is to run the Flow without a schema (by setting the schema to {}) and looking at the Body of the Parse action, as shown in Figure 5.

Figure 5:  Getting the schema of the trigger body

Save the body text and go back to editing the Flow.  Now, you can select the Use sample payload to generate schema for the Parse action to generate the schema.  Make sure that all fields are properly filled in the sample data; otherwise Parse will ignore them.
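As a reference point, a trimmed-down schema for a SharePoint item trigger might look like the sketch below; a real payload includes many more fields, and the exact names depend on your list:

```json
{
  "type": "object",
  "properties": {
    "ID": { "type": "integer" },
    "Title": { "type": "string" },
    "Modified": { "type": "string" }
  }
}
```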

You can now reference the parameters from the Parse action schema in your downstream actions as depicted in Figure 6.  Keep in mind though that as you switch triggers, the type of data and variable names may change and you may need to modify the data to adjust it.  For example, a SharePoint list item identifier is stored in a field called ID.  If you work with a Dynamics 365 record, the identifier of the record is ItemInternalId.

Figure 6:  Referencing Parse parameters in downstream actions

You can even apply a similar approach to various actions throughout your Flow, where you may need to swap it out for another action.

SharePoint Saturday Detroit 2018

I'm excited to announce that I'll be returning to SharePoint Detroit for a third year in a row.  This year, I'll be presenting Business Process Automation is not Dead!

We've heard it all before - SharePoint solutions are dead. SharePoint Designer is dead-ish. InfoPath is not far behind. So what does all this mean for your workflow and forms solutions in SharePoint Online or your hybrid environment? Over the past couple of years, Microsoft has released a number of new apps and services that not only fill the gaps, but provide you with great flexibility.


Join me in this session as I introduce some of these new technologies, including PowerApps, Forms, Flow, LogicApps, Azure Functions, and Azure WebJobs, to name a few, and discuss the scenarios where each is best leveraged.



Flow Approval Workflows with Office 365 and Azure AD Groups

If you've had a chance to use Flow approvals, you'll agree that it's very easy to configure them and have users respond to them, even from within their email applications.  Building approval workflows is very simple.  All you need is a trigger that will cause the approval to kick off, and a few key pieces of information:



  • Approval type - whether one person or everyone in the Assigned to field must respond to it before the workflow continues
  • Title - this will be the title of the email sent and the approval item in the assignee's environments
  • Assigned to - who will be approving the workflow
  • Requestor - who is requesting the approval

Workflow Assignees

The Assigned to field is made up of one or more entries.  Each entry will receive an email with the approval notification.  There are three types of email-enabled entities in Office 365:
  • Individual users
  • Office 365 Groups
  • Azure AD Groups
The case of sending approval requests to individuals is straightforward.  Simply add the individual's email address in the Assigned to field and you're done.  However, it's a bit more complex when dealing with Office 365 or Azure AD groups.  Currently, the Approval workflow does not distinguish between individuals and groups.  If you try to type the name of a group, you will notice that the lookup will not resolve it.  In my case, I have two groups, office365group and azureadsecuritygroup.  Here's what happens when I try to add them:


So what do you do if you need to send approvals to members of groups?  You need to first identify what type of group you have and then retrieve the individual users.


Claims Encoding

One way to tell what type of group you have is to look at the claim that is associated with it.  Each email type is made up of different parameters in the claim encoding.  A claim is made up of 5-6 components, as defined in the SharePoint 2010/2013: Claims Encoding:

<IdentityClaim>:0<ClaimType><ClaimValueType><AuthMode>|<OriginalIssuer (optional)>|<ClaimValue>


When you look at the claims returned for an individual, Office 365 Group, or Azure AD Group, you'll notice the following patterns:

  • Individual - i:0#.f|membership|<user email>
  • Office 365 Group - c:0o.c|federateddirectoryclaimprovider|<GUID>
  • Azure AD Group - c:0t.c|tenant|<GUID>
Without understanding all the intricate details of each claim type, you can use these formats as fingerprints for the type of email address you're dealing with.
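These fingerprints are easy to check programmatically.  As an illustrative sketch (outside of Flow itself), here is the test in Python:

```python
def classify_claim(claim: str) -> str:
    """Classify an Office 365 claim string by its encoded prefix."""
    if claim.startswith("i:0#.f|membership|"):
        return "individual"
    if claim.startswith("c:0o.c|federateddirectoryclaimprovider|"):
        return "office365-group"
    if claim.startswith("c:0t.c|tenant|"):
        return "azuread-group"
    return "unknown"

print(classify_claim("i:0#.f|membership|jane@contoso.com"))  # individual
```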

Resolving Email Addresses

Now that we know how to distinguish between the various email types, the next step is to provide different instructions for your Flow to deal with each.  Below is a snapshot of how I generate a list of all approval emails:



It looks complex, but is fairly straight-forward.  Here's the breakdown:

  1. Initialize a string variable that will hold all the email names, e.g. Approval Emails
  2. Get a list of all the email objects.  In my case, I retrieve it from a SharePoint list item People field.  I then iterate through each of the objects as follows
    1. Determine the claim format before the second pipe (i.e. |)
    2. If it's a single user, simply add their email, preceded by a semicolon, to the Approval Emails string variable
    3. If it's an Azure AD Group or Office 365 Group, get the appropriate group by looking at the claim GUID following the second pipe (NOTE: make sure you use the correct connector to retrieve the groups)
    4. For each group, iterate through the list of users and add their email, preceded by a semicolon, to the Approval Emails string
Once you've finished iterating, you can then assign the Approval Emails string to the Approval Request action.
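The iteration above can be sketched in Python.  Note that resolve_group_members is a hypothetical stand-in for the Office 365 Groups or Azure AD connector call that returns a group's member emails:

```python
def resolve_group_members(guid: str) -> list:
    # Hypothetical stand-in for the Office 365 Groups / Azure AD connector,
    # which would return the member emails for the group with this GUID.
    sample_groups = {"1234-guid": ["alice@contoso.com", "bob@contoso.com"]}
    return sample_groups.get(guid, [])

def build_approval_emails(claims: list) -> str:
    emails = ""
    for claim in claims:
        parts = claim.split("|")
        prefix, value = "|".join(parts[:2]), parts[2]
        if prefix == "i:0#.f|membership":
            emails += ";" + value  # single user: add the email directly
        else:
            # Group claim (Office 365 or Azure AD): resolve members
            # via the appropriate connector, then add each email.
            for member in resolve_group_members(value):
                emails += ";" + member
    return emails.lstrip(";")

print(build_approval_emails([
    "i:0#.f|membership|jane@contoso.com",
    "c:0t.c|tenant|1234-guid",
]))
# jane@contoso.com;alice@contoso.com;bob@contoso.com
```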

Caveats

As with many solutions, there are some potential caveats here:

  1. Approvals require at least one email address.  Make sure that the list of approvers (composed of the individuals and groups) constitutes at least one email address.
  2. If your Azure AD group includes other groups, you'll need to modify this mechanism to drill down to extract the users as well.  This can be done using loops or recursion.

PowerApps Sharing screen just got a Face Lift

Say hello to the new PowerApps Sharing screen.  In the past, you were able to select users and share the app.  With the new UI, you can see the specific permissions the users will need, such as gateways, APIs, entities, and various connectors.


As a PowerApp creator, this can save you time: you no longer have to remember to verify permissions, or field questions from users who don't have the required access.

Staying Healthy with Flow and PowerApps

Getting sick is never fun, especially with small kids at home.  This is why it’s important that we take our meds quickly to avoid prolonging our recovery.  In the past, I would stock up on some common meds to be ready just in case I needed them.  However, I would still find myself either missing some, buying the same ones I already had, or having expired ones because I bought too many in the past.  Well, now that I have Flow and PowerApps, that is all in the past.

In this post, I’m describing a simple MedApp I built to help me keep track of the meds I have at home and replenish any that are about to expire.  The app serves a few purposes:

  • Keeps track of what meds I have at home and when they expire
  • Notifies me when meds are about to expire

The solution is made up of three parts: a SharePoint list, a Flow, and a PowerApp.

SharePoint List

The SharePoint list is the backend that stores all my key information.  A few fields were required for this solution:

  • Title - Name of medication (Single line of text)
  • Description - Description of medication (Multiple lines of text, plain text)
  • Expiry Date - When the meds expire (Date)
Of course, you can track additional information about each drug if you choose to.

Flow

The solution has a time-triggered Flow that checks on a nightly basis if there are any meds that are about to expire (within two weeks) and sends a notification via email.  There are three actions and one control required to build this flow.

  1. Get a list of all items whose expiry date is within 15 days.  To do so, I'm leveraging the following Flow expression in the Get items action under the Filter Query field:  formatdatetime(adddays(utcnow(),15),'yyyy-MM-dd')
  2. Create an HTML table for all the items using the Create HTML table action.  I only need the Title and Expiry Date (Expiry_x0020_Date) for my email.
  3. Finally, I use the Send email action to notify me of the medications that are about to expire.

The flow can easily be expanded to add items to a shopping list for other apps and services, such as Amazon.  In that case, you will likely need to add additional fields to the SharePoint list to add the item ID or links for these medications in the various apps or services.
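The cutoff produced by that Filter Query expression is simply today's UTC date plus 15 days, formatted as yyyy-MM-dd.  A Python sketch of the equivalent logic, assuming the internal column name Expiry_x0020_Date:

```python
from datetime import datetime, timedelta

def expiry_filter_query(today: datetime, days: int = 15) -> str:
    # Mirror of formatdatetime(adddays(utcnow(), 15), 'yyyy-MM-dd').
    cutoff = (today + timedelta(days=days)).strftime("%Y-%m-%d")
    # OData filter for the Get items action: items expiring on or before the cutoff.
    return f"Expiry_x0020_Date le '{cutoff}'"

print(expiry_filter_query(datetime(2018, 10, 1)))  # Expiry_x0020_Date le '2018-10-16'
```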

    PowerApp

    To keep things simple, I used the Start from data template and modified a few fields to achieve the look shown above.



    In particular, in order to get the EXPIRED label to show up, I'm using the following formula for the Visible parameter:  DateDiff(Today(),ThisItem.'Expiry Date',Days)<0

    Similarly, I'm using the expiry date to determine whether the bar on the left-hand-side of an item should be white (default), yellow (within 30 to 15 days of expiry), or red (less than 15 days of expiry).
    If(
        DateDiff(Today(), ThisItem.'Expiry Date', Days) < 15,
        RGBA(255, 0, 0, 1),
        If(
            And(
                DateDiff(Today(), ThisItem.'Expiry Date', Days) >= 15,
                DateDiff(Today(), ThisItem.'Expiry Date', Days) < 30
            ),
            RGBA(255, 255, 0, 1),
            RGBA(255, 255, 255, 1)
        )
    )
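The colour logic in that formula can be sketched in Python for clarity (the colour names stand in for the RGBA values):

```python
def expiry_color(days_to_expiry: int) -> str:
    # Mirrors the PowerApps If(...) formula: red under 15 days,
    # yellow between 15 and 29 days, white otherwise.
    if days_to_expiry < 15:
        return "red"
    if 15 <= days_to_expiry < 30:
        return "yellow"
    return "white"

print([expiry_color(d) for d in (5, 20, 45)])  # ['red', 'yellow', 'white']
```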

    It's a simple solution, but it will hopefully help you stay healthy by keeping your meds nearby, and prevent you from wasting money buying too many.

    Creating PDF documents with images using Flow

    One of the nice document generation features of Flow is the ability to create PDF documents from structured data, such as HTML.  However, if you want to integrate images, you may not see the expected result.  For demonstration purposes, suppose you want to create a PDF document that includes an image.  The flow may look something like the image below:


    The Flow is broken down into 4 actions:

    1. Compose HTML File - used to compose the HTML body.  In it, you can see the HTML markup that points to an image (e.g. https://<YOUR TENANT>postmediacanoe.files.wordpress.com/2018/09/jackolanterns1000.jpg).
    2. Create file - this is where the HTML is converted into a temporary file, temp.html
    3. Convert file using path - here, the HTML file is converted into PDF.  Note that this action doesn't actually save the PDF file; it just converts the content.
    4. Create file 2 - now, I use the content from action 3 to save the PDF content into a PDF file.
    If you look at the temp.html file, you will see the image as expected:

    However, if you try to open the corresponding PDF file, the image is missing.

    The reason for this is that the Convert file using path action is expecting all the content to exist within the source document (the HTML file).  So, the way to get around this is by extracting the content of the image and including it directly within the HTML body.  Let's have a look at the updated Flow:
    I added the Get file content using path action to capture the content of the Halloween.png file.  Next, I modified the Compose action (specifically the <img ...> tag) to include the content of the image.  For the image content itself, you need to include the following expression:


    body('Get_file_content_using_path')?['$content']


    Note that the name of the body object may be different in your case.  Now, when you generate the PDF, you get the image included in it.
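The $content value is the file's base64-encoded bytes.  One common way to inline such content in HTML is a data URI in the img tag; the following Python sketch illustrates the idea (the image bytes and the {IMAGE} placeholder are illustrative):

```python
import base64

def embed_image(html_template: str, image_bytes: bytes, mime: str = "image/png") -> str:
    # Base64-encode the image, mirroring the $content value Flow exposes,
    # and inline it in the <img> tag as a data URI so the HTML is self-contained.
    content = base64.b64encode(image_bytes).decode("ascii")
    return html_template.replace("{IMAGE}", f"data:{mime};base64,{content}")

html = embed_image("<img src='{IMAGE}'/>", b"\x89PNG...")
print(html.startswith("<img src='data:image/png;base64,"))  # True
```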

    Final Thoughts

    As you can see in the final image, the Halloween picture is larger in the PDF file than in the HTML file.  Make sure to structure your HTML carefully to properly size the images and other content.

    Bypassing Connector Authentication in PowerApps

    Have you ever had a PowerApp request your permission to access various services before it would load?  If so, then that PowerApp is leveraging these connectors in some way.  But what if you don't want each user to have to do that?  What if some of the services require permissions that the end user doesn't have?  In this post, I'll share an interesting work-around to this problem, broken down into 3 scenarios.



    When you are using Connectors in PowerApps, each user accessing the PowerApp needs to give permission to the specific connector the first time they use it.  If there are multiple connectors, then the user needs to authenticate each of them, as shown below.



    For this post, I've built a simple app to display the user's given name.  To get the given name, I'm using the Office 365 Users connector.



    So, how do we get it without having every user give explicit permissions to the Office 365 Users connector?

    Scenario 1 - Getting the Given Name directly from PowerApps

    In this first scenario, I have added the Office 365 connector directly to the PowerApp (View > Data sources > + Add data source > Office 365 Users).  The text field then uses the Default property to set the value using Office365Users.UserProfileV2(User().Email).givenName


    If you try to access this PowerApp from another account, the first time, the user will be prompted with the following message:


    No good!  Try again.

    Scenario 2 - Use PowerApp to call a Flow to retrieve the Given Name

    In the second scenario, I have removed the Office 365 connector from the PowerApp and replaced it with a Flow called Get Given Name.  This way, the Flow is doing the work with the connector.  The Flow is fairly simple in structure, as shown below.



    I also had to make the following modifications to the PowerApp

    1. Remove the Office 365 Users connector (View > Data sources > Office 365 Users > ... > Remove), as there's no more need for it.
    2. On Screen1, add a Flow for OnStart (Screen 1 > Actions > Flows > Get Given Name) to connect the Flow to the PowerApp when the PowerApp is launched.
    3. Change the function for OnStart to Collect(GivenName,'Get Given Name'.Run(User().Email)) so the value gets loaded into a Collection.
    4. Change the text field Default value to First(GivenName).givenname.  As the result is stored in a collection, we want to retrieve the first record and get the givenname field.

    Save, publish, and try again.  Strike 2!  The Flow is keeping the context of the user and is still requesting the user for permission.

    Scenario 3 - Leverage a Secondary Flow to do all the Connector work

    In this third scenario, I am leveraging a secondary flow to access the Office 365 connector to retrieve the user's given name.  The secondary flow is invoked using an HTTP trigger.


    Here's what the revised first Flow looks like


    By doing so, neither the Flow that is called from the PowerApp nor the PowerApp itself ever leverages the Office 365 connector directly.


    Avoiding never-ending Flow triggering

    When Microsoft created the Do Until control action, they fortunately took care to avoid situations where loops would run forever by adding control limits that terminate the loop even if the logic condition is not met.  By default, a Do Until loop will terminate after 60 iterations or 1 hour (PT1H in ISO 8601 notation).  Using these controls, you can allow the actions to loop up to 5,000 times or for up to 1 month (P1M in ISO 8601 notation).


    However, when building flows that involve triggers which respond to item changes, it is possible to get into situations, where the same Flow will call itself forever.  Consider the example below.


    In this example, the Flow is triggered by the When an item is created or modified trigger when an item is changed in the Email Recipients SharePoint list.  Following the trigger is an action that modifies the same item which triggered the Flow in the first place.  As a result, another iteration of the same Flow will be triggered as soon as that action completes.

    You may not be able to completely avoid situations where a Flow that is triggered by an item change event will change the same item.  In those scenarios, you could create an additional field whose sole purpose is to determine whether the item should be updated again or not.


    Now, your field will be used to control when the changes should be occurring to the item and, thus, when the Flow should be triggered.
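The pattern can be sketched in Python, using a hypothetical ContinueProcessing flag field as the guard:

```python
def should_process(item: dict) -> bool:
    # Only run the update logic when the flag field asks for it; since the
    # update resets the flag, the re-triggered run exits immediately.
    return item.get("ContinueProcessing", False)

def update_item(item: dict) -> dict:
    item["ContinueProcessing"] = False  # reset the guard as part of the update
    item["Processed"] = True
    return item

item = {"Title": "Recipient A", "ContinueProcessing": True}
if should_process(item):
    item = update_item(item)
print(should_process(item))  # False -> the next triggered run does nothing
```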

    Conditions, Loops, and Switches have moved

    If you've been using Flow recently, you may have noticed that the Condition, Do Until, Apply to each, Switch, and Scope controls have moved.  Previously, they were available as part of the ...More button next to the + New step button, or as part of the popup menu.



    No more.  You will notice that the ...More button has been replaced with a Save button.



    Instead, they are now integrated as new actions.



    A bit confusing, but same functionality.

    Sending Custom Emails with Flow

    About a year ago, I wrote an article on how you can Mail Merge using Flow and a couple of SharePoint lists.  This works great for sending the same email to all recipients, or even structuring the email in different ways based on metadata captured in the Flow.

    However, I have faced more and more scenarios where Flows are created using a service account with elevated permissions that only a select number of users have access to, typically IT.  The content of the email, however, may be set by someone else, such as a marketing or communications person.  In this scenario, changing the email template would require contacting the IT person and requesting specific changes to be made - not ideal.

    In this article, I am discussing the concept of using tokens inside an email template to allow easy formatting without any access to the Flow itself.

    Creating the Email Template


    We begin again with the email template.  In this example, we will create a template that has a few tokens incorporated in it.  I will create my email template in HTML format, but that is not a strict requirement.  Here's my template:

    <h1>Welcome email</h1>
    <p>Hello <--FIRSTNAME-->,</p>
    <p>I have to say, your last name "<--LASTNAME-->" is very nice.</p>

    Not the best example, but I hope you get the point.  I am storing this example in my email template list in SharePoint.


    Building the Flow


    The flow is composed of four steps

    1. Get the email template
    2. Get list of recipients
    3. Replace tokens with desired text
    4. Send the emails

    Get the email template

    Start by retrieving the email template using the Get items action.  To limit the results returned, use the Filter Query option.



    Make sure that you only have one email template with the same title; otherwise, you may get the wrong one.  This can be enforced by making the Title field in the SharePoint list unique.

    Get list of recipients

    Similarly to getting the template, use the Get items action to retrieve the list of users who should receive an email.  In my case, I have a field called SendEmail to indicate whether a user should receive an email or not.


    Replace tokens with desired text

    Here's where the fun begins.  I used the Compose action for each token to replace the token placeholder with the actual value.


    Before updating the tokens, I store the email template in a variable.  This way, I can always retrieve the original template for the next email being generated.  You may recall that I have two tokens in this example - <--FIRSTNAME--> and <--LASTNAME-->.  Each one is being replaced separately, although it's possible to do all the updates in a single action.

    To replace the <--FIRSTNAME--> token with the name from the user in the SharePoint list, I use the following expression in the Compose action.

    replace(variables('Email Template'),'<--FIRSTNAME-->',items('For_each_Recipient')?['First_x0020_Name'])


    This action effectively looks up the <--FIRSTNAME--> token in my Email Template variable declared above and replaces it with the First Name of the user from the SharePoint list item.

    Next, I proceed with replacing the <--LASTNAME--> token with the Last Name of the user from the SharePoint list item.

    replace(outputs('Replace_First_Name'),'<--LASTNAME-->',items('For_each_Recipient')?['Last_x0020_Name'])

    You'll note that the second expression input is different.  That is because the input for the second Compose action is in fact the output of the Replace First Name Compose action.
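The two chained replace expressions behave like the following Python sketch, using the template and token names from this example:

```python
def render_template(template: str, first_name: str, last_name: str) -> str:
    # Mirror of the two chained Compose actions: the output of the first
    # replace() feeds the second, so both tokens end up substituted.
    result = template.replace("<--FIRSTNAME-->", first_name)
    return result.replace("<--LASTNAME-->", last_name)

template = '<p>Hello <--FIRSTNAME-->, your last name "<--LASTNAME-->" is very nice.</p>'
print(render_template(template, "Jane", "Doe"))
# <p>Hello Jane, your last name "Doe" is very nice.</p>
```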

    Send Email

    In the final step, I am sending the email to the users by looping through the list of all recipients from the Get Email Recipients action.  The body of the email is Output, which is the output of the Replace Last Name Compose action.


    Make sure that if you're structuring your email template as HTML (as I did), that you indicate in the Send an email action that it's HTML format.  Otherwise, you'll get a lot of tags in the body of your email.

    Once the email is sent, I then reset the Email Template variable to what it originally was with the token place holders so that the next iteration can correctly replace them.

    This approach works well, but there are two caveats to consider:
    1. If the tokens are incorrectly written, then they will not be updated in the email itself.
    2. If you want to add additional tokens, then IT will need to add rules for parsing and replacing them.

    Keep it Flow-ing...

    Triggering a single Flow from multiple SharePoint Lists or Libraries

    One of the many great uses for Flow is to perform certain actions when an item is added to a SharePoint List or Library.  This is easy to achieve using the When a File is created or When an Item is created triggers.


    However, what if you want to run the same Flow for triggers generated from multiple lists?  One option is to have one flow per list that will call the common flow.  Doable, but requires a lot of extra work and maintenance if the logic changes in any way.  What would simplify this process is an event that is easy to set up and configure on any list and library.

    Using an Email-based trigger

    In this example, I'm leveraging the Alerts mechanism that can be set up on each list or library to trigger a common flow.  The logic behind this approach is that when SharePoint sends an alert email on an item or document, it includes a link to the item and item ID, which are the key pieces of information you need.

    Preconditions

    To do so requires several things:

    1. Alert emails should be sent immediately
    2. Ideally, there should be a service account for all the flows
    3. The Alert Title needs to be consistent across all the lists for the flow


    Flow breakdown

    The Flow, depicted below, is broken down into 8 steps.


    The first step is the trigger, which acts when an email arrives for the user account that is running the Flow.  Alert emails are generated from the no-reply@sharepointonline.com system account, so you can configure the trigger to only act on messages coming from that address.  Also, make sure to configure the Subject Filter to match the Alert Title you set on your list.

    Steps 2 through 6 are mostly a text parsing exercise to extract the site URL, list URL, and item ID.  They are all embedded in the email body.  Step 3 helps clean up all the HTML tags, thus making it a bit easier to parse out the elements.

    Once you have your item ID and link, you're all set to get the item (step 7) and do with it as you wish.  Optionally in step 8, you can delete the alert email if you have no other use for it.
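The parsing in steps 2 through 6 boils down to pulling the item link and item ID out of the alert body.  As a rough Python sketch (the alert body format shown here is illustrative, not the exact SharePoint alert markup):

```python
import re

def extract_item_info(alert_body: str):
    # Hypothetical alert-body pattern: SharePoint alert emails include a link
    # to the item with its ID in the query string (e.g. ...DispForm.aspx?ID=42).
    match = re.search(r"(https://\S+/DispForm\.aspx)\?ID=(\d+)", alert_body)
    if not match:
        return None
    return {"item_url": match.group(1), "item_id": int(match.group(2))}

body = "View item: https://contoso.sharepoint.com/Lists/Docs/DispForm.aspx?ID=42"
print(extract_item_info(body))
# {'item_url': 'https://contoso.sharepoint.com/Lists/Docs/DispForm.aspx', 'item_id': 42}
```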

    Workaround to update SharePoint document metadata custom columns

    Microsoft has been making great strides in improving PowerApps functionality over the past year.  In late 2017, the product team announced the ability to set values for Managed Metadata (aka Taxonomy) columns in PowerApps and Flow.  This is great if you are using managed metadata fields in a SharePoint List, but does not work for SharePoint Document Libraries according to Connect to SharePoint from PowerApps.

    So how do you update managed metadata in a document library?  The key observation is that wherever you use the same managed metadata, the same term set, terms, and associated globally unique identifiers (GUIDs) are reused.

    The workaround is to access the managed metadata separately and combine it with the document library data.  One way to achieve this is to create a SharePoint list whose sole purpose is to host these metadata fields.  Then, when building the PowerApp, you can pull in all the data you need from the document library and the metadata-hosting list, and pass it to SharePoint, a Flow, or any other downstream action to update the document library's managed metadata.
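    The metadata-hosting list gives you the term labels and GUIDs; to actually write a managed metadata value onto a document library item, one option is the Send HTTP request to SharePoint action calling the REST ValidateUpdateListItem method, which accepts taxonomy values as "Label|GUID" strings.  A minimal sketch of building that request body (the field name, term label, and GUID below are placeholders, not values from any real term store):

```python
import json

def build_taxonomy_update(field_name: str, term_label: str, term_guid: str) -> dict:
    """Build a ValidateUpdateListItem request body that sets a managed
    metadata column. FieldValue uses the 'Label|GUID;' format that
    taxonomy fields expect."""
    return {
        "formValues": [
            {
                "FieldName": field_name,
                "FieldValue": f"{term_label}|{term_guid};",
            }
        ],
        "bNewDocumentUpdate": True,  # update without creating a new version
    }

# Placeholder values for illustration only
body = build_taxonomy_update(
    "Department", "Finance", "00000000-0000-0000-0000-000000000000"
)
print(json.dumps(body))
```

    In Flow, this JSON would go in the Body of a Send HTTP request to SharePoint action posting to the item's ValidateUpdateListItem endpoint.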




    Simple enough.

    How to easily store email attachments in SharePoint Online and extract data using Parserr

    One of my pastimes is renovating my home.  I find it quite relaxing to step away from the screen and engage in an activity that lets me use my mind and hands in a creative way.  This is all good, except when it comes time to submit all my receipts for my annual tax return.  Dealing with all those receipts is a daunting task.

    A while back, I created my own personal digital records solution on SharePoint Online, where I track receipts, invoices, tax documents, and general statements separately (I'll be discussing the details of this solution in another article shortly).  Each of these documents is defined by a different content type with its own metadata site columns.  For this example, I'm interested in the Total amount paid on the receipt.  As I do most of my shopping at Home Depot, and there is an option to have the receipt sent to me electronically, I figured: why not automate adding the document to my records library?  And so I did, using Flow and Parserr.  The process is broken down into two separate flows, as described below.

    Flow 1:  Copy the document into the Records Library

    In my case, I’m using Gmail to receive these emails.  The emails that are sent from Home Depot have two common identifiers:

    • The email title is “Your Electronic Receipt”
    • The sender is HomeDepotReceipt@homedepot.com

    Using this information, I’m able to set up a trigger that will kick-off the flow whenever I completed a purchase at Home Depot.


    Next, I extract the attachment and save it to the records library.  It's important to capture the ID of the file created in SharePoint, as that ID will be used a couple of times to update the document.  The first time I use the ID is to set the content type to "Record".



    Once that’s done, I’m sending the same email I got from Home Depot to Parserr.  I’m including the file ID in the first line of the body of the email.


    Flow 2:  Extract the Total from the Receipt and update the File’s metadata

    Before using Parserr to extract the total amount, I had to set up the service using the instructions in Extracting Text from PDF Files.  Once that was done, all I needed to do was register the Parserr service with Microsoft Flow, set up the "When an email is received" trigger, and then update the file properties with the correct amount.  As you can see in the image below, I'm explicitly converting the document ID to an integer and the amount to a floating-point number to ensure they don't cause errors.
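    Those conversions, which Flow performs with its int() and float() expression functions, can be sketched in Python.  The dollar-sign stripping is an assumption about the text Parserr returns from the receipt; adjust it to your own parsed output:

```python
def coerce_parsed_values(doc_id_text: str, amount_text: str) -> tuple:
    """Coerce the strings a parser returns into the types SharePoint
    expects: an integer document ID and a floating-point amount."""
    doc_id = int(doc_id_text.strip())
    amount = float(amount_text.strip().lstrip("$"))  # tolerate a leading $
    return doc_id, amount

print(coerce_parsed_values(" 42 ", "$129.99"))  # (42, 129.99)
```

    Passing the raw strings straight into typed SharePoint columns is a common source of flow failures, which is why the explicit conversion is worth the extra step.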

    Further Enhancements

    This solution can help you cut down the time spent processing many receipts that come from the same source.  As there may be errors in processing (for example, the OCR picking up the wrong text), it would be good to add an action that fires off an email with a link to the document if the conversion fails.

    Of course, you can set up this type of service for any source of digital receipts and even extract multiple pieces of information in a single flow.

    Employee Check-in Simplified using Power Automate

    Company employees who move between offices are often faced with a tedious process of having to check-in at the security desk to get a tempor...