I was looking for a reason to play with some animation libraries. I ended up creating a simple call-to-action button that will “do stuff” and redirect the visitor to another page. After I got the animation working, I added some additional configurable settings. There’s still room for more settings, which I may get to, but for now, it’s out on GitHub.
So it’s just a simple button, with the following config settings:
Button Text Color
Toggle to open in new window
The solution was created using Aphrodite and React-Animations. React-Animations implements all of the animations available in Animate.css. Animate.css allows you to easily add simple animations like bouncing, flashing, shaking, fading, etc. The end result looks like the following.
This post is different for me as I typically write how-to articles and announcements for events that I’m either organizing or contributing to. This year was a little different for me as all of the virtual events that happened this year allowed me to contribute more to the community in ways that I haven’t been able to in the past. So I figured, I’d put this post together to recap what I’ve done this year and it can be a nice reminder for me down the road.
I’m not new to organizing events, so things weren’t much different this year than last, except that last year I had to get up early, get on a bus, travel to Times Square, present for a few hours, and make the return trip. This year, I rolled out of bed 45 minutes before the event and logged on. So what have I done?
Tri-State Office 365 User Group – I’ve been running this group for some time now, and we’ve barely skipped a beat. We missed one month due to the pandemic and not having a venue, but, along with the rest of the world, we managed to pivot and keep on rolling. The user group has grown to 529 members.
Global Power Platform Bootcamp – This one required very little from me. If you’ve attended some of my other events, you’re familiar with Manpreet Singh; he was running the show here while I backed him up and helped promote it. This event was pre-pandemic and the last time I saw a group of people in person at a conference this year. It drew roughly 75 attendees.
M365 Philly Virtual – I rebooted our local SharePoint Saturday event along with Manpreet and Mike Mukalian last year, and we were already planning to bring it back this year when… well, you know. A lot of thought went into what to do about the event this year, and we didn’t want to lose momentum. We were later joined by Tom Daly and we went virtual. The big difference, other than it being virtual, was that we moved it off Saturday and made it a 3-day event from noon to 4pm Eastern. The idea was to have a session going on during lunch for every time zone in the US. I wrote a post about what happened behind the scenes leading up to the event, including some of the things that happened during the event. M365 Philly will be the name moving forward, and plans for the next one in 2021 are already underway; it will also be virtual. I also have some plans for the next in-person event once things open up again. We had over 800 attendees throughout the 3 days.
Global Microsoft 365 Developer Bootcamp (times 3) – We recently completed 3 of these events and focused on a different topic for each. Originally intended to be hands-on sessions capped at 30 attendees, we shifted to more of a presentation with brief periods of hands-on work. Last year, we did this in 3 locations – Malvern, PA, Iselin, NJ, and Times Square – and presented the same topic at all 3, but given that it was likely we’d have people attend multiple events this year, we went with different topics for each. On day 1, Mike Mukalian walked the audience through Teams governance and showed how to build a process to automate the creation of a Team. On day 2, Jim Novak, Manpreet, and I covered the Power Platform and how you can build an end-to-end solution using Power Apps, Power Automate, and Power BI. We also sprinkled in some AI Builder and Power Virtual Agents at the end. Each event drew roughly 80 attendees.
Speaking Engagements and Producer Gigs
I mainly do my speaking at the user group, but this year presented a unique opportunity, so I got to do a lot more of that. Most of my talks this year were AI focused. I spoke a few times at the user group about Azure Cognitive Services and how you can incorporate it into your day-to-day work. I refined the talk and then went on a (virtual) global tour! My first speaking engagement at a large conference was at the Lightup Virtual Conference back in July. That event was huge: 120 speakers, 5 tracks in English, 2 tracks in Spanish, 24 hours straight. In addition to speaking at that event, I was also a producer for some of the sessions. The event partnered with UNICEF, and proceeds went toward children affected by COVID-19.
Following the Lightup Virtual Conference was M365 Philly Virtual where I presented my Intro to AI with Azure and Office 365 talk and I produced all of the sessions in the Eagles track on all 3 days. After that, I repeated the talk at several other events including the SharePoint User Group of DC, the Global AI on Tour – NYC, Collab365 Global Con 3, M365 Saturday Gurgaon, and M365 Chicago. In between all of those events, I also presented at Global Con 4. My session was titled “How to Extract data from files using AI Builder” where I used AI Builder to pull text from invoices and store them as metadata in a SharePoint document library. That one was cool. I put a timer on the screen and showed an unedited video of me creating the AI model and the Flow that used it.
I then wrapped the year up with the Global Microsoft 365 Developer Bootcamp mentioned above and just this week was our final user group meeting of the year where I repeated the AI Builder talk. We end the user group each year with an annual Lightning Talk. We had 6 speakers and 7 presentations in the 2 hours. It’s a nice way to end the year and we get to see a bunch of cool topics or demos and it gives new speakers an opportunity without having to commit to a full 45+ minute presentation.
In addition to the speaking engagements, I wanted to be more consistent with my writing. On my personal blog, I’ve written 14 posts and have quite a few articles lined up that I’m working on. A few may come out this month if I can get in gear. I’m taking some time off, so I might spend some of it on writing. My blog has always been something that I just like to do. I’ve had a blog for 12 years now. My numbers aren’t stellar, but it’s still cool to see that 17k+ visitors have come through this year so far.
After M365 Philly Virtual, I set up a YouTube channel for SPS Philly and shared the presentations from the 3 days. Next year, I’ll look for ways to post more content on there outside of the conference, but I likely won’t share all of the content like we did this year. None of the speakers opted out of sharing their content, but seeing how many people, myself included, repeated their talks at other conferences, I wouldn’t want to take away from those conferences.
This year has been a weird one but also one with lots of opportunity. I’m grateful to all of the people who have allowed me to present at their events, and to all the people who spoke and produced at events that I organized. I’m grateful to the attendees and sponsors who supported events that I organized and sessions that I presented. I’m grateful to the people who helped organize events alongside me. I don’t know how I can top this year, but I look forward to trying next year.
Every December, we at the Tri-State Office 365 User Group like to end the year with Lightning talks. Lightning Talks are 2 hours of rapid presentations. Each presentation is roughly 15 minutes long and they’re typically demos of cool things that people have worked on. In the past, we’ve seen demos showing how to provision sites using templates in SharePoint, SPFx development tips and tricks, a Power App that was used to catalog personal media, and other cool demos.
This is a great time to get involved in the community. If you have a cool demo to show, if you have an interesting use case, feel free to reach out and submit a topic.
If you are someone who has been interested in presenting but hasn’t quite taken that step, what better way to do it than a short demo in a supportive environment? So if you are interested in participating on December 8, 2020 and have a topic that you want to share, join the meetup and use the “contact the organizer” link.
Do you still have an old, empty, or nearly empty Team site as the homepage that people visit when they go to SharePoint? Has your organization attempted to use SharePoint as something other than a dumping ground for files but can’t quite seem to get there? There’s probably some months- or years-old company news on the homepage, a few links that most people have bookmarked, maybe a calendar with no recent events. In most cases, organizations don’t put in the proper time to plan how they want to use SharePoint, and there isn’t clear ownership. Eventually, SharePoint provides little value to your business because you haven’t properly tended to it. The overall fix is a topic for another conversation (or rant), but in this post, I’m going to offer some suggestions for cleaning up your environment. Ultimately, governance, training, and adoption planning will need to be addressed, but here are some features that will help you get your house in order.
1. SharePoint Home Sites: Your Landing Page
The landing page is the first impression. If it’s largely blank, has stale content, a handful of links, an empty calendar, a navigation menu with links that aren’t relevant/useful to most employees, how can you expect people to visit it? You probably don’t know where to even begin and for that, I suggest the SharePoint look book for inspiration or just use one of their templates.
For this example, I chose to provision a new site using “The Landing” template (2nd column, 2nd row in the image above) and my url is https://<tenant>/sites/TheLanding. It looks something like the image below. Instantly better than a typical, neglected homepage, right?
The great thing about these templates is that they take a lot of the planning and second-guessing out of your hands. It’s not a completely set-and-forget solution, though. If you don’t use Yammer, then you don’t really want an empty Yammer feed on the page, right? The templates are a great starting point, but you need someone to own and maintain the site. It’s great to have content automatically surface for the visitor, like the My Recommendations section, frequent sites, and recent documents, but someone needs to provide news or material that makes it worth the visit. One suggestion would be to stop sending HR communications or messages from the CxO via email and instead create a news article that people can read or look up. I know I would be grateful for less email clutter!
Setting a New Home Site
Once the above is ready to go live, you can designate it as your new home site by using the following PowerShell command:
Set-SPOHomeSite -HomeSiteUrl <your new site url>
NOTE: This does NOT replace your root site
So if it doesn’t replace the root site, what does setting this site as a home site do for us? I’m glad you asked.
For starters, the site’s search scope changes from searching only itself to searching all sites.
The site is also set as an Organization News Site, which is a site that is flagged as authoritative. The result is that your news articles get highlighted. In this example, you can see an article published on The Landing and another on the Support site, but only The Landing’s article is highlighted because only The Landing is set as an authoritative site.
If you want to set other sites as Organization News Sites, you can use the following PowerShell command.
Set-SPOOrgNewsSite -OrgNewsSiteUrl <site url>
Another thing that happens when you set a Home Site can be seen in the mobile app: the home button will take you to the new Home Site (even though you don’t see the same change in the browser).
Swapping Root Sites
I mentioned that changing the Home Site doesn’t change your root site. If you visit https://<tenant>.sharepoint.com, whatever was there before will still be there now but it’s simple to change that these days. Simply go to your Admin Center, select your current root site from the list of Active Sites, click the Replace site button, and provide the URL for the new site collection.
It’s actually a good thing that it doesn’t change right away: it allows you to test things out before fully releasing, although it does immediately change the experience in the mobile app. Let’s face it, if your site is stale and you’re looking for a quick way to bring value to it, your users are probably not using the app either.
2. Navigation

Another area that people find challenging is navigation. The problem I typically see is that people either find it difficult to figure out what to provide links for, or they want to provide links for everything. When I talk to customers, I tend to treat the intranet and the collaboration areas as separate things, where the intranet is more of a traditional communication vehicle with few people updating content, and collaboration areas are more ad hoc and less uniform than the other sites. There’s usually no good reason for a traditional intranet to link to a project site; you end up cluttering your navigation. Microsoft Teams is making that conversation easier to have because it seems to make that distinction clearer.
If you treat your intranet as a somewhat locked down communication tool, then you reduce the number of potential sites/pages that you want to navigate to. The natural thing that most do is create a link for each department but who says that’s what you need to do? Creating department links on an intranet implies that you will have similar communication sites for each department and from my experience, those sites are left to the departments to manage and most if not all won’t maintain them. If you insist on having links for departments, keep it simple. In fact, maybe don’t even create a site. Instead, just create a site page that your communications team maintains. Departments can submit articles to be posted but communications can approve them.
In my opinion, if you have a link in the top nav, it should take you to a site that looks and feels similar. Links in the body of the page, however, feel more like links to “other” resources. With that said, I would use the Sites web part to show links to sites that the current user frequents.
Don’t Clutter your Nav with Collaboration Sites
Expanding on the previous thoughts on intranet vs. collaboration and what they should look like: places where people go to author content for projects or small teams are spaces that I like to keep separate from the intranet. Project sites where only a select few need access don’t, for me, need a dedicated link from the intranet. They can show up in the Sites web part like the screenshot above, but if someone is maintaining a list of links in the top nav, or a mega menu, or anywhere for that matter, I wouldn’t include them. You don’t want to have to manually clean up a bunch of stale links when projects end.
I also feel that these types of sites shouldn’t have to follow the corporate standards for branding. Departments or project teams should be able to manage their own content however they feel works best. If one department likes to use search and metadata while another department prefers to use a hierarchy of folders, they should be allowed to do so. That’s the common comparison but there are others. Some businesses like to impose a specific folder structure in a library and it may not work well with a team so they just create another library and set it up their way anyway.
What if you have dozens of sites that have loose relationships with each other like project sites for a specific customer or even project sites related to a specific department? In those cases, I would consider leveraging Hub Sites.
3. Hub Sites
Hub Sites is a feature that was released a little over 2 years ago. Hub Sites are a way to bolt sites together, allowing those sites to share navigation and a theme. A hub also scopes your search to return results from sites associated with it, and you can bubble up content from the associated sites to the hub. In the days of subsites, if you decided to move a subsite, it was a bit of a pain because you had to make sure permissions and content came over. Now, you pick a hub that you want to join and you’re done. If you go through department reorgs or acquire another business, hub sites can simplify the grouping of those sites.
Register a Hub Site
In order to group these sites, you just need to select the site in the Admin Center and register it as a Hub Site.
Associate with a Hub Site
Once a site is registered, you can associate related sites to that Hub via the Site Information for the site being associated.
4. Audience Targeting
Another useful feature for organizing your site is audience targeting which allows you to show content to a select group. This is not a security feature but I used to see businesses use it that way. It is simply a way to ensure that relevant content is being displayed.
Audience targeting can be enabled on navigation links (image below), site pages, and certain web parts like the News and Highlighted Content web parts. You do have to be careful when using this feature: if you tag the wrong group or use it too much, it will cause confusion, as site visitors may not see content they’re expecting, or may see content that wasn’t meant for them.
While the above is not a comprehensive list, I think they’re a good starting point. There are other features that are super helpful if you have the organization and team to get you there but can sometimes seem daunting for businesses that are looking for quick wins.
A really important feature is Content Types, which are a way to classify the documents you’re working with. By default, the Content Type used in a library is “Document.” You can create additional content types, each with its own columns, for a particular type of document. An example could be an Invoice with columns for Service Offered, Invoice Number, and Invoice Date, which can be used to search or filter for specific files. You can then use those columns to create different views or generate better search results.
Ask anyone who’s worked with SharePoint, and content types are one thing they’ll always tell you to use. It’s also something that businesses seem to struggle with or ignore. You get into questions like: what content do we commonly use, what metadata is important, what metadata can be shared, should we use the term store, who maintains the terms, how do we know when to use them, and so on. If that’s you, know that in the long run it will help you provide value, but if you need a quick win, there’s a chance it won’t be front and center for you.
Another important feature is Search. I didn’t place it in the top 4 for a couple of reasons. For starters, it can become a project of its own: ensuring that results are relevant and that content types are created to help surface the right content. You get search out of the box to begin with, so it’s not that you won’t have search; your results just might not be great if you don’t take the time to plan and organize your site.
I definitely think you should look at ways to improve your search but what this article has been driving at is putting the most important things front and center on that home page and providing content (news) in a central place to jump start your intranet. So when I want to read about what’s new in the organization, who the new hires are, who got promoted, or resources/files that are frequently needed, I should be able to go to our intranet to find that information. If I go to a project site for Client ABC and their related projects are joined to a hub, I should be able to work my way to those sites and find the relevant content. In those cases, search will already be narrowed to those sites.
I want to make clear that the above items are tools that I think can get you to show something respectable. These are my thoughts on a bare bones solution for places that have struggled to get started. While there are other features that can also help, the ones listed here are a good start to get your site back in order. It doesn’t change the fact that you need to plan, learn, and get people on board to make sure it continues to succeed but sometimes you need a jump start to get you in that direction. In other words, you can build an MVP intranet but you need to plan for incremental improvements.
On November 13th, I’ll be speaking at M365 Chicago. If you haven’t caught my Intro to AI with Azure and Office 365 session, come and check it out. #M365Chicago is looking to be a huge event with over 80 speakers! If there’s a topic of interest to you, there’s no way it’s not being covered at this event.
If you want to catch my session, it’s currently scheduled for 2pm, but the schedule hasn’t been finalized. Here’s what you can expect to see in my session:
For those of you who aren’t familiar with it, the Global Microsoft 365 Developer Bootcamp is a free, one-day training event led by Microsoft MVPs with the support of Microsoft and local community leaders. On November 7, 14, and 21, the Pennsylvania, New Jersey and New York Global Microsoft 365 Developer Bootcamps are back. The events will be virtual and tickets are limited. On each day, we will be doing hands on sessions.
On November 7th, we will be doing a workshop on automating and securing the provisioning of Microsoft Teams. You will learn about Teams Governance, automation, guest access, approvals, and more.
November 14th is Power Platform day. On this day, you will learn about the different apps that make up the Power Platform, and we will guide you through installing one of the sample apps. Additionally, we will see a demo of Power Virtual Agents for chatbot creation, and we’ll show you how to use AI to automatically extract text from PDF files.
November 21st will be all about the SharePoint Framework. On this day, we will walk you through what the SharePoint Framework is and how to create your first web part.
We will be publishing the details on Eventbrite soon so stay tuned! Since the events are hands on, the team will be available to answer questions and help troubleshoot issues and as a result, tickets are limited for this virtual event.
If you’re a local business that specializes in the technologies above and are interested in sponsoring the event, email email@example.com for more information.
On October 20th, I’ll be doing a short, 15 minute presentation on AI Builder at Collab365 GlobalCon4. The Collab365 team sure knows how to organize an event and I’m excited to be involved. If you caught my session, “Intro to AI with Azure and Office 365” at GlobalCon3, you saw me walk through several AI based solutions. The goal was to show how simple it can be to incorporate AI using Azure Cognitive Services without having to understand machine learning.
At GlobalCon3, I walked the audience through several examples including:
It was wall to wall demos but they were surface level demos. I showed the construction but didn’t spend too much time on it. This time, I’m doing a turbo session and showing EXACTLY how simple these solutions can be. I’ll be recreating the AI Builder demo where I extracted text from invoices and updated the file metadata with that extracted text and I will be doing all of that in 15 minutes (I hope). No code, no machine learning. Just AI Builder, a SharePoint document library, and Power Automate. So join me at GlobalCon4, and be sure to check out the other awesome presentations at the event.
A few nights ago, I sat in on a PnP Sharing is Caring call with David Warner II and Hugo Bernier. The topic was using Node Version Manager (NVM) to handle different node versions on the same machine. I’m on a Windows machine so we talked about Node Version Manager for Windows since the original NVM was not written for Windows. The thing you find out right away when trying to use NVM for Windows is that it’s a separate project from NVM and it doesn’t have all the same features. In this post, I’ll talk about the 1 missing feature that I really wanted to use, and what I did to get it. The hint is in the title of this post.
For SPFx development, I use Docker for isolation. I had issues setting up the latest version of SPFx in Docker, but NVM seemed to be a great option. In our conversation, we talked about the idea of isolating by customer. As an example, let’s say I have a customer, Contoso, who wants to use Node v14.4.0 and Gulp.js, while my other customer, Fabrikam, wants to use Node v14.4.0 and Grunt.js. One way to separate the two would be to dedicate a minor version to each customer. For example, Contoso would run Node v14.4.0 while Fabrikam runs on Node v14.3.0. But maybe you’re inheriting a project that runs on a specific version of Node; in that case, the above isn’t an option, yet you still want to keep the customer-specific modules separate.
Note: v14.4.0 is used just for the purpose of setting up NVM and is not a supported version for SPFx development. For a list of Node versions that are compatible with SPFx, take a look at the SPFx Compatibility Matrix.
I looked into the available commands for NVM and NVM for Windows and immediately noticed that NVM has more. I saw that they had an option for creating an Alias which NVM for Windows did not. Alias allows you to refer to a version by a name of your choice. What I want to be able to do is name each version by customer. Examples:
nvm use Contoso
nvm use Fabrikam
On the NVM git repo, you’ll see the following info about the tools:
nvm is a version manager for node.js, designed to be installed per-user, and invoked per-shell. nvm works on any POSIX-compliant shell (sh, dash, ksh, zsh, bash), in particular on these platforms: unix, macOS, and windows WSL.
I decided to enable Windows Subsystem for Linux (WSL) which you can do through the Windows Features. This should allow me to run the original NVM tool inside of Windows without having to use the limited Windows version of the tool.
Next, you need a Linux environment so I went to the Windows Store and did a search for Linux.
I went with Ubuntu and installed it. It was roughly half a gig in size. After installing Ubuntu, you’ll need to restart your machine.
Once the above tools are installed, I opened Windows Terminal but you can use PowerShell or your command prompt. The next steps should be the same on each.
Once your tool of choice opens, the first command you want to enter is Ubuntu to get you into the environment.
You’ll need to install curl in order to perform the upcoming nvm install.
sudo apt install curl
Next, we need to install NVM with the following command.
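The install command comes from the nvm README: it downloads the install script and pipes it to bash. The version number below is an example (v0.37.2 was roughly current at the time of writing); check the nvm GitHub repo for the latest release, and as with any curl-pipe-bash install, review the script before running it.

```shell
# Download and run the nvm install script (version number is an
# example -- check the nvm GitHub repo for the latest release).
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.2/install.sh | bash
```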
Once this runs, you’ll want to restart your terminal; otherwise, you’ll see a message telling you that NVM wasn’t found when you try to use the nvm command.
Once I had the above setup completed, I seemed to be good to go in terms of using NVM on my Windows machine. So the next thing I wanted to see was whether I could have 2 copies of Node v14.4.0 with different modules.
When you install a node version (v14.4.0 in my case), you simply use the following command:
nvm install v14.4.0
When you create an alias, NVM creates a file in an alias directory with the name of the alias, and inside the file is just a version number. It’s basically mapping your alias name to the version folder.
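That mapping is easy to see with a quick simulation. The sketch below uses a throwaway directory instead of the real ~/.nvm, so the paths are illustrative only:

```shell
# Simulate nvm's alias mechanism in a scratch directory
# (illustrative only; the real files live under ~/.nvm).
NVM_SIM=$(mktemp -d)
mkdir -p "$NVM_SIM/versions/node/v14.4.0" "$NVM_SIM/alias"

# "nvm alias Contoso 14.4.0" effectively writes the version number
# into a file named after the alias:
echo "14.4.0" > "$NVM_SIM/alias/Contoso"

# Resolving the alias is just reading that file back:
cat "$NVM_SIM/alias/Contoso"
```

The last line prints 14.4.0, which nvm then maps to the v14.4.0 folder under versions/node.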
If you want to see where those files are, you can type “explorer.exe .” in your command line (note the dot after exe) and Windows Explorer will open in the directory that nvm is installed.
If you go into the .nvm directory, you’ll see 2 folders that we care about right now.
Inside versions, you’ll see a node directory, and inside that, you’ll see each version of Node that you installed. Inside those directories, you’ll see the modules that you installed.
Inside the alias directory, you’ll see a file for each alias that you create.
Once I install Node v14.4.0, I will have a directory called v14.4.0 under /versions/node. I then created my first alias using the following command which tells nvm that we want to refer to the v14.4.0 directory as Contoso:
nvm alias Contoso 14.4.0
Next, I renamed the v14.4.0 directory to “v14.4.0 Contoso”. Then you need to update the Contoso alias file since you changed the directory: go to the alias directory, open the Contoso file, and you should see one line with your version number (in my case, “14.4.0”). I added “ Contoso” to the end of that line, which tells nvm that when I use the Contoso alias (which matches my file name), it should load whatever is in the folder called “v14.4.0 Contoso”.
Now we need to do the same steps for Fabrikam. Repeat the above steps which are:
1. Install the node version so that the folder is created
nvm install v14.4.0
2. Create your alias
nvm alias Fabrikam 14.4.0
3. Rename the “v14.4.0” folder found at /.nvm/versions/node/ to “v14.4.0 Fabrikam”.
4. Go to /.nvm/alias/ and open the new Fabrikam file. Edit the single line inside the file to include Fabrikam at the end so that it matches the name in step 3.
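The whole sequence can be sketched as a small script. This is a simulation in a scratch directory rather than the real ~/.nvm (the nvm and npm installs are replaced with mkdir so the sketch runs anywhere, but the directory and alias names mirror the steps above):

```shell
# Simulate the per-customer layout described above in a scratch
# directory; the real files live under ~/.nvm.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/alias"

# Two renamed copies of the same Node version, each with its own
# modules (gulp for Contoso, grunt for Fabrikam):
mkdir -p "$ROOT/versions/node/v14.4.0 Contoso/lib/node_modules/gulp"
mkdir -p "$ROOT/versions/node/v14.4.0 Fabrikam/lib/node_modules/grunt"

# Each alias file points at its renamed directory:
echo "14.4.0 Contoso"  > "$ROOT/alias/Contoso"
echo "14.4.0 Fabrikam" > "$ROOT/alias/Fabrikam"

# Resolving an alias and listing its modules shows the isolation:
for customer in Contoso Fabrikam; do
  version=$(cat "$ROOT/alias/$customer")
  echo "$customer -> v$version:"
  ls "$ROOT/versions/node/v$version/lib/node_modules"
done
```

Each alias resolves to its own renamed folder, so the two "copies" of v14.4.0 keep completely separate module lists.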
Your renamed folders should look like the following:
Your alias folder should look like this:
The Test Drive
Before we install any modules, let’s take a look at each alias and what they have inside. As you can see, we can check each alias to see what modules it has, and all we have is npm in both to start.
I’m going to switch back to Contoso and install gulp.
I then switch to Fabrikam and install grunt.
So now if I go back to using Contoso and list out its modules, we’ll see NPM and Gulp.
Switching to Fabrikam, I can see the list of modules includes NPM and Grunt.
My Linux skills aren’t too hot these days so I did have some issues. I installed Yeoman and when trying to run the Yo command, I would get an EACCES error which meant that I didn’t have the right permissions to a particular config folder. I’m using the root account but had to run the following command to get past the issue:
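As described below, the fix was a chmod granting the user read, write, and execute on that folder. A sketch of it (the configstore path is the one my EACCES error pointed at; use whatever path your error reports):

```shell
# The folder already exists when you hit the EACCES error; mkdir -p
# here is only so this snippet is self-contained.
mkdir -p ~/.config/configstore

# Grant the current user read, write, and execute on the
# configstore folder that yo complained about:
chmod -R u+rwx ~/.config/configstore
```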
Basically we changed the permissions in that configstore folder to allow read, write, and execute for the user. I also had permissions issues in the npm folder and the folder where I was using yeoman so I ran the following commands to get past those.
The 1st line was a fix copied directly from error messages produced by the yeoman generator. The 2nd line was to remove a permissions issue thrown by yeoman in the folder where it was trying to create the project. I’m probably not following security best practices here, but these steps got me up and running.
In the end, I was able to get two instances of a Node version using different sets of modules. If you are on Windows and can isolate by using different minor versions, or if you don’t have an issue with having all your modules installed together, then NVM for Windows is probably all you will need. If you need to use the same version of Node for different projects that must be kept separate, then the few steps above should get you there; using Windows Subsystem for Linux allows you to install the original NVM so that you can leverage its alias feature.
I wanted to provide a follow up to M365 Philly Virtual. At the time of the event, I wasn’t sure how we were going to share the recordings. A YouTube channel was created for SPS Philly which is the group that organizes the M365 Philly (formerly SharePoint Saturday Philly) as well as the Tri-State Office 365 User Group. The user group meets on the 2nd Tuesday of each month and will meet virtually for at least the remainder of 2020 and likely the first 3-6 months of 2021.
The recorded sessions take some time to edit, so it’ll be a while before they’re all released, but subscribe to the channel to get notified as the sessions come out.
I’m kicking around some ideas for content for the YouTube channel. I would prefer to release content occasionally on the channel so that it’s not only populated when we have an SPS event.
If you’re interested in following along, there are several places where you can do so.
In the previous post, Getting Started with AI Builder’s Form Processing, I walked you through how to set up a model that identifies fields and their values in a PDF. In this post, I will show you how to use that processor to extract those values and store them as metadata when the PDF is uploaded to a SharePoint document library – making the files filterable. Let’s dive right in.
Flows for AI Builder are special and do not follow the typical flow creation process. Typically, you would go to My Flows and start creating an automated flow. In this case, we want to go to Solutions and then click the New Solution button.
I’ll call this one SPL Invoice Processor. It also asks you to select a Publisher and I’m going to select the default publisher for the org. Your options are that or a CDS Default Publisher.
Once the solution is created, you’ll see it at the top of the page. Click on the link and then you can start creating your flow or other solution components. Think of SPL Invoice Processor as a container.
At the top of the page, you’re going to click on New and select Flow.
Give your Flow a name. Just to differentiate it a little, I’m going to call this flow SPL Invoice Processing Flow.
Next, you can select your trigger which in our case is going to be the “When a file is created in a folder” SharePoint trigger. In the screenshot, you can see that I’m searching for “file created” to narrow down my options below.
Once the correct trigger is selected, you can click the Create button. On the next screen, you’ll provide the URL and document library name and then click the Next Step button to select your next action.
Next, you’ll want to filter down to the AI Builder actions and select Predict from the available actions.
Predict provides a single dropdown with a list of the available models. Select your model from the dropdown. If it’s not there, you may need to go back and publish your model; if it is published but still doesn’t appear, select “Enter custom value,” which turns the dropdown into a text field where you can enter the name of your model. In my case, my model is called SPL Processor and it was in my dropdown. If you haven’t created a form processor at this point, check out my previous post where I walk through the steps needed to create one that extracts data from invoices. When you select a model, it’ll then ask you to select a document type and provide the document. Since I’m expecting PDF files, I’m going to enter “application/pdf” and provide the File Content that comes from the trigger.
We want to update the field values for the newly created/uploaded document, so I’m going to get the list of available properties before I can update them. Add the Get File Metadata using Path action (if you use Get File Metadata, you’ll get an error when you try running the flow). This will give you properties like the item ID, which is needed to update the appropriate item in the library. For the File Identifier, we’re going to select x-ms-file-path-encoded, which comes from the trigger.
Next, we are going to add the “Update file properties” action and get the details for our current file. We’re getting the item id from the previous action.
The Invoice Number, Client Name, Client Company, and Service Offered will come from the Predict action’s output.
The end result will look like the following. Note that each entry that comes from Predict is the value, so I’m providing “Invoice Number Value”, “Client Name Value”, and so on.
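Under the hood, a Power Automate flow in a solution is stored as a Logic Apps-style JSON definition, which you can see if you export the solution. A heavily simplified, illustrative sketch of how the finished flow chains together — the action names here are placeholders based on the display names above, not the exact exported schema:

```json
{
  "definition": {
    "triggers": {
      "When_a_file_is_created_in_a_folder": { "type": "OpenApiConnection" }
    },
    "actions": {
      "Predict": {
        "type": "OpenApiConnection",
        "runAfter": {}
      },
      "Get_file_metadata_using_path": {
        "type": "OpenApiConnection",
        "runAfter": { "Predict": [ "Succeeded" ] }
      },
      "Update_file_properties": {
        "type": "OpenApiConnection",
        "runAfter": { "Get_file_metadata_using_path": [ "Succeeded" ] }
      }
    }
  }
}
```

The runAfter entries are what enforce the ordering you built in the designer: metadata is fetched only after Predict succeeds, and the file properties are updated last.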
Now, it’s time to test this flow out. I’ll drop 5 invoices into the Invoices library, and the following screenshot shows my results.
One thing that you might notice above is that my invoice numbers were 3 digits, yet you’re seeing a single-digit invoice number. That’s just SharePoint stripping the leading zeros from the front of an integer. If I wanted to see 001 – 005, I could convert the Invoice Number field into a Single Line of Text field. So there you have it. It’s pretty simple to automate the extraction of data from PDF files using AI Builder and Power Automate. This example was just a simple data extraction solution, but you can do so much more. One thought that comes to mind is pulling the invoice total and, if it’s above a certain number, kicking off an approval process or notification. You can do the same if the number is too low. Maybe you have different steps depending on the type of service you’re offering? Once you can extract what you need, you can build all kinds of solutions based on that data. I encourage you to sign up for a free trial. It’s a pretty cool service.