Office 365, the Microsoft Bot Framework, & Cognitive Services: 50,000 Foot View

Over the summer, I started working with the Microsoft Bot Framework for three reasons:

  1. Bots are a cool technology that has become mainstream
  2. It provides awesome new possibilities in terms of productivity and customer engagement
  3. The company that I work for is making a clear push toward AI and Machine Learning

I’ve been sitting on this blog post for a while because I didn’t want to regurgitate the information I was sourcing from. Then I landed a project that kept me busy, then I agreed to do a presentation on the topic, and finally, I ended up having to build a proof of concept with it.  NOW… I have some time to talk about it.

Building a bot is not overly complicated, but a bot is a little underwhelming if that’s all you do with it.  Integrating artificial intelligence, to me, feels like a must because it provides that next level of interaction that makes it worthwhile.  AI is what will make your bot feel like you’re not interacting with a bot… at least until it tells you that it doesn’t understand what you’re asking when you say “hello”, but we just need to account for those things.

This post will try not to be a repeat of info that is readily available.  Instead, I’ll walk you through a help desk triage bot that I built.  It’s a pure proof of concept that I wouldn’t necessarily put in a production environment, but it shows you what’s possible.

The bot, which I named ANA, is made up of 3 major components.

  • First, (and obviously) is the Microsoft Bot Framework, which handles the messaging.
  • Second, I leveraged Microsoft’s Cognitive Services (specifically the Language Understanding Intelligent Service or LUIS) to provide a simple level of interaction.
  • Third, I used the Microsoft Graph to further integrate it into the Office 365 ecosystem.

Setting up ANA

I’ll go over the steps I took to set my bot up, but there are a few other resources available for getting started, so I won’t do my normal step-by-step.

First, you can visit the Bot Framework Documentation site, which has plenty of good information on getting started.  Really check this out; the bot framework isn’t a simple message handling framework.  It has some cool features, like the ability to enter “3 days from now” and have it parse that text into a date value equivalent to 3 days after the current date.

Another source that I saw recently was by @zimmergren (Tobias Zimmergren), who recently did a fairly comprehensive post on building a bot via the Azure Bot Service. I used the steps for creating the bot using the .NET SDK.  I haven’t tried the Azure Bot Service steps yet, so I can’t recommend one option over the other.


In order to create the bot, you will need Visual Studio 2017.  I tried using VS2015 (because that’s what I had at the time), but the Bot Application template wasn’t available for it.  The NuGet package can be found here, and here is the Bot Template for Visual Studio.

If you need anything else, you’ll likely find the relevant resources in the Start Building Bots page.

You’ll also want to install the BotFramework-Emulator so that you can connect to your bot without having to deploy.  For information about the emulator, visit the Debug bots with the Bot Framework Emulator page.  You can find the emulator setup file on GitHub.

The Project Setup

Create a new Bot Application project in Visual Studio 2017


Solution Structure

When you create your new Bot Application, you’ll get 2 classes: the MessagesController and a RootDialog.  Dialogs are meant to be a way to compartmentalize your various types of communications.  Your MessagesController takes the message received from the user and determines what to do with it.  When you create a new project, it’s set up to send the message to the RootDialog, which does the simple task of counting the number of characters in your message.

public async Task<HttpResponseMessage> Post([FromBody]Activity activity)
{
    if (activity.Type == ActivityTypes.Message)
    {
        //await Conversation.SendAsync(activity, () => new Dialogs.RootDialog());
        await Conversation.SendAsync(activity, () => MakeDialog());
    }
    var response = Request.CreateResponse(HttpStatusCode.OK);
    return response;
}
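One note on the code above: the post never shows the MakeDialog() method that replaced the commented-out RootDialog line.  Based on the LuisDialog class covered later (which lazily initializes its own form delegate), a plausible sketch is simply:

```csharp
// Hypothetical sketch -- MakeDialog() isn't shown in the post, but given the
// LuisDialog class described later, it most likely just hands the
// conversation off to the LUIS-backed dialog.
private static IDialog<object> MakeDialog()
{
    return new LuisDialog();
}
```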

Your web.config file also has 3 app settings that you’ll need when it’s time to deploy.

<!-- update these with your BotId, Microsoft App Id and your Microsoft App Password-->
<add key="BotId" value="YourBotId" />
<add key="MicrosoftAppId" value="" />
<add key="MicrosoftAppPassword" value="" />

Integrating the Language Understanding Intelligent Service (LUIS)

At this point, you have what amounts to a Hello, World bot where you can see how things plug into each other.  I suggest you go over the Bot Framework documentation to get the details around creating dialogs and forms.  As you build your bot, you’ll find other things that you may want to incorporate.  For example, you may want to provide a Describe or Prompt attribute to change the way the bot asks users for input.
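As a quick illustrative sketch (this form is made up, not from ANA), FormFlow’s Prompt and Describe attributes sit on a form field like so:

```csharp
using System;
using Microsoft.Bot.Builder.FormFlow;

// Illustrative only: [Prompt] replaces the question the bot asks for this
// field, and [Describe] changes how the field is referred to in generated text.
[Serializable]
public class FeedbackForm
{
    [Describe("a short summary")]
    [Prompt("In a sentence or two, what did you think?")]
    public string Summary;
}
```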

If you want your bot to have some intelligence, you’ll want to start incorporating the Cognitive Services.  The Cognitive Services are so cool in my opinion.  They’re fairly simple to use and offer so much potential.

A very high level description of how they work: you feed a service some input (which differs depending on the service you use) and it returns a score.  That score is used to make ‘predictions’.  I know… that was maybe a bit too high level.  Let’s take a look at how I used it.
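To make that concrete, here’s a trimmed-down example of the kind of JSON a LUIS endpoint returns for a query.  The intent names are the ones used in this bot; the exact scores are made up for illustration.

```json
{
  "query": "i need to submit a help desk ticket",
  "topScoringIntent": {
    "intent": "HelpDeskTicket",
    "score": 0.97
  },
  "intents": [
    { "intent": "HelpDeskTicket", "score": 0.97 },
    { "intent": "Greeting", "score": 0.02 },
    { "intent": "None", "score": 0.01 }
  ],
  "entities": []
}
```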

Ana has a few dialogs.  Some that I incorporated from demos, others that I created myself once I got the hang of it.  Two of those dialogs are the Greeting dialog and the HelpDesk dialog.

The Greeting dialog will receive a message like ‘hi’ or ‘hello’ and respond with a greeting of its own.  The Help Desk dialog was one that I developed once I got the hang of things.  Its job is to ask you questions about the kind of help desk support you require, then take that information and create a task in Planner.  Again, we’re living in proof-of-concept land, so this wasn’t intended for practical application.

Given that my solution accepts multiple dialogs, or multiple types of conversations (a greeting and a help desk request), I need to be able to differentiate “hello” from “I need to submit a help desk ticket” and any variation of that.  This is where the Cognitive Services come in!

In the image below, you’ll see the list of Intents.  So when someone communicates with my bot, my bot will expect the intention to be a Greeting, a Help Desk request, or None, which is where unknown requests go.  When that one is hit, your bot will throw its hands up… if it had hands.

LUIS Intents

Inside an intent is what is called an utterance.  An utterance is a sentence associated with the intent, along with the probability that the sentence matches that intent.  If we look at the Greeting intent, we can see that there are 2 utterances associated with it: ‘Hi’ has an 80% chance to be a greeting, and ‘Howdy’ has a 70% chance.

Greeting utterances

Similarly, our help desk ticket has its own list of utterances.  In this case, “I’d like to submit a help desk ticket” and “I’d like to submit a ticket” are both 100% intended to trigger my help desk ticket logic.

Help desk utterances

The Suggested Utterances tab also stores a list of utterances that the service had no idea what to do with.  You can go in there and set the intent on the utterances so that those same words/sentences can also trigger the appropriate logic.

Help desk suggested utterances

There’s more to it than just this but I highly recommend you read up on the Cognitive Services.  The site has some cool demos and there are other types of services like image recognition services and others.

LUIS and ANA Start Communicating

The following code starts with a couple of attributes.  LuisModel takes 2 values, the LUIS app ID and the subscription key, and the placeholder text below shows you where you can get those.  The 2nd attribute is Serializable, which is needed by the bot framework.

There are 3 methods, which are also decorated with LuisIntent attributes.  The first has a blank value; this is what gets called for the None intent seen in the previous images.  The second is the Greeting intent.  This Greeting method will be called if the user enters Hi or Howdy, as we saw in the Greeting intent.  Finally, we have the Help Desk Ticket, which is called… yep, you guessed it… when the Help Desk intent is triggered.  There is also a HelpDeskTicket class that contains the logic for submitting a ticket, which uses the Microsoft Graph to create a task in Planner.

[LuisModel("<APP ID from LUIS Dashboard page>", "<key string from LUIS Publish App page>")]
[Serializable]
public class LuisDialog : LuisDialog<object>
{
    internal BuildFormDelegate<HelpDeskTicket> Ticket;

    public LuisDialog()
    {
    }

    [LuisIntent("")]
    public async Task None(IDialogContext context, LuisResult result)
    {
        await context.PostAsync("I'm sorry, I don't know what you mean.");
        context.Wait(MessageReceived);
    }

    [LuisIntent("Greeting")]
    public async Task Greeting(IDialogContext context, LuisResult result)
    {
        context.Call(new GreetingDialog(), Callback);
    }

    public async Task Callback(IDialogContext context, IAwaitable<object> result)
    {
        // resume listening for the next message once the child dialog finishes
        context.Wait(MessageReceived);
    }

    [LuisIntent("HelpDeskTicket")]
    public async Task HelpDeskTicket(IDialogContext context, LuisResult result)
    {
        if (this.Ticket is null)
            this.Ticket = Models.HelpDeskTicket.BuildForm;

        var helpDeskForm = new FormDialog<HelpDeskTicket>(new Models.HelpDeskTicket(), this.Ticket, FormOptions.PromptInStart);
        context.Call<HelpDeskTicket>(helpDeskForm, Callback);
    }
}


Data Access with the Microsoft Graph

The HelpDeskTicket class is where I kept my form logic.  It’s fairly direct: first it collects your help desk ticket information, then it authenticates, creates a Planner task object, assigns its values based on the user’s responses, and posts the task.

namespace Bots.Ana.Models
{
    public enum RequiredService
    {
        // service options not shown in this post
    }

    public enum Severity
    {
        High, Medium, Low, Emergency
    }

    public class HelpDeskTicket
    {
        // the order of the variables determines the order of the questions
        public RequiredService ServiceRequired;
        public Severity IssueSeverity;
        [Prompt("Tell me about your issue.")]
        public String Description;

        public static IForm<HelpDeskTicket> BuildForm()
        {
            // when the form is completed, run the submission logic
            OnCompletionAsyncDelegate<HelpDeskTicket> submission = async (context, state) =>
                await SubmitTicket(context, state);

            return new FormBuilder<HelpDeskTicket>()
                .Message("Welcome to the help desk ticketing bot!")
                .OnCompletion(submission)
                .Build();
        }

        private static async Task SubmitTicket(IDialogContext context, HelpDeskTicket state)
        {
            string token = string.Empty;
            context.UserData.TryGetValue("AccessToken", out token);

            GraphServiceClient graphClient = SDKHelper.GetAuthenticatedClient(token);

            GraphService service = new GraphService(context);
            var me = await service.GetCurrentUser();

            PlannerTask task = new PlannerTask();

            task.Title = $"{state.ServiceRequired} issue reported by {me.DisplayName}";
            task.PercentComplete = 0;
            task.PlanId = ConfigurationManager.AppSettings["HelpDeskPlanner"].ToString();

            // get the guid for the Planner bucket from the web.config that matches the severity chosen
            switch (state.IssueSeverity)
            {
                case Severity.High:
                    task.BucketId = ConfigurationManager.AppSettings["HighPriority"].ToString();
                    break;
                case Severity.Medium:
                    task.BucketId = ConfigurationManager.AppSettings["MediumPriority"].ToString();
                    break;
                case Severity.Low:
                    task.BucketId = ConfigurationManager.AppSettings["LowPriority"].ToString();
                    break;
                case Severity.Emergency:
                    task.BucketId = ConfigurationManager.AppSettings["HighPriority"].ToString();
                    break;
                default:
                    task.BucketId = ConfigurationManager.AppSettings["NoPriority"].ToString();
                    break;
            }

            PlannerTask planTask = await service.CreatePlan(task);

            await context.PostAsync($"Your ticket has been submitted.  A technician will contact you soon.");
        }
    }
}


That was your 50,000 ft view of the application. I recently presented it at a user group where I went over the various integration points in Office 365. I showed the Bot running in Microsoft Teams and Skype. Here’s what the bot looks like in action.

First, a look at Microsoft Teams.

ANA in Microsoft Teams

Next, a look at Skype

ANA in Skype
