For a while I have wanted to better understand how to get the most useful information possible from Citrix DaaS (formerly Citrix Virtual Apps and Desktops service). I have extensively used the Citrix DaaS Remote PowerShell SDK to automate tasks in Citrix DaaS, such as creating catalogs or delivery groups. You can also use this SDK to get information about current VDAs or sessions.

But what if you want to get information about things that have happened in the past?

The only way to query this type of information is to use the Citrix Monitor API. I have always wanted to learn to use the Monitor API, but there was a steep learning curve for me to be able to script OData queries with PowerShell (my go-to scripting technology), and I never really took the time to figure it all out.

Then, a customer wanted to be able to enter a list of user accounts in a CSV file and have a script look up the last time they logged on to a Citrix session. This isn’t easy to do with the built-in reports in Citrix Monitor, so I figured out how to script against the Monitor API, which was really not as hard as I thought it would be. In this blog post, I wanted to share what I learned in the process.

Citrix Monitor API

Let’s start with a quick review of the Monitor API. It can be used to query lots of useful information from Citrix DaaS, including:

  • Data relating to connection failures
  • Machines in a failure state
  • Session usage
  • Logon duration
  • Load balancing data
  • Hotfixes applied to a machine
  • Hosted application usage
  • Machine-level CPU, memory and disk usage
  • Process-level CPU and memory usage
  • Application usage and failure data

You can use this information for one-off queries or to integrate with your enterprise-reporting platform.

The API allows you to access the Monitor database tables. The documentation provides table schema diagrams for all relevant tables. For the example script, I used this schema diagram to find the tables and fields I wanted to include.

Then you must figure out which API call to start with. The API reference includes links to several good OData tutorials and descriptions of how to work with OData. First, determine the base URL you will use to access the data you need. In the documentation, the Appendix contains a list of "Available Data Sets." Some examples are:

  • http://{ApiGatewayEndpoint}/Catalogs
  • http://{ApiGatewayEndpoint}/Machines
  • http://{ApiGatewayEndpoint}/Users
  • http://{ApiGatewayEndpoint}/SessionMetrics
  • http://{ApiGatewayEndpoint}/ProcessUtilization

You can use these as the base URL for your OData calls, which you can expand using OData to link other tables in your queries.

One thing to note about these URLs: The API endpoints are different based on your Citrix Cloud tenant region. Currently these are:

  • US region: https://api-us.cloud.com/monitorodata
  • EU region: https://api-eu.cloud.com/monitorodata
  • AP-S region: https://api-ap-s.cloud.com/monitorodata
  • Japan region: https://api.citrixcloud.jp/monitorodata

Also keep in mind that OData returns a maximum of 100 records per page. Need more? You'll have to manage pagination in your script. This was not necessary for my example script, but I will cover pagination later.

Our Example – Last Logon Time

PowerShell Script: LastLogon.ps1

Now that we’ve covered some basics of using the Monitor API and OData, let’s look at how all this works with an example of finding the last logon time for a user session.

First, I needed to determine which base URL to use. When I looked at the schema, I saw that the data I need for this is in both the Session and Connection tables. In Session, I see StartDate and EndDate, and in Connection, I see LogonStartDate and LogonEndDate. But I also need to capture the user, and there's no link from the Connection table to the User table. There is a link from the Session table to the User table, so what I really need to start with is Sessions, then join in the User table and the Connection table to get endpoint information.

If you need more information on the actual connection, you might want to start with the Connection table, then bring in both the Session and User tables. To make this a bit easier to understand, here is the information from each of those tables, pulled from the schema diagrams listed earlier.

Session (MonitorData): SessionKey, StartDate, LogOnDuration, EndDate, ExitCode, FailureDate, ConnectionState, ConnectionStateChange…, LifecycleState, CurrentConnectionId, UserId, MachineId, SessionType, IsAnonymous, CreatedDate, ModifiedDate, FailureId, SessionIdleTime

Connection (MonitorData): Id, ClientName, ClientAddress, ClientVersion, ClientPlatform, ConnectedViaHostName, ConnectedViaIPAddress, LaunchedViaHostName, LaunchedViaIPAddress, IsReconnect, IsSecureIca, Protocol, LogOnStartDate, LogOnEndDate, BrokeringDuration, BrokeringDate, DisconnectCode, DisconnectDate, VMStartStartDate, VMStartEndDate, ClientSessionValidateDate, ServerSessionValidateDate, EstablishmentDate, HdxStartDate, HdxEndDate, AuthenticationDuration, GpoStartDate, GpoEndDate, LogOnScriptsStartDate, LogOnScriptsEndDate, ProfileLoadStartDate, ProfileLoadEndDate, InteractiveStartDate, InteractiveEndDate, SessionKey, CreatedDate, ModifiedDate

User (MonitorData): Id, Sid, Upn, UserName, FullName, Domain, CreatedDate, ModifiedDate

To build our query, we need to dig a little into the OData format. To start, we have the base URL for the API call. We are going to use the US-region call for the Sessions table.

$root = "https://api-us.cloud.com/monitorodata/Sessions()"

Then we can add a filter expression to narrow down the query, as well as Select and Expand parameters to join the Sessions table to other tables in the data. In this case, we will add a join to both the User table and the Connection table.

$filter="(CreatedDate gt $StartDate) and (User/Username eq `'$UserName`')"

$SelectExpand=`$expand=User(`$select=username,upn),Connections(`$select=LogonStartDate,ClientName,ConnectedViaIPAddress;`$OrderBy=LogonStartDate Desc;`$top=1)”

$OrderBy=”`$OrderBy=StartDate Desc”

$Top=1

Here we are filtering by $StartDate and UserName. In the script, we ask how many days back to search, then calculate a StartDate based on that. This helps us answer the question: when was the last logon within the last 10 days for user "jsmith"? We then "expand" the query to include User data by selecting the username and UPN from the User table, plus LogonStartDate, ClientName, and ConnectedViaIPAddress from the Connection table. For the sessions, we order the records by StartDate descending, then take only the top one, which will be the last logon. For Connection, we do the same thing and select only the latest connection.

You can easily change the script to capture all connections, not just the last one. When using multiple query options within the Select and Expand calls, use a semicolon to separate them. You can see that above in the Connections expand, where a semicolon separates $select, $orderby, and $top.

The $expand is like a join in SQL. There is a predefined relationship between the Session table and the User table, so we don't need to define which field links the two. We are just saying that we want to add the User table, and the fields we want to include are username and UPN. You can add any of the fields you need using the "$select=" statement, or include all of the fields in the table by omitting "$select=" entirely.

Possible navigation query options include:

$filter, $select, $orderby, $skip, $top, $search, and $expand.

OData Logical operators include:

  • eq, ne
  • gt, lt, ge, le
  • and, or
  • not, has

Learn more about logical operators and OData queries.

The actual query will look like the following:
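
As a rough sketch (LastLogon.ps1 may assemble the string differently), the pieces defined above can be combined like this:

# Combine the base URL with the query options; "?" introduces the options and "&" separates them
$uri = $root + '?$filter=' + $filter + '&' + $SelectExpand + '&' + $OrderBy + '&$top=' + $Top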

Note that the question mark separates the base URL from the optional expressions. Then the ampersands separate the different options within the call — here, $filter, $expand, $orderby, and $top.

In the script, I use Invoke-WebRequest to make the API calls:

$Header = @{"Authorization"="$token";"Citrix-CustomerId"="$CustomerId"}

$response = Invoke-WebRequest -Uri $uri -Header $Header

When you make the call, the header must have the required information for the API. For the Monitor API calls, we need to include the “bearer token” and the Citrix Cloud CustomerId. I will show a little later how to get these.

This query returns a single JSON record with the Session and User data for the last logon.

PowerShell has a handy way to turn JSON into an object:

# extract what we need to return from the response

$JSONResponse = ConvertFrom-Json $response.Content

The resulting object looks like this:

This makes it easy to access the data. We set:

$r=$JSONResponse.Value

Then we can get the StartDate and EndDate by just using $r.StartDate and $r.EndDate. In the script, we create an object to store the results we want to return from our API call. Note how the User information is accessed by adding "User" to $r.
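
As an illustration only (not the exact object LastLogon.ps1 builds), the output record might be assembled something like this, with the expanded User and Connections data accessed as nested properties:

# Hypothetical sketch: build an output record from $r; User and Connections are
# nested objects because they were added with $expand. Convert-UTCLocal is the
# date conversion helper covered next.
$LastLogon = [pscustomobject]@{
    UserName   = $r.User.UserName
    Upn        = $r.User.Upn
    StartDate  = Convert-UTCLocal $r.StartDate
    EndDate    = Convert-UTCLocal $r.EndDate
    ClientName = $r.Connections.ClientName
    ClientIP   = $r.Connections.ConnectedViaIPAddress
}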

Also notice that we must convert the date/times provided by the API, which are stored in UTC. To handle this, I found a function called Convert-UTCLocal that performs the conversion:

It converts to the time zone defined on the system running the script (see the Get-WmiObject win32_timezone call).
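
Here is a minimal reconstruction of that kind of helper (the function in the script may differ slightly):

function Convert-UTCLocal {
    param ([parameter(Mandatory = $true)] $UTCTime)
    # Look up the time zone of the machine running the script
    $strCurrentTimeZone = (Get-WmiObject win32_timezone).StandardName
    $TZ = [System.TimeZoneInfo]::FindSystemTimeZoneById($strCurrentTimeZone)
    # Convert the UTC value returned by the Monitor API to local time
    return [System.TimeZoneInfo]::ConvertTimeFromUtc($UTCTime, $TZ)
}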

That covers the meat of the API call configuration for the script. Logically, the script uses a source CSV file to feed it usernames to look up, then makes an API call for each user. I won’t go into it here, but it’s pretty easy to follow in the script. The script also uses a CSV to output the last logon times.

For output, PowerShell has a cmdlet that takes an input object and creates a CSV file from it. We can use this to export our pscustomobject to a CSV after each record is returned.

Export-Csv -InputObject $sessions -Path "$OutputCsvFile" -Append -Force -NoTypeInformation

This creates a CSV file like this:

If you add any new fields to the pscustomobject, they will automatically be included in the CSV file.

Authenticating

There are several ways you can script for the authentication required to use the API.

One way is to obtain a bearer token using a Citrix Cloud CustomerId, ClientId, and ClientSecret. I have included a function in the script to get the bearer token using these parameters.

To obtain the CustomerID, ClientID, and Client Secret, follow these directions. The disadvantage of this approach is that these very sensitive keys will be stored in your script, or you will need to find a way to encrypt them.
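
For reference, here is a minimal sketch of such a token request, assuming the US-region Citrix Cloud OAuth endpoint (the GetBearerToken function in the script may be implemented differently):

function Get-BearerToken {
    param ([string]$CustomerId, [string]$ClientId, [string]$ClientSecret)
    # Request an OAuth token from Citrix Cloud using the API client credentials
    $tokenUrl = "https://api-us.cloud.com/cctrustoauth2/$CustomerId/tokens/clients"
    $body = @{
        grant_type    = "client_credentials"
        client_id     = $ClientId
        client_secret = $ClientSecret
    }
    $response = Invoke-RestMethod -Uri $tokenUrl -Method POST -Body $body
    return $response.access_token
}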

The DaaS Remote PowerShell SDK has a way to obtain the token interactively. I have included this in the script to provide an interactive logon whenever the script is run.

The script will use the “Citrix.Sdk.Proxy.V1” snapin and Get-XDAuthentication to prompt the user for credentials. Then the bearer token is pulled from a global variable set by this command, as seen below:

Notice we can also get the CustomerId from a GLOBAL variable after using the SDK to authenticate.
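
Roughly, the interactive flow looks like this. I am using $GLOBAL:XDAuthToken for the token here; verify the global variable names against your SDK version and the script:

Add-PSSnapin Citrix.Sdk.Proxy.V1
Get-XDAuthentication                 # prompts for Citrix Cloud credentials
$token = $GLOBAL:XDAuthToken         # bearer token set by Get-XDAuthentication
# The CustomerId is available from another global variable the SDK sets;
# see the script for the exact variable name.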

The format of the bearer token used for the API calls needs to be:

$token = "CwsAuth Bearer=" + $bearerToken

If you use the SDK, the “CwsAuth Bearer=” is added automatically. If you use the GetBearerToken function, it must be added in the script as shown above ($token=).

It is also possible to use the Citrix DaaS SDK to store encrypted credentials that can only be retrieved using the same Active Directory account on the same machine. This allows for storing the credentials, then recalling them without including the sensitive keys in the script (as covered in this blog post).

To use this method, install the DaaS Remote PowerShell SDK, then save a profile using this command:

Set-XDCredentials -CustomerId $CustomerId -APIKey $clientId -SecretKey $clientSecret -StoreAs "CitrixCloud" -ProfileType CloudApi

Then it’s easy to get the bearer token. Just run this in the authentication section of the script:

Get-XDAuthentication -ProfileName “CitrixCloud” -Verbose

This allows the script to obtain the token from a global variable as shown above in the interactive logon method.

Please note, the bearer token is only good for one hour. If you have scripts that run longer than an hour, you need to keep track of the time and refresh the token before the hour is up.
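
One simple way to handle that, sketched here, is to record when the token was issued and re-authenticate shortly before it expires (again assuming the $GLOBAL:XDAuthToken variable):

$tokenIssued = Get-Date
# ... inside the main processing loop ...
if (((Get-Date) - $tokenIssued) -gt (New-TimeSpan -Minutes 55)) {
    Get-XDAuthentication -ProfileName "CitrixCloud"   # refresh the bearer token
    $token = $GLOBAL:XDAuthToken
    $tokenIssued = Get-Date
}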

Data Granularity and Retention

Citrix Monitor stores and aggregates different types of data with different granularity. It is important to know the data granularity to understand the results you will receive when using the Monitor API to retrieve data.

For example, the SessionMetrics data, which includes session RTT, is only available for one day, while the sessions themselves are available, by default, for 90 days with Premium and 31 days with Advanced. Learn more about Monitor Data Retention.

Pagination

PowerShell Script: AllSessions.ps1

The Monitor API returns at most 100 records per call, so if your script needs to query more than 100 records, you'll need to handle pagination.

I created a separate example script to show how this works; it gets all the sessions from the last x number of days.

To handle this case, we need to perform a slightly different API call the first time. The results of that call will provide a URI to use for each subsequent call for more data. To handle this, we set up two variables: $KeepLooking, which controls whether or not we keep asking for more data in a loop, and $NextURI, which is used to specify the next call for more data. The first time through the loop, the $NextURI will be null so the initial URI will be used. If more than 100 records are returned, the response will include a value for ‘@odata.nextLink’ that is the URI to use to get the next 100 records. If fewer than 100 records are returned, this value will be NULL and we will exit the loop by setting $KeepLooking to $false. This logic is shown below:
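
A condensed version of that loop looks roughly like this (AllSessions.ps1 differs in the details):

$AllSessions = @()
$KeepLooking = $true
$NextURI = $null
while ($KeepLooking) {
    # First pass uses the initial URI; later passes use the @odata.nextLink value
    if ([string]::IsNullOrEmpty($NextURI)) { $CallURI = $uri } else { $CallURI = $NextURI }
    $response = Invoke-WebRequest -Uri $CallURI -Header $Header
    $JSONResponse = ConvertFrom-Json $response.Content
    $AllSessions += $JSONResponse.value
    # If more records remain, the response includes a link to the next page
    $NextURI = $JSONResponse.'@odata.nextLink'
    if ([string]::IsNullOrEmpty($NextURI)) { $KeepLooking = $false }
}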

The result? The script will keep querying for data if there are more records to pull.

Here is an example of the '@odata.nextLink' value. Note that "&$skip=100" is added to the end of the URI; pagination works by incrementing that value.

Permissions

The minimum permission required in Citrix Cloud to run OData queries against the Monitor API is Read Only Administrator. See our Data Access Privilege documentation for more details.

Example Scripts Setup

I am including several example scripts with this blog post to help jumpstart anyone who needs to interface with the Monitor API using PowerShell.

To use the scripts, download them and unblock them in the Windows file properties.

Each script has Script Setup Parameters. If you want to be prompted to authenticate each time you run the script, set $UseInteractiveAuth = $true (see 1 below). If you want to use either the Auth Profile with the DaaS Remote PowerShell SDK or an API key (see 2 below), set it to $false and configure the settings as described in the Authenticating section.

Then just try the script. You can save different versions and modify the OData calls to suit your needs.

Additional Examples

Earlier in this blog post, I showed examples of configuring OData calls to get session and user information. The following examples cover more of the items mentioned at the outset of the blog.

Session RTT

PowerShell Script: SessionRTT.ps1

Here we are adding the “SessionMetrics” table along with the “User” table. The RTT is only available for one day and is returned as an array within each Session record.

This turned out to be more complicated than I thought because I originally tried to include the SessionMetrics table with the call for sessions. The issue? There can be more than 100 metrics rows returned, and you need a way to iterate through just the metrics. The @odata.nextLink did not work in this case. I changed the script to get all the sessions, and then for each session get the SessionMetrics in a loop, as seen below.

Please note, we are searching within the SessionMetrics table where the SessionId equals our SessionKey. The initial query will return up to 100 records. If there are more than 100 records, we use the @odata.nextLink to get the next 100 (and so on).
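
A rough sketch of that per-session metrics query (the SessionId filter follows the description above; SessionRTT.ps1 may differ in the details):

foreach ($session in $AllSessions) {
    # OData v4 GUID literals are used unquoted in the filter
    $metricsUri = "https://api-us.cloud.com/monitorodata/SessionMetrics?`$filter=SessionId eq $($session.SessionKey)"
    $metricsResponse = Invoke-WebRequest -Uri $metricsUri -Header $Header
    $metrics = (ConvertFrom-Json $metricsResponse.Content).value
    # If more than 100 metric rows exist, follow '@odata.nextLink' here as well
}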

I have included a script called SessionRTT.ps1 in the download to show how this works.

The resulting CSV will look like this:

Please note, for sessions over a day in the past, there is no RTT information.

Be aware that if the query runs against a significant number of sessions, it will take a long time to complete. It's best to use this approach for a defined set of sessions rather than all sessions.

Machine CPU and Memory Usage During a User Session

PowerShell Script: SessionMachineUtilization.ps1

In this example, I am looking up the sessions for a given user, then getting the resource utilization for the machine the session was on while the user was connected. This is useful when a user complains of poor performance during a session and you want to see if the VDA was overutilized at the time. Of course, if you have your VDAs instrumented with a monitoring solution, that is a much better place to capture this type of data. Also, this is the type of data you want to retrieve for a targeted set of sessions, because querying all sessions is resource intensive.

The CPU and memory information for VDAs is stored at different time intervals. The raw Resource Utilization data is stored in five-minute intervals.

In this script, I look up the user’s sessions, then for each session query for resource utilization for the machine used in the session after the SessionStartDate and before the SessionEndDate.
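
Sketched below is what that per-session utilization query might look like. $UserSessions is a placeholder for the sessions already retrieved, and the ResourceUtilization data set with its MachineId and CollectedDate fields is an assumption taken from the schema diagrams, so verify the names before relying on this:

foreach ($s in $UserSessions) {
    # Assumes the session dates are in the ISO 8601 form returned by the API
    $utilUri = "https://api-us.cloud.com/monitorodata/ResourceUtilization?" +
               "`$filter=(MachineId eq $($s.MachineId)) and " +
               "(CollectedDate gt $($s.StartDate)) and (CollectedDate lt $($s.EndDate))"
    $utilRows = (ConvertFrom-Json (Invoke-WebRequest -Uri $utilUri -Header $Header).Content).value
}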

The records returned from the query look like the following.

These list the PercentCPU and UsedMemory at five-minute intervals.

In the end, the CSV file (SessionUtilizationStats.csv) shows each session along with the CPU and Memory every five minutes.

Session Connection Failures

PowerShell Script: ConnectionFailures.ps1

In this script, we use the ConnectionFailureLogs API call to obtain all the session connection failures for the last x days. We add in the Session, Machine, and User Tables, and within Machine we expand to get the Catalog Name and Delivery Group Name.

The OData settings are as follows:
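
Here is a rough reconstruction of those settings (the exact strings in ConnectionFailures.ps1 may differ; the FailureDate field and the navigation-property names are taken from the schema diagrams, so verify them):

$root = "https://api-us.cloud.com/monitorodata/ConnectionFailureLogs"
$filter = "`$filter=(FailureDate gt $StartDate)"
$expand = "`$expand=Session,User,Machine(`$expand=DesktopGroup(`$select=Name),Catalog(`$select=Name))"
$uri = $root + '?' + $filter + '&' + $expand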

Notice how, within the expand for Machine, we use a secondary expand for the DesktopGroup and Catalog. This is because the Machine table only has the DesktopGroupId and CatalogId, and we want to include the names.

Here is an example of the data returned when all machines were in maintenance mode. The Failure type, in this case, is 100, which maps to “No Machines Available”. We also get the Session, Machine and User data.

Then, within Machine, see the DesktopGroup and Catalog:

In Monitor, it looked like this:

The script captures the relevant field data, as shown below.

There are many more data points available. I included the SessionStartDate and SessionEndDate because, when a logon banner is used and a user waits three minutes to log on, an error is generated with a type of "Other," and the only way to recognize it is that the start and end dates are exactly three minutes apart.

The resulting CSV file (ConnectionFailures.csv) would look as follows. Please note, I have removed the username for security.

I searched for and found the failure codes matching the enum values in the API Reference.

I used this to create a lookup table to convert the numeric failure codes to their actual names (a usage sketch follows the table below). The CSV is called SessionFailureCodes.csv, and it should be placed in the same folder as the script.

Error Code Enum value Description
0 Unknown Unknown
1 None None (no failure)
2 SessionPreparation Failure during session preparation
3 RegistrationTimeout Failure due to registration timeout
4 ConnectionTimeout Failure due to connection timeout
5 Licensing Failure due to licensing
6 Ticketing Failure due to ticketing
7 Other Other failure reasons
8 GeneralFail General failure
9 MaintenanceMode Desktop group, Machine or Hypervisor is in maintenance mode
10 ApplicationDisabled Selected application is currently disabled
11 LicenseFeatureRefused Required feature is not licensed
12 NoDesktopAvailable No machine is available to satisfy launch
13 SessionLimitReached VDI machine is already in use (not used for RDS)
14 DisallowedProtocol Requested protocol is not allowed
15 ResourceUnavailable Resource is unavailable
16 ActiveSessionReconnectDisabled Active session stealing is required, but is disabled
17 NoSessionToReconnect Session to which reconnect is directed is not found (can only occur on a launch retry)
18 SpinUpFailed Failed to power-up machine for launch
19 Refused Session refused
20 ConfigurationSetFailure Configuration set failure
21 MaxTotalInstancesExceeded App launch refused because limit on total concurrent usage is reached
22 MaxPerUserInstancesExceeded App launch refused because limit on per-user usage is reached
23 CommunicationError Launch failed because the VDA could not be contacted
24 MaxPerMachineInstancesExceeded App launch refused because limit on per machine usage is reached
25 MaxPerEntitlementInstancesExceeded Desktop launch refused because limit on per entitlement usage is reached
100 NoMachineAvailable No machine available
101 MachineNotFunctional Machine not functional
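
As a sketch of how that lookup can be used (the column names here are assumed to match the table above; adjust them to the actual CSV headers):

# Load the failure-code table from the same folder as the script
$FailureCodes = Import-Csv -Path "$PSScriptRoot\SessionFailureCodes.csv"
# $failureCode stands in for the numeric failure value returned by the API call
$codeName = ($FailureCodes | Where-Object { $_.'Error Code' -eq $failureCode }).'Enum value'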

Conclusion

I found developing these example scripts and writing this blog post both challenging and a lot of fun. I hope they will help all our customers develop their own custom reports and event monitors when moving to the Citrix Cloud control plane. If you use Monitor and OData for interesting use cases, please share in the comments and let others know what you're doing. I think we will all be better off by spreading the word.

The Scripts

You can get the scripts here. They have been developed in my lab environment and not heavily tested. Please test them yourself before trying to use them. Citrix is not responsible for these scripts. Read the following carefully before using the scripts.

This software application is provided to you as is with no representations, warranties or conditions of any kind. You may use and distribute it at your own risk. CITRIX DISCLAIMS ALL WARRANTIES WHATSOEVER, EXPRESS, IMPLIED, WRITTEN, ORAL OR STATUTORY, INCLUDING WITHOUT LIMITATION WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NONINFRINGEMENT. Without limiting the generality of the foregoing, you acknowledge and agree that (a) the software application may exhibit errors, design flaws or other problems, possibly resulting in loss of data or damage to property; (b) it may not be possible to make the software application fully functional; and (c) Citrix may, without notice or liability to you, cease to make available the current version and/or any future versions of the software application. In no event should the code be used to support of ultra-hazardous activities, including but not limited to life support or blasting activities. NEITHER CITRIX NOR ITS AFFILIATES OR AGENTS WILL BE LIABLE, UNDER BREACH OF CONTRACT OR ANY OTHER THEORY OF LIABILITY, FOR ANY DAMAGES WHATSOEVER ARISING FROM USE OF THE SOFTWARE APPLICATION, INCLUDING WITHOUT LIMITATION DIRECT, SPECIAL, INCIDENTAL, PUNITIVE, CONSEQUENTIAL OR OTHER DAMAGES, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. You agree to indemnify and defend Citrix against any and all claims arising from your use, modification or distribution of the code.