Second Life of a Hungarian SharePoint Geek

July 23, 2015

Managing Project Server Views via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 07:17

If you would like to manage Project Server views from code, you will find very few helpful resources (if any) on the web. The object models simply do not include classes related to this (neither on the server side nor on the client side). Although the PSI contains a View service, it is intended for internal use. Of course, that intention cannot stop us from using the service at our own risk. Below I give you some useful code samples to illustrate the usage of the View service.

First of all, we create the proxy assembly, load the required Microsoft.Office.Project.Server.Library assembly in the process as well, and define some shortcuts to make it easier to reference enum and property values later on.

$pwaUrl = "http://YourProjectServer/pwa"
$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/PSI/View.asmx?wsdl") -UseDefaultCredential
$ViewConstants = [Microsoft.Office.Project.Server.Library.ViewConstants]
$ViewType = [Microsoft.Office.Project.Server.Library.ViewConstants+ViewType]

If you know the unique ID of your view, it is easy to display all of the fields and security categories associated with the view:

$viewId = [Guid]"63d3499e-df27-401c-af58-ebb9607beae8"
$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields | % { $_.CONV_STRING }
$view.SecurityCategoryObjects | % { $_.WSEC_CAT_NAME }

If the view ID is unknown, you can get it based on the name and type of the view:

$viewName = "Your Report"
$viewType = $ViewType::PORTFOLIO

$views = $svcPSProxy.ReadViewSummaries()
$viewId = ($views.ViewReports | ? { $_.WVIEW_NAME -eq $viewName -and $_.WVIEW_TYPE -eq $viewType }).WVIEW_UID

You can list all of the views:

$views = $svcPSProxy.ReadViewSummaries()
$views.ViewReports | % {
  Write-Host $_.WVIEW_NAME ([Enum]::ToObject($ViewType, $_.WVIEW_TYPE))
}

To change the order of the first two fields in the view:

$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields[0].WVIEW_FIELD_ORDER = 1
$view.ViewReportFields[1].WVIEW_FIELD_ORDER = 0
# persist the change (assumed call; the View service is internal, use at your own risk)
$svcPSProxy.UpdateView($view)

To change the order of two arbitrary fields (based on their name) in the view:

$fieldName1 = "Finish"
$fieldName2 = "Owner"
$view = $svcPSProxy.ReadView($viewId)
$field1 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName1 }
$field2 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName2 }
$field1Order = $field1.WVIEW_FIELD_ORDER
$field2Order = $field2.WVIEW_FIELD_ORDER
$field1.WVIEW_FIELD_ORDER = $field2Order
$field2.WVIEW_FIELD_ORDER = $field1Order
# persist the change (assumed call; the View service is internal, use at your own risk)
$svcPSProxy.UpdateView($view)

To remove a field from a view:

$fieldToRemoveName = "Ende"
$view = $svcPSProxy.ReadView($viewId)
$fieldToRemove = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldToRemoveName }
# delete the row and persist the change (assumed calls; the View service is internal)
$fieldToRemove.Delete()
$svcPSProxy.UpdateView($view)

To delete the view itself (assuming the DeleteViewReports method of the internal View service; use at your own risk):

$svcPSProxy.DeleteViewReports($viewId)

To create a new view using an existing view as a template:

$newViewName = "New View"
[Void]$svcPSProxy.CopyViewReports($viewId, $newViewName)
$newView = $svcPSProxy.ReadViewSummaries().ViewReports | ? { $_.WVIEW_NAME -eq $newViewName -and $_.WVIEW_TYPE -eq $viewType }

To list all of the fields available in a given type (in this case, for tasks):

$svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | % { $_.CONV_STRING }

To append a new field at the end of the fields in the view:

$fieldToAppendName = "% Work Complete"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }
$view = $svcPSProxy.ReadView($viewId)
$maxFieldOrder = ($view.ViewReportFields | % { $_.WVIEW_FIELD_ORDER } | measure -Maximum).Maximum

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WTABLE_UID = $fieldToAppend.WTABLE_UID
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $maxFieldOrder + 1
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

# add the new row to the table and persist the change
# (assumed calls; the View service is internal, use at your own risk)
$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)


To inject a new field in the view before another field having a specified name:

$fieldInjectBeforeName = "% Complete"
$fieldToInjectName = "% Work Complete"

$fieldToInject = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToInjectName }

$view = $svcPSProxy.ReadView($viewId)

$fieldInjectBeforeOrder = ($view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldInjectBeforeName }).WVIEW_FIELD_ORDER

$view.ViewReportFields | ? { $_.WVIEW_FIELD_ORDER -ge $fieldInjectBeforeOrder } | % { $_.WVIEW_FIELD_ORDER++ }

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToInject.WFIELD_UID
$newField.CONV_STRING = $fieldToInject.CONV_STRING
$newField.WTABLE_UID = $fieldToInject.WTABLE_UID
$newField.WFIELD_NAME_SQL = $fieldToInject.WFIELD_NAME_SQL
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $fieldInjectBeforeOrder
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

# add the new row to the table and persist the change
# (assumed calls; the View service is internal, use at your own risk)
$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)


The last code sample shows how to create a new Gantt view from scratch, appending a single field and a single security category to it:

$viewRepDS = New-Object PSIProxy.PWAViewReportsDataSet
$newView = $viewRepDS.ViewReports.NewViewReportsRow()
$newView.WVIEW_UID = [Guid]::NewGuid()
$newView.WVIEW_NAME = "New Report 2"
$newView.WVIEW_DESCRIPTION = "Test report description"

$fieldToAppendName = "% Arbeit abgeschlossen"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }

$newField = $viewRepDS.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WVIEW_UID = $newView.WVIEW_UID
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

$newSecCat = $viewRepDS.SecurityCategoryObjects.NewSecurityCategoryObjectsRow()
$newSecCat.WSEC_CAT_UID = [Microsoft.Office.Project.Server.Library.PSSecurityCategory]::MyProjects
$newSecCat.WSEC_OBJ_TYPE_UID = [Microsoft.Office.Project.Server.Library.PSSecurityObjectType]::View
$newSecCat.WSEC_OBJ_UID = $newView.WVIEW_UID

$newView.WVIEW_TYPE = $ViewType::PORTFOLIO
$newView.WGANTT_SCHEME_UID =  $ViewConstants::GanttSchemeUidProjectCenter
#  Group by (see [pub].[MSP_WEB_GROUP_SCHEMES] table in Project DB for possible values)
$newView.WGROUP_SCHEME_UID = [Guid]::Empty

# add the new rows to their tables and submit the view
# (assumed calls; the exact CreateViews signature of the internal View service may differ)
$viewRepDS.ViewReports.AddViewReportsRow($newView)
$viewRepDS.ViewReportFields.AddViewReportFieldsRow($newField)
$viewRepDS.SecurityCategoryObjects.AddSecurityCategoryObjectsRow($newSecCat)
$svcPSProxy.CreateViews($viewRepDS)


July 22, 2015

Create Project Server Enterprise Custom Fields via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 22:38

Last year I already wrote about how one can manage the Project Server Enterprise Custom Fields via the Managed Client Object Model. We could transfer the code samples of that post from C# to PowerShell, but because of the limitations of the Managed Client Object Model I use the PSI interface instead in this case. What are those limitations? Not all of the properties available in PSI are exposed by the Client OM; see for example the MD_PROP_SUMM_GRAPHICAL_INDICATOR field, which we can use to set the rules of graphical indicators defined for the fields. I’ll show you an example of getting and setting the indicator rules in a later post; in the current one I only show you the technique we can use to create the Enterprise Custom Fields via PSI.

One can find an existing description with code samples in Steps 3 and 4 of this post that achieves the same goal. However, I don’t like that approach for several reasons, for example, because we have to generate the proxy assembly based on the WSDL in the code itself. Instead of that, I find the following code much simpler:


$pwaUrl = "http://YourProjectServer/pwa"

# create shortcuts
$PSDataType = [Microsoft.Office.Project.Server.Library.PSDataType]
$Entities = [Microsoft.Office.Project.Server.Library.EntityCollection]::Entities

$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/psi/CustomFields.asmx?wsdl") -UseDefaultCredential

$customFieldDataSet = New-Object PSIProxy.CustomFieldDataSet 

$customFieldRow = $customFieldDataSet.CustomFields.NewCustomFieldsRow()   
$customFieldRow.MD_PROP_UID = [Guid]::NewGuid()
$customFieldRow.MD_PROP_NAME = "Custom Project Field"
$customFieldRow.MD_PROP_TYPE_ENUM = $PSDataType::STRING
$customFieldRow.MD_ENT_TYPE_UID = $Entities.ProjectEntity.UniqueId
$customFieldRow.MD_PROP_IS_REQUIRED = $false
$customFieldRow.MD_PROP_IS_LEAF_NODE_ONLY = $false
$customFieldRow.MD_PROP_DESCRIPTION = "Test Field Desc."

$svcPSProxy.CreateCustomFields($customFieldDataSet, $false, $true)

If you have casting issues when using the Namespace parameter of the New-WebServiceProxy cmdlet, you should read this post.

Creating a PowerShell-based Monitoring and Alerting System for Project Server

Filed under: PowerShell, Project Server — Peter Holpar @ 22:08

A few months ago I published a post about how to find the jobs in the Project Server queue programmatically. In the current post I will show you how you can use PowerShell to track the number of jobs in the queue, and send an e-mail alert if the count stays higher than a predefined limit for a longer period. Although the example in this post is Project Server specific, you can use the same technique to create other types of alerts as well.

Since the PowerShell script will be run by Windows Task Scheduler (for example, on a 5-minute schedule), an important question was how to solve the communication between the runs. For example, how can the current session find out since when the counter has been higher than the limit? Of course, if the limit is reached and we have already sent a mail, we would not like to send further mails on each and every run while the counter remains above the limit. But how do we inform the forthcoming sessions from the current session that we have already sent a mail? Of course, there are many possible solutions for this problem. We could use a database, or a file (either XML or any custom format) to persist the information between the sessions. I’ve chosen an even simpler approach. I create empty files (QueueLimitReached.txt and MailSent.txt), and check their existence and/or creation date to determine when the limit was reached and whether the alert mail has already been sent. If the counter goes below the limit again, I simply delete these semaphore files.

With this background, the script itself should already be straightforward.

Add-PSSnapin "Microsoft.SharePoint.PowerShell"

$folderPath = "D:\ScheduledTasks\"
$limitReachedFileName = "QueueLimitReached.txt"
$mailSentFileName = "MailSent.txt"
$ageOfFileLimit = 15 # in minutes
$counterValueLimit = 50

$emailTo = ""
$emailCc = ";"
$emailSubject = "Project Server Queue Alert"
$emailBody = @"
Hi,

the count of the jobs in the Project Server Queue is very high. Please, fix the issue!

Regards,
The PowerShell Monitor
"@

$limitReachedFilePath = $folderPath + $limitReachedFileName
$mailSentFilePath = $folderPath + $mailSentFileName

function HasAlertState()
{
  $counter = Get-Counter -Counter "\ProjectServer:QueueGeneral(_Total)\Current Unprocessed Jobs"
  $counterValue = $counter.CounterSamples[0].CookedValue
  return ($counterValue -gt $counterValueLimit)
}

function SendAlert()
{
  $globalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin

  $smtpMail = New-Object Net.Mail.MailMessage
  $smtpMail.From = $globalAdmin.MailFromAddress
  $smtpMail.Subject = $emailSubject
  $smtpMail.Body = $emailBody
  $emailTo.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.To.Add($mailAddr) }
  $emailCc.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.Cc.Add($mailAddr) }
  $smtpMail.ReplyTo = New-Object Net.Mail.MailAddress($globalAdmin.MailReplyToAddress)
  $smtpMail.BodyEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)
  $smtpMail.SubjectEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)

  $smtpClient = New-Object Net.Mail.SmtpClient($globalAdmin.OutboundSmtpServer)
  $smtpClient.Send($smtpMail)
}

$alertCondition = HasAlertState

If ($alertCondition)
{
  If (Test-Path $limitReachedFilePath)
  {
    $creationTime = (Get-ChildItem $limitReachedFilePath).CreationTime
    # TotalMinutes (not Minutes) returns the full elapsed time in minutes
    $ageOfFile = ([DateTime]::Now - $creationTime).TotalMinutes
    Write-Host $ageOfFile
    If ($ageOfFile -gt $ageOfFileLimit)
    {
      Write-Host "Limit reached"
      If (-not (Test-Path $mailSentFilePath))
      {
        Write-Host "Mail has not yet been sent. Send it now."
        SendAlert
        # suppress return value via casting it to null
        [void] (New-Item -name $mailSentFileName -path $folderPath -itemType File)
      }
    }
  }
  # create a new file, if no former one exists
  else
  {
    If (-not (Test-Path $limitReachedFilePath))
    {
      # suppress return value via casting it to null
      [void] (New-Item -name $limitReachedFileName -path $folderPath -itemType File)
    }
  }
}
# delete the former files, if they exist
Else
{
  If (Test-Path $limitReachedFilePath)
  {
    Remove-Item $limitReachedFilePath
  }
  If (Test-Path $mailSentFilePath)
  {
    Remove-Item $mailSentFilePath
  }
}

In the sample we check the value of the Current Unprocessed Jobs counter of Project Server. You can easily change the job count limit (50) and the time period (15 minutes) in the code, or customize the addressees, subject and body of the mail. If you would like to create other types of alerts, you simply have to implement your own version of the HasAlertState method.
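As a hedged sketch of such a custom HasAlertState implementation (the disk-space check and the 10 GB threshold are hypothetical examples, not from the original script), you could alert on low free disk space instead of the queue length:

```powershell
# Hypothetical alternative HasAlertState: alert when the free space
# on the first file-system drive drops below an assumed 10 GB limit.
$freeSpaceLimitGB = 10  # assumed threshold, adjust to your environment

function HasAlertState()
{
    # take the first file-system drive as an example target
    $drive = Get-PSDrive -PSProvider FileSystem | Select-Object -First 1
    $freeGB = $drive.Free / 1GB
    return ($freeGB -lt $freeSpaceLimitGB)
}
```

The rest of the script (the semaphore files and SendAlert) can remain unchanged.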

July 19, 2015

How to restrict the available users in a ‘Person or Group’ field to Project Server resources?

Filed under: Managed Client OM, PowerShell, PS 2013, SP 2013 — Peter Holpar @ 21:33

Assume you have a task list on your Project Web Access (PWA) site or on one of the Project Web Sites (PWS) in your Project Server, and you would like to restrict the users available in the Assigned To field (field type ‘Person or Group‘) to users who are specified as Enterprise Resources in Project Server. The server is running in the “classical” Project Server permission mode, not in the new SharePoint Server permission mode, and there is no synchronization configured between Active Directory groups and Project Server resources.

You can limit a ‘Person or Group‘ field to a specific SharePoint group, but there is no built-in solution to sync enterprise resources to SharePoint groups. In this post I show you how to achieve that via PowerShell and the managed client object models of Project Server and SharePoint.

Note: You could get the login names of users assigned to the enterprise resources via REST as well (http://YourProjectServer/PWA/_api/ProjectServer/EnterpriseResources?$expand=User&$select=User/LoginName), but in my sample I still use the client object model of  Project Server.

My goal was to create a PowerShell solution, because it makes it easy to change the code on the server without any kind of compiler. I first created a C# solution, because the language elements of C# (like extension methods, generics and LINQ) help us to write compact, effective and readable code. For example, since the language elements of PowerShell do not support LINQ expressions, you cannot simply restrict the elements and their properties returned by a client object model request, as I illustrated in my former posts here, here and here. Having the working C# source code, I included it in my PowerShell script as a literal string and built the .NET application at runtime, just as I illustrated in this post. In the C# code I utilized an extension method to help with automated batching of the client object model requests. More about this solution can be read here.

The logic of the synchronization is simple: we read the list of all non-generic enterprise resources and store their login names (if the user exists) as strings in a generic list. Then we read the members of the SharePoint group we are synchronizing and store their login names as strings in another generic list. Finally, we add the missing users to the SharePoint group and remove the extra users from the group.
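The two-way difference at the heart of this logic can be sketched in plain PowerShell (the login values below are hypothetical placeholders), before we look at the batched C# implementation:

```powershell
# hypothetical login name lists standing in for the resource and group members
$resLogins = @("domain\user1", "domain\user2", "domain\user3")
$grpLogins = @("domain\user2", "domain\user4")

# users present as enterprise resources but missing from the group
$usersToAdd = $resLogins | ? { $grpLogins -notcontains $_ }

# users present in the group but no longer enterprise resources
$usersToRemove = $grpLogins | ? { $resLogins -notcontains $_ }
```

With the sample values above, $usersToAdd contains user1 and user3, while $usersToRemove contains only user4.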

The final code is included here:

$pwaUrl = "http://YourProjectServer/PWA";
$grpName = "AllResourcesGroup";

$referencedAssemblies = (
    "Microsoft.SharePoint.Client, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.SharePoint.Client.Runtime, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.ProjectServer.Client, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "System.Core, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089")

$sourceCode = @"
using System;
using System.Linq;
using System.Collections.Generic;
using Microsoft.SharePoint.Client;
using Microsoft.ProjectServer.Client;

public static class Extensions
{
    // batching to avoid
    // Microsoft.SharePoint.Client.ServerException: The request uses too many resources
    public static void ExecuteQueryBatch<T>(this ClientContext clientContext, IEnumerable<T> itemsToProcess, Action<T> action, int batchSize)
    {
        var counter = 1;

        foreach (var itemToProcess in itemsToProcess)
        {
            action(itemToProcess);
            counter++;

            if (counter > batchSize)
            {
                clientContext.ExecuteQuery();
                counter = 1;
            }
        }

        if (counter > 1)
        {
            clientContext.ExecuteQuery();
        }
    }
}

public static class Helper
{
    public static void SyncGroupMembers(string pwaUrl, string grpName)
    {
        List<string> resLogins = new List<string>();
        List<string> grpLogins = new List<string>();
        var batchSize = 20;

        using (var projectContext = new ProjectContext(pwaUrl))
        {
            var resources = projectContext.EnterpriseResources;
            projectContext.Load(resources, rs => rs.Where(r => !r.IsGeneric).Include(r => r.User.LoginName));

            projectContext.ExecuteQuery();

            resLogins.AddRange(resources.ToList().Where(r => r.User.ServerObjectIsNull == false).Select(r => r.User.LoginName.ToLower()));
        }

        using (var clientContext = new ClientContext(pwaUrl))
        {
            var web = clientContext.Web;

            var grp = web.SiteGroups.GetByName(grpName);
            clientContext.Load(grp, g => g.Users.Include(u => u.LoginName));

            clientContext.ExecuteQuery();

            grpLogins.AddRange(grp.Users.ToList().Select(u => u.LoginName.ToLower()));

            var usersToAdd = resLogins.Where(l => !grpLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToAdd,
                new Action<string>(loginName =>
                {
                    var user = web.EnsureUser(loginName);
                    grp.Users.AddUser(user);
                }),
                batchSize);

            var usersToRemove = grpLogins.Where(l => !resLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToRemove,
                new Action<string>(loginName =>
                {
                    grp.Users.RemoveByLoginName(loginName);
                }),
                batchSize);
        }
    }
}
"@

Add-Type -ReferencedAssemblies $referencedAssemblies -TypeDefinition $sourceCode -Language CSharp;

[Helper]::SyncGroupMembers($pwaUrl, $grpName)

How to avoid ‘The request uses too many resources’ when using the client object model via automated batching of commands

Filed under: Managed Client OM, SP 2013 — Tags: , — Peter Holpar @ 21:32

One of the reasons I prefer the client object model of SharePoint to the REST interface is its capability of batching requests.

For example, you can add multiple users to a SharePoint group using the code below, and it is sent as a single request to the server:

using (var clientContext = new ClientContext(url))
{
    var web = clientContext.Web;
    var grp = web.SiteGroups.GetByName("YourGroup");

    var usersToAdd = new List<string>() { @"i:0#.w|domain\user1", @"i:0#.w|domain\user2" };

    foreach (var loginName in usersToAdd)
    {
        var user = web.EnsureUser(loginName);
        grp.Users.AddUser(user);
    }

    clientContext.ExecuteQuery();
}

However, as the number of users you would like to add increases, you might run into issues, as the operational requests in your batch exceed the 2 MB limit.

How could we solve the problem relatively painlessly, avoiding the error while still keeping our code readable?

The good news is that this is easy to achieve using an extension method, generics, and the Action class. We can extend the ClientContext with an ExecuteQueryBatch method, passing in the list of parameter values to be processed as an IEnumerable, the action to be performed, and the number of items that should be processed in a single batch. The method splits the parameter values into batches, calling the ExecuteQuery method on the ClientContext for each batch.

If the action you would like to perform on the client objects has a single parameter (as in our case above, where the login name is a single parameter of type String), the ExecuteQueryBatch method can be defined as:

public static class Extensions
{
    public static void ExecuteQueryBatch<T>(this ClientContext clientContext, IEnumerable<T> itemsToProcess, Action<T> action, int batchSize)
    {
        var counter = 1;

        foreach (var itemToProcess in itemsToProcess)
        {
            action(itemToProcess);

            counter++;
            if (counter > batchSize)
            {
                clientContext.ExecuteQuery();
                counter = 1;
            }
        }

        if (counter > 1)
        {
            clientContext.ExecuteQuery();
        }
    }
}

Having the ExecuteQueryBatch method in this form, the original code can be modified as follows:

var batchSize = 20;

using (var clientContext = new ClientContext(url))
{
    var web = clientContext.Web;
    var grp = web.SiteGroups.GetByName("YourGroup");

    var usersToAdd = new List<string>() { @"i:0#.w|domain\user1", @"i:0#.w|domain\user2" /* and a lot of other logins */ };

    clientContext.ExecuteQueryBatch<string>(usersToAdd,
        new Action<string>(loginName =>
        {
            var user = web.EnsureUser(loginName);
            grp.Users.AddUser(user);
        }),
        batchSize);

    clientContext.ExecuteQuery();
}

The batch size you can use depends on the complexity of the action: the more complex the action, the smaller the batch should be. You should find the ideal value experimentally.

Actions with multiple parameters require additional overloads of the ExecuteQueryBatch extension method.

In my next post I’ll illustrate how to utilize this extension method in a practical example.

July 16, 2015

How to Read Project Properties that are not Available in the Client Object Model?

Recently I had a development task that at first sight seemed to be trivial, but quickly turned out to be rather a challenge. I had to display some basic project information on a page in our Project Web Site (PWS), like project start and finish date, remaining work and percent complete. The web page was built using client-side technologies, like the client-side object model (CSOM) for Project 2013 and the AngularJS library, and we did not plan to change the architecture to server side code.

If you check the properties of the PublishedProject class (either on the client side in the namespace / assembly Microsoft.ProjectServer.Client or on the server side in Microsoft.ProjectServer), you see that it has properties like StartDate and FinishDate, and that it inherits its PercentComplete property from the Project base class; however, there is no property defined for RemainingWork or PercentWorkComplete, although both of these values are available as fields if you manage a Project Server view (see the screenshot below). This information is not available via REST / OData either.


You should know that in the case of Project Server, the server side OM is simply a wrapper around the PSI; for example, the PercentComplete property of the Project class is defined as:

public int PercentComplete
{
    get
    {
        ProjectDataSet.TaskRow summaryTaskRow = this.SummaryTaskRow;
        if (summaryTaskRow != null && !summaryTaskRow.IsTASK_PCT_COMPNull())
            return summaryTaskRow.TASK_PCT_COMP;
        return 0;
    }
}

Client side OMs (either managed or ECMAScript) and REST calls invoke the server side OM, so in the background the good old PSI is still in action.

It seems that the developers of Project Server simply did not have enough time to map all of the fields available via PSI to the object models on the server side and the client side.

You should also know that the project properties we need are stored as task properties on the project summary task of the current project. In the Project Server database, the tasks of the published projects (and thus the project summary tasks as well) are stored in the [pub].[MSP_TASKS] table. If you run the following query (where ProjectWebApp is the name of the database and the Guid in the [PROJ_UID] filter is the ID of your project), you find some specific field values that may help to identify the summary task record of a project:

  SELECT *
  FROM [ProjectWebApp].[pub].[MSP_TASKS]
  WHERE [PROJ_UID] = 'd0ae5086-be7a-e411-9568-005056b45654'

The project summary task record – at least, based on my experimental results – matches the following conditions:

[TASK_ID] = 0
[TASK_OUTLINE_LEVEL] = 0



But as I said, we need a solution on the client side, and obviously one that does not tamper with the Project Server database. What options do we have to get the missing information?

The Project class has a property called SummaryTaskId, but even if you have this value already and would like to query the project tasks via REST (for example: http://YourProjServer/PWA/_api/ProjectServer/Projects('d0ae5086-be7a-e411-9568-005056b45654')/Tasks('FFAE5086-BE7A-E411-9568-005056B45654')) or via the client object model, the result is empty. The description of the SummaryTaskId property says: “Gets the GUID for the hidden project summary task”. Yes, it is so hidden that it is simply not included in the Tasks collection of the Project class! The Tasks property of the PublishedProject class is of type PublishedTaskCollection, and on the server side the record for the project summary task is simply filtered out when the internal Dictionary used for the storage of the Task records is initialized. If you don’t believe me, or need more details on that, see the constructor method of the Microsoft.ProjectServer.PublishedTaskCollection class below:

internal PublishedTaskCollection()
{
    Func<Dictionary<Guid, PublishedTask>> valueFactory = null;
    if (valueFactory == null)
        valueFactory = () => base.ProjectData.Task.OfType<ProjectDataSet.TaskRow>().Where<ProjectDataSet.TaskRow>(delegate (ProjectDataSet.TaskRow r) {
            if (!r.IsTASK_PARENT_UIDNull())
                return (r.TASK_PARENT_UID != r.TASK_UID);
            return true;
        }).ToDictionary<ProjectDataSet.TaskRow, Guid, PublishedTask>(r => r.TASK_UID, r => this.CreateTask(r));
    this._tasks = new Lazy<Dictionary<Guid, PublishedTask>>(valueFactory);
}

Of course, we get the same empty result if we filter the tasks for one of the special conditions we found in the database (like [TASK_OUTLINE_LEVEL] = 0):
http://YourProjServer/PWA/_api/ProjectServer/Projects('d0ae5086-be7a-e411-9568-005056b45654')/Tasks?$filter=OutlineLevel eq 0

The project reporting data contains the project summary tasks as well, so we could invoke the ProjectData OData endpoint from the client side to query the required information. The problem with this approach is that it requires extra permissions on the reporting data, and one cannot limit this permission to the summary tasks of a specific project, to summary tasks in general, or even just to tasks. If you grant your users the Access Project Server Reporting Service global permission, they can query all of the reporting data. That is surely not our goal, but you can test it if you wish.
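For completeness, a hedged sketch of such a reporting query (it assumes the ProjectData Tasks entity exposes ProjectId and TaskIsProjectSummary fields, and the server URL is a placeholder; the commented call requires the reporting permission discussed above):

```powershell
# placeholder PWA URL and the sample project ID used throughout this post
$pwaUrl = "http://YourProjServer/PWA"
$projId = "d0ae5086-be7a-e411-9568-005056b45654"

# filter the reporting Tasks entity for the project summary task
# (assumed field names from the ProjectData reporting schema)
$odataUrl = "$pwaUrl/_api/ProjectData/Tasks?" +
    "`$filter=ProjectId eq guid'$projId' and TaskIsProjectSummary eq true"

# the actual call requires the Access Project Server Reporting Service permission:
# $summaryTask = Invoke-RestMethod -Uri $odataUrl -UseDefaultCredentials
```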

Once you have the ID of the project summary task (for example via the SummaryTaskId property), the task is available via a query like this one:


When using PSI, we can access the required information via the TASK_REM_WORK and TASK_PCT_WORK_COMP fields in ProjectDataSet.TaskRow; that means rows in the Task property (of type ProjectDataSet.TaskDataTable) of the ProjectDataSet. The first row in the record set contains the information about the project summary task.

We could create our own extension for the client object model (wrapping just this piece of PSI), as I illustrated for the managed and for the ECMAScript object model as well, but that would require a lot of work, so I ignored this option for now. Instead, I’ve created a simple .NET console application utilizing the PSI (see the most important part of the code below). Unfortunately, I have not found a method that returns only a specific task of a specific project, so I had to call the ReadProjectEntities method to read all of the tasks of the project.

_projectClient = new SvcProject.ProjectClient(ENDPOINT_PROJECT, pwaUrl + "/_vti_bin/PSI/ProjectServer.svc");
_projectClient.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;

Guid projId = Guid.Parse("d0ae5086-be7a-e411-9568-005056b45654");
int taskEntityId = 2;

var projEntitiesDS = _projectClient.ReadProjectEntities(projId, taskEntityId, SvcProject.DataStoreEnum.PublishedStore);
var tasksTable = projEntitiesDS.Task;

foreach (SvcProject.ProjectDataSet.TaskRow task in tasksTable.Rows)
{
    Console.WriteLine(string.Format("TASK_OUTLINE_NUM: {0}; TASK_PCT_WORK_COMP: {1}; TASK_REM_WORK: {2}", task.TASK_OUTLINE_NUM, task.TASK_PCT_WORK_COMP, task.TASK_REM_WORK));
}

I’ve captured the request and the response using Fiddler:


Then I extended my JavaScript code with methods that assemble the request in the same format, submit it to the server, and parse the required fields out of the response.

First, I needed a helper method to format strings:

String.format = (function () {
    // The string containing the format items (e.g. "{0}")
    // always has to be the first argument.
    var result = arguments[0];

    // start with the second argument (i = 1)
    for (var i = 1; i < arguments.length; i++) {
        // "gm" = RegEx options for Global search (more than one instance)
        // and for Multiline search
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        result = result.replace(regEx, arguments[i]);
    }

    return result;
});
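As a quick sanity check, the helper can be exercised standalone; here it is restated as a self-contained function (named `format` only so it can run outside the page):

```javascript
// Self-contained restatement of the String.format helper above.
function format() {
    var result = arguments[0];
    for (var i = 1; i < arguments.length; i++) {
        // "gm": replace every occurrence of the placeholder, across lines
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        result = result.replace(regEx, arguments[i]);
    }
    return result;
}

console.log(format("{0}.{1}.{2}", 2015, 7, 23)); // "2015.7.23"
console.log(format("{0} and {0}", "again"));     // "again and again"
```

Note that thanks to the "g" option a placeholder occurring multiple times is replaced everywhere.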

In my Angular controller I defined this function to format dates:

$scope.formatDate = function (date) {
    var formattedDate = '';
    if ((typeof date != "undefined") && (date.year() > 1)) {
        formattedDate = String.format("{0}.{1}.{2}", date.year(), date.month() + 1, date.date());
    }

    return formattedDate;
}

Next, in the controller we get the ID of the project for the current PWS, read the project properties that are available via the client object model, and finally the ones that are available only via PSI:

var promiseWebProps = ProjService.getWebProps($scope);
promiseWebProps.then(function (props) {
    $scope.projectId = props.projectId;

    // read the project properties that are available via the client object model
    var promiseProjProp = ProjService.getProjectProps($scope);
    promiseProjProp.then(function (props) {
        $scope.projStartDate = moment(props.projStartDate);
        $scope.projFinishDate = moment(props.projFinishDate);
        $scope.percentComp = props.percentComp;
    }, function (errorMsg) {
        console.log("Error: " + errorMsg);
    });

    // read the project properties that are available only via PSI
    var promiseProjPropEx = ProjService.getProjectPropsEx($scope);
    promiseProjPropEx.then(function (propsEx) {
        $scope.remainingWork = Math.round(propsEx.remainingWork / 600) / 100;
        $scope.percentWorkComp = propsEx.percentWorkComp;
    }, function (errorMsg) {
        console.log("Error: " + errorMsg);
    });
}, function (errorMsg) {
    console.log("Error: " + errorMsg);
});

As you can see, the value we receive in the remainingWork property has to be divided by 600 and then by 100 (that is, by 60,000 in total) to get the value in hours, rounded to two decimal places.
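That is, the raw value appears to be expressed in thousandths of a minute (60,000 per hour); the two-step division just rounds to two decimals. A minimal sketch of the conversion used in the controller above:

```javascript
// Convert a PSI work value (apparently thousandths of a minute,
// i.e. 60000 per hour) to hours, rounded to two decimal places,
// exactly as the controller code does it.
function psiWorkToHours(remainingWork) {
    return Math.round(remainingWork / 600) / 100;
}

console.log(psiWorkToHours(60000));  // 1
console.log(psiWorkToHours(123456)); // 2.06
```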

In our custom ProjService service I’ve implemented the corresponding methods.

The project ID is stored in the property bag of the PWS in a property called MSPWAPROJUID (see this post about how to read property bags from the client object model):

this.getWebProps = function ($scope) {
    var deferred = $q.defer();

    var ctx = SP.ClientContext.get_current();

    var web = ctx.get_web();
    var props = web.get_allProperties();
    ctx.load(props);

    ctx.executeQueryAsync(
        function () {
            var allProps = props.get_fieldValues();

            deferred.resolve(
                {
                    projectId: allProps.MSPWAPROJUID
                });
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
};

Having the project ID, reading project properties via the client object model should be straightforward as well:

this.getProjectProps = function ($scope) {
    var deferred = $q.defer();

    var ctx = SP.ClientContext.get_current();

    var projContext = PS.ProjectContext.get_current();

    projContext.set_isPageUrl(ctx.get_isPageUrl);
    var proj = projContext.get_projects().getById($scope.projectId);
    projContext.load(proj, "StartDate", "FinishDate", "PercentComplete");

    projContext.executeQueryAsync(
        function () {
            deferred.resolve({
                projStartDate: proj.get_startDate(),
                projFinishDate: proj.get_finishDate(),
                percentComp: proj.get_percentComplete()
            });
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
};

Reading the ‘extra’ properties via PSI is a bit more complex. First, we assemble the request XML in the same format we captured with Fiddler when using the console application mentioned above, and post it to the server. Next, we process the response (see the code of the helper method buildXMLFromString further below), and parse the necessary properties out of the project summary task (that is, the Task node having rowOrder = 0) using XPath queries.

this.getProjectPropsEx = function () {
    var deferred = $q.defer();

    // assuming your PWA is located at /PWA
    var psiUrl = String.format("{0}//{1}/PWA/_vti_bin/PSI/ProjectServer.svc", window.location.protocol, window.location.host);

    $http({
        method: 'POST',
        url: psiUrl,
        data: '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"><s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><ReadProjectEntities xmlns="http://schemas.microsoft.com/office/project/server/webservices/"><projectUid>d0ae5086-be7a-e411-9568-005056b45654</projectUid><ProjectEntityType>2</ProjectEntityType><dataStore>PublishedStore</dataStore></ReadProjectEntities></s:Body></s:Envelope>',
        headers: {
            "Content-Type": 'text/xml; charset=utf-8',
            "SOAPAction": "http://schemas.microsoft.com/office/project/server/webservices/ReadProjectEntities"
        }
    }).success(function (data) {
        var dataAsXml = buildXMLFromString(data);
        dataAsXml.setProperty('SelectionLanguage', 'XPath');
        dataAsXml.setProperty('SelectionNamespaces', 'xmlns:pds="http://schemas.microsoft.com/office/project/server/webservices/ProjectDataSet.xsd" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"');
        var projSumTaskNode = dataAsXml.selectSingleNode("//pds:Task[@msdata:rowOrder=0]");
        var remainingWork = projSumTaskNode.selectSingleNode("pds:TASK_REM_WORK").nodeTypedValue;
        var percentWorkComp = projSumTaskNode.selectSingleNode("pds:TASK_PCT_WORK_COMP").nodeTypedValue;
        deferred.resolve(
            {
                remainingWork: remainingWork,
                percentWorkComp: percentWorkComp
            });
    })
    .error(function (data, status) {
        deferred.reject('Request failed. ' + data);
    });

    return deferred.promise;
}

These are the helper methods I used for processing the response text as XML:

function createMSXMLDocumentObject() {
    if (typeof (ActiveXObject) != "undefined") {
        var progIDs = [
                        "Msxml2.DOMDocument.6.0",
                        "Msxml2.DOMDocument.3.0",
                        "MSXML.DOMDocument"
        ];
        for (var i = 0; i < progIDs.length; i++) {
            try {
                return new ActiveXObject(progIDs[i]);
            } catch (e) { };
        }
    }

    return null;
}

function buildXMLFromString(text) {
    var xmlDoc;

    xmlDoc = createMSXMLDocumentObject();
    if (!xmlDoc) {
        alert("Cannot create XMLDocument object");
        return null;
    }

    xmlDoc.loadXML(text);

    var errorMsg = null;
    if (xmlDoc.parseError && xmlDoc.parseError.errorCode != 0) {
        errorMsg = "XML Parsing Error: " + xmlDoc.parseError.reason
                    + " at line " + xmlDoc.parseError.line
                    + " at position " + xmlDoc.parseError.linepos;
    }
    else {
        if (xmlDoc.documentElement) {
            if (xmlDoc.documentElement.nodeName == "parsererror") {
                errorMsg = xmlDoc.documentElement.childNodes[0].nodeValue;
            }
        }
        else {
            errorMsg = "XML Parsing Error!";
        }
    }

    if (errorMsg) {
        alert(errorMsg);
        return null;
    }

    return xmlDoc;
}

Having an HTML template like this one:

<div><span>% complete:</span><span>{{percentComp}}%</span></div>
<div><span>% work complete:</span><span>{{percentWorkComp}}%</span></div>
<div><span>Remaining work:</span><span>{{remainingWork}} Hours</span></div>
<div><span>Project start:</span><span>{{formatDate(projStartDate)}}</span></div>
<div><span>Project finish:</span><span>{{formatDate(projFinishDate)}}</span></div>

the result should be displayed similar to this one:


A drawback of this approach (not to mention the fact that it is pretty hacky) is that, due to the ReadProjectEntities method, all of the fields of all of the project tasks have to be downloaded to the client, although we need only a few fields of a single task, the project summary task. So it would make sense to implement some kind of caching on the client side, but that is out of the scope of this post. Still, as long as Microsoft does not provide all the project fields in the client object model, I have not found any better solution that would require a relatively small effort.
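Such client-side caching could be as simple as memoizing the returned promise per project ID. A minimal sketch (the factory name, the stand-in fetch function and the TTL value are my own choices for illustration, not part of the original service):

```javascript
// Minimal promise-memoizing cache sketch for getProjectPropsEx-style calls.
// 'fetchFn' stands in for the expensive PSI round-trip.
function createPropsCache(fetchFn, ttlMs) {
    var cache = {}; // projectId -> { expires, promise }
    return function (projectId) {
        var now = Date.now();
        var entry = cache[projectId];
        if (entry && entry.expires > now) {
            return entry.promise; // serve the cached (pending or resolved) result
        }
        var promise = fetchFn(projectId);
        cache[projectId] = { expires: now + ttlMs, promise: promise };
        return promise;
    };
}

// Example: within the TTL the wrapped function runs only once per project.
var calls = 0;
var cachedGetProps = createPropsCache(function (id) { calls++; return { id: id }; }, 5 * 60 * 1000);
cachedGetProps("d0ae5086-be7a-e411-9568-005056b45654");
cachedGetProps("d0ae5086-be7a-e411-9568-005056b45654");
console.log(calls); // 1
```

The Angular controller would then call the cached wrapper instead of hitting ProjService.getProjectPropsEx directly.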

How to process the output of the Test-SPContentDatabase cmdlet

Filed under: PowerShell, Regular expressions, SP 2013 — Tags: , , — Peter Holpar @ 22:07

The Test-SPContentDatabase PowerShell cmdlet is a handy tool if you have to check the healthiness of your web application, for example, if you are preparing an upgrade, or migrating content DBs between servers.

Since its output can be pretty long, its typical usage is to redirect the output to a text file, like:

Test-SPContentDatabase -Name YourContentDB -WebApplication http://YourWebApp > F:\data\pholpar\TestSPContentDB.txt

The problem with this approach is that the output contains a textual representation of the result. You can browse through the text, but that is problematic if the output is huge. Furthermore, it is not suitable for direct automation, for example, if you would like to create some kind of report or statistics, or to feed a tool that should fix the issues.

I’ve tried to find the source of the information displayed by the tool in the object model, and found that the text is assembled in components very deep in the object hierarchy, just above the data access layer, so we cannot get at it even via Reflection (for example, by calling private methods of internal classes). If you don’t want to tamper with the database (which is generally not recommended by Microsoft, to say the least), you have to find another way.

You can, for example, parse the values you need out of the text using PowerShell and regular expressions. Below I provide the scripts you can use to get the information for the four main types of errors returned by Test-SPContentDatabase: MissingFeature, MissingSetupFile, MissingAssembly and MissingWebPart.

First, instead of redirecting the output to a text file, we store all of the items returned by the Test-SPContentDatabase cmdlet (each item is of type SPContentDatabaseTestResult) in a variable:

$problems = Test-SPContentDatabase -Name YourContentDB -WebApplication http://YourWebApp

The items of category MissingFeature have the following properties.

Category        : MissingFeature
Error           : True
UpgradeBlocking : False
Message         : Database [YourContentDB] has reference(s) to a missing feature: Id = [75fa102a-b85d-4e7d-bd7f-00101b98e11e].
Remedy          : The feature with Id 75fa102a-b85d-4e7d-bd7f-00101b98e11e is referenced in the database [YourContentDB], but is not installed on the current farm. The missing feature may cause upgrade to fail. Please install any solution which contains the feature and restart upgrade if necessary.

We can use the Message property to parse out the name of the content database and the Id of the feature.

$problems | ? { $_.Category -eq "MissingFeature" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "Database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Feature -Value ($_.Message | Select-String -Pattern "missing feature: Id = \[(?<feature>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["feature"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nFeature: $_.Feature`r`n`r`n }
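The same extraction can be sanity-checked outside PowerShell; here is an equivalent in JavaScript, using the same patterns against a sample message mirroring the format shown above:

```javascript
// Extract the content DB name and the feature Id from a MissingFeature
// message, mirroring the Select-String patterns of the PowerShell script.
function parseMissingFeature(message) {
    var db = /Database \[([^\[]*)\]/.exec(message);
    var feature = /missing feature: Id = \[([^\[]*)\]/.exec(message);
    return {
        contentDb: db ? db[1] : null,
        feature: feature ? feature[1] : null
    };
}

var msg = "Database [YourContentDB] has reference(s) to a missing feature: Id = [75fa102a-b85d-4e7d-bd7f-00101b98e11e].";
console.log(parseMissingFeature(msg));
```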

Similarly, the items of category MissingSetupFile have this format:

Category        : MissingSetupFile
Error           : True
UpgradeBlocking : False
Message         : File [Features\PublishingLayouts\Images\gr_logo.jpg] is referenced [1] times in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this file.
Remedy          : One or more setup files are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these files.

The script for this type of items:

$problems | ? { $_.Category -eq "MissingSetupFile" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name RefCount -Value ($_.Message | Select-String -Pattern "is referenced \[(?<refcount>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["refcount"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name File -Value ($_.Message | Select-String -Pattern "File \[(?<filename>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["filename"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nReference count: $_.RefCount`r`nFile: $_.File`r`n`r`n }

The items of category MissingAssembly have this format:

Category        : MissingAssembly
Error           : True
UpgradeBlocking : False
Message         : Assembly [YourComp.Custom.EventReceiver, Version=, Culture=neutral, PublicKeyToken=8ffc7db3dc6d3b89] is referenced in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this assembly.
Remedy          : One or more assemblies are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these assemblies.

The script for this type of items:

$problems | ? { $_.Category -eq "MissingAssembly" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Assembly -Value ($_.Message | Select-String -Pattern "Assembly \[(?<assembly>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["assembly"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nAssembly: $_.Assembly`r`n`r`n }

Finally, the items of category MissingWebPart have this format:

Category        : MissingWebPart
Error           : True
UpgradeBlocking : False
Message         : WebPart class [9c7d980e-6470-94d9-db54-419b27d03880] is referenced [3] times in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this web part.
Remedy          : One or more web parts are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these web parts.

And the important attributes of these items can be parsed out via this script:

$problems | ? { $_.Category -eq "MissingWebPart" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name RefCount -Value ($_.Message | Select-String -Pattern "is referenced \[(?<refcount>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["refcount"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Id -Value ($_.Message | Select-String -Pattern "WebPart class \[(?<id>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["id"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Class -Value ($_.Message | Select-String -Pattern " \(class \[(?<class>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["class"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Assembly -Value ($_.Message | Select-String -Pattern "from assembly \[(?<assembly>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["assembly"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nReference count: $_.RefCount`r`nId: $_.Id`r`nClass: $_.Class`r`nAssembly: $_.Assembly`r`n`r`n }

July 9, 2015

Creating custom SharePoint permission levels via PowerShell

Filed under: Permissions, PowerShell, Security, SP 2010 — Tags: , , , — Peter Holpar @ 15:36

Today I had to create some custom permission levels via code; however, the best post I found on this theme included only C# code, and I had to use PowerShell. Although it is not very difficult to translate the code, it is not trivial either. So I thought I would share my “translation”, hoping somebody finds it useful.

Copy a permission level:
$contributorRole = $web.RoleDefinitions.GetByType([Microsoft.SharePoint.SPRoleType]::Contributor)
$customRole = New-Object Microsoft.SharePoint.SPRoleDefinition($contributorRole)

Add a permission:
$customRole.BasePermissions = $customRole.BasePermissions -bor [Microsoft.SharePoint.SPBasePermissions]::CreateGroups

Add multiple permissions:
$customRole.BasePermissions = $customRole.BasePermissions -bor ([Microsoft.SharePoint.SPBasePermissions]::ApplyStyleSheets -bor [Microsoft.SharePoint.SPBasePermissions]::ApproveItems)

Add all permissions:
$customRole.BasePermissions = $customRole.BasePermissions -bor [Microsoft.SharePoint.SPBasePermissions]::FullMask

Remove a permission:
$customRole.BasePermissions = $customRole.BasePermissions -band -bnot [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems

Remove multiple permissions:
$customRole.BasePermissions = $customRole.BasePermissions -band -bnot ([Microsoft.SharePoint.SPBasePermissions]::DeleteVersions -bor [Microsoft.SharePoint.SPBasePermissions]::EditListItems)

Remove all permissions:
$customRole.BasePermissions = $customRole.BasePermissions -band [Microsoft.SharePoint.SPBasePermissions]::EmptyMask

Test for a permission:
$permissionTest = ($customRole.BasePermissions -band [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems) -eq [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems

Save a permission level:
$customRole.Name = "Your custom permission"
$customRole.Description = "Description of your custom permission level"
$web.RoleDefinitions.Add($customRole)
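The bitwise pattern behind these snippets is language-neutral: `-bor` corresponds to `|`, `-band` to `&`, and `-bnot` to `~`. The same add/remove/test logic can be illustrated with plain integer flags (the flag values below are made up for the demo, not the real SPBasePermissions values):

```javascript
// Illustrative flag values only -- not the real SPBasePermissions constants.
var Perm = { EditListItems: 0x4, DeleteListItems: 0x8, CreateGroups: 0x10000 };

var mask = Perm.EditListItems | Perm.DeleteListItems; // start from a base set

mask = mask | Perm.CreateGroups;      // add a permission     (-bor)
mask = mask & ~Perm.DeleteListItems;  // remove a permission  (-band -bnot)

// test for a permission (-band), comparing against the flag itself
var canEdit = (mask & Perm.EditListItems) === Perm.EditListItems;
console.log(canEdit);                              // true
console.log((mask & Perm.DeleteListItems) !== 0);  // false
```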

July 6, 2015

Accessing and Manipulating Property Bags via the ECMAScript Client Object Model

Recently I have been working a lot with web applications implemented on the client side with JavaScript, mostly using the AngularJS library. Rather often I have to build the application logic on values stored in the property bags of the web objects, so I decided to sum up the experience I have gathered in this field.

Note: A similar blog entry discussing the same topic can be found here. It might be useful to read that one as well, but my entry includes additional info.

Note 2: The code samples in my post are borrowed from our custom AngularJS service, but the bulk of them should be reusable for any kind of JavaScript solution.

The first example shows how to retrieve the property values:

this.readSettings = function ($scope) {
    var deferred = $q.defer();

    var ctx = SP.ClientContext.get_current();

    var web = ctx.get_web();
    var props = web.get_allProperties();
    ctx.load(props);

    ctx.executeQueryAsync(
        function () {
            // you receive an error if the property is not defined for the web:
            // The property or field has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.
            var yourProperty = props.get_item("YourProperty");
            // or alternatively you can use
            //var allProps = props.get_fieldValues();
            //var yourProperty = allProps.YourProperty;
            // in this second case you can check if the property is defined …
            //if (typeof (yourProperty) == "undefined") {
            //    console.log("The property is undefined");
            //}
            //… or iterate through all the properties
            //for (var property in allProps) {
            //    if (allProps.hasOwnProperty(property)) {
            //        console.log(String.format("{0}: {1}", property, allProps[property]));
            //    }
            //}

            deferred.resolve(
                {
                    yourProperty: yourProperty
                });
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
};

If you trace the network traffic with Fiddler, you will capture the following (or similar) request-response:


As you can see, all of the properties from the bag are returned by the server, in the form of simple property name – property value pairs (see the response for the Query with Id=6).

Once you have the response from the server, there are two different methods to get the value of a specific property, and there is a very important difference between them (the get_item method vs. the get_fieldValues method). If the property is undefined in the property bag, you receive an error when you try to read the value via props.get_item("YourProperty"):

The property or field has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.

However, if you access the property via the get_fieldValues method, you don’t get the exception, even if there is no such property in the property bag. In this case the property simply remains undefined, so you can check its existence via:

var allProps = props.get_fieldValues();
if (typeof (allProps.YourProperty) == "undefined") {
  console.log("The property is undefined");
}

In this case the properties from the bag are available as simple JavaScript properties from code:


You can dump (or process) all properties defined in the bag via this code (taken from this thread):

var allProps = props.get_fieldValues();
for (var property in allProps) {
  if (allProps.hasOwnProperty(property)) {
    console.log(String.format("{0}: {1}", property, allProps[property]));
  }
}

A sample output of the script in the debugging console:


Unfortunately, I have not found any way to limit the query to specific properties: all of the properties in the property bag are returned by the request. The GetProperty method of the SPWeb class, which on the server side makes it possible to access the value of a single property, is not implemented in the client object model.

If you try this one:

var props = web.get_allProperties();
ctx.load(props, "Include(YourProperty)");

you will receive an exception complaining about an invalid request when you call the executeQueryAsync method.

Saving the value back to the web properties is a straightforward operation:

this.saveSettings = function ($scope) {
    var deferred = $q.defer();

    var ctx = new SP.ClientContext();

    var web = ctx.get_web();
    var props = web.get_allProperties();
    props.set_item("YourProperty", $scope.propValue);
    web.update();

    ctx.executeQueryAsync(
        function () {
            deferred.resolve();
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
}

July 2, 2015

Sending mails from a test SharePoint system, Updated for SharePoint 2013

Filed under: Mails, Reflection, SP 2013 — Tags: , , — Peter Holpar @ 23:48

Last year I published a post about how to redirect mails sent from your SharePoint application to a list, storing the mails as attachments for the list items. Unfortunately, the Reflection-based approach described there seems to no longer work in SharePoint 2013. My goal with the current post is to provide an alternative solution to the problem.

The first issue is with the signature of the private Send method of the MailMessage class we used to save the content of the mail into a MemoryStream: since .NET 4.5 it has been extended with an extra bool parameter (allowUnicode). This problem can easily be fixed by adding the extra parameter to the method invocation:

sendMethod.Invoke(message, BindingFlags.Instance | BindingFlags.NonPublic, null, new object[] { mailWriter, true, true }, null);

If you test the code in a simple console application, it works; the mail message is saved into the MemoryStream. However, if you try to use the same code from a SharePoint application (an application referencing the server-side SharePoint assemblies: either a console application or a web part), you get a nasty exception:

An unhandled exception of type ‘System.AccessViolationException’ occurred
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

After a short research it turned out that the reason is the x64 process the SharePoint server-side assemblies require. Since we cannot change that requirement, I had to find an alternative solution. That was easy, as the stackoverflow thread the original solution originates from already contains another option, which plays with the DeliveryMethod of the SmtpClient class, saving the mail into a file in a temporary folder in the file system. Based on that answer I’ve altered the code of my DummyMailSender.

First, I’ve defined the helper class TemporaryDirectory to clean up after the work is done:

internal class TemporaryDirectory : IDisposable
{
    public TemporaryDirectory()
    {
        DirectoryPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(DirectoryPath);
    }

    public string DirectoryPath { get; private set; }

    public void Dispose()
    {
        if (Directory.Exists(DirectoryPath))
        {
            Directory.Delete(DirectoryPath, true);
        }
    }
}

and another class Extensions that implements an extension method to save the mail into a byte array:

internal static class Extensions
{
    public static byte[] GetAsByteArray(this MailMessage m)
    {
        byte[] messageData = null;

        var smtpClient = new SmtpClient { DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory };

        using (var tempDir = new TemporaryDirectory())
        {
            smtpClient.PickupDirectoryLocation = tempDir.DirectoryPath;
            smtpClient.Send(m);
            var emlFile = Directory.GetFiles(smtpClient.PickupDirectoryLocation).FirstOrDefault();
            if (emlFile != null)
            {
                // read all file contents and trim the carriage return / new line at the end
                var messageAsText = File.ReadAllText(emlFile).Trim();
                System.Text.ASCIIEncoding enc = new System.Text.ASCIIEncoding();
                messageData = enc.GetBytes(messageAsText);
            }
        }
        return messageData;
    }
}

Having these classes that perform the bulk of the functionality, the DummyMailSender can be simplified as:

public void Send(MailMessage message)
{
    Trace.TraceInformation("Mail (subject: '{0}') sending started via DummyMailSender", message.Subject);

    byte[] messageData = message.GetAsByteArray();

    // RootWeb mustn't be disposed
    SPWeb rootWeb = _site.RootWeb;
    bool allowUnsafeOriginal = rootWeb.AllowUnsafeUpdates;
    try
    {
        rootWeb.AllowUnsafeUpdates = true;
        SPList mailList = rootWeb.Lists[Constants.DummyMailList];
        SPListItem mailItem = mailList.AddItem();
        mailItem[SPBuiltInFieldId.Title] = message.Subject;
        mailItem.Attachments.Add("Mail.eml", messageData);
        mailItem.Update();

        string mailUrl = string.Format("{0}{1}?ID={2}", _site.Url, mailList.DefaultDisplayFormUrl, mailItem.ID);
        Trace.TraceInformation("Dummy mail with subject '{0}' \"sent\" to '{1}' (cc: '{2}'), saved to '{3}'", message.Subject, message.To, message.CC, mailUrl);
    }
    catch (Exception ex)
    {
        Trace.TraceError("Error sending dummy mail: {0}\r\n{1}", ex.Message, ex.StackTrace);
    }
    finally
    {
        rootWeb.AllowUnsafeUpdates = allowUnsafeOriginal;
    }
}

This new version of DummyMailSender might not perform as well as the original, Reflection-based one, but that should not be a major issue in the test system it was designed for, and at least it is a supported solution. You also don’t have to worry about it breaking with a newer .NET Framework version.
