Second Life of a Hungarian SharePoint Geek

October 1, 2015

Change in the User Resolution in SharePoint 2013 People Picker

Filed under: Active Directory, People Picker, PowerShell, SP 2013 — Peter Holpar @ 22:31

After a SharePoint 2010 to SharePoint 2013 migration, our users complained that, in their multi-domain Active Directory environment (I wrote about it recently), the People Picker does not resolve users the same way it did earlier: only a subset of the users was resolved, and users from a few domains were not included in the results at all.

The reason for this issue is a change in the GetTrustedDomains method of the Microsoft.SharePoint.Utilities.SPUserUtility class. In SP 2013 it includes an extra condition that checks the value of SPWebService.ContentService.PeoplePickerSearchInMultipleForests.

If you need the same behavior as in SP 2010, you should set the value of the PeoplePickerSearchInMultipleForests property to true.

You can achieve it using PowerShell:

$cs = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$cs.PeoplePickerSearchInMultipleForests = $true
$cs.Update() # persist the changed setting

or via C#:

SPWebService.ContentService.PeoplePickerSearchInMultipleForests = true;
SPWebService.ContentService.Update(); // persist the changed setting

The SharePoint Time Machine

Filed under: SP 2013, Tips & Tricks — Peter Holpar @ 22:06

Assume you have a SharePoint list with a lot of items. The list supports versioning, and you need to provide a snapshot of the items at a given time in the past.

As you know, the Versions property (of type SPListItemVersionCollection) of the SPListItem class contains the item versions. One can access a specific version via the indexer property of the collection, by the ID of the version (where ID = 512 * major version number + minor version number; for example, version 2.3 has ID 512 * 2 + 3 = 1027), or by the version number (a.k.a. label, for example, 2.3), but there is no direct support to get the actual version at a specific time in the past.
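For example, the ID-based and the label-based lookups sketched below should both return version 2.3 (a minimal PowerShell sketch, assuming $item already holds an SPListItem):

# version "2.3" has ID 512 * 2 + 3 = 1027
$versionById = $item.Versions.GetVersionFromID(1027)
$versionByLabel = $item.Versions.GetVersionFromLabel("2.3")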

To achieve my goal, I’ve implemented the GetVersionFromDate extension method, which iterates through the collection and returns the version we need based on its creation date:

public static SPListItemVersion GetVersionFromDate(this SPListItemVersionCollection versions, DateTime localDate)
{
    SPListItemVersion result = null;

    if (versions != null)
    {
        DateTime date = versions.ListItem.Web.RegionalSettings.TimeZone.LocalTimeToUTC(localDate);

        SPListItemVersion prevVersion = null;

        // versions[0] - current item version
        // versions[versions.Count - 1] - first item version created
        for (int i = versions.Count - 1; i >= 0; i--)
        {
            SPListItemVersion version = versions[i];
            if (version.Created > date)
            {
                result = prevVersion;
                break;
            }
            // if it is the last (actual) version and there is no result yet,
            // then the date specified should be greater than the creation date
            // of the last version, so we take the last version
            else if (i == 0)
            {
                result = version;
            }

            prevVersion = version;
        }
    }

    return result;
}

Note that the Created property stores the creation date as UTC, which is why we convert the local date first.

Using this method, accessing a specific version is as simple as:

SPList list = web.Lists["Your List"];
SPListItem item = list.Items.GetItemById(1);

DateTime date = DateTime.Parse("2015/06/29 13:40");
SPListItemVersion version = item.Versions.GetVersionFromDate(date);
Console.WriteLine(version["APropertyName"]);

If you iterate through the items in the list and get the version for the specified time from each of them, you already have the required snapshot.
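In PowerShell, such a snapshot loop could look like the sketch below; it assumes the extension method above was compiled into a static class named ListItemVersionExtensions (a hypothetical name), because PowerShell invokes extension methods as static methods:

$date = [DateTime]::Parse("2015/06/29 13:40")
$list.Items | % {
  # extension methods are called as static methods from PowerShell
  $version = [ListItemVersionExtensions]::GetVersionFromDate($_.Versions, $date)
  if ($version -ne $null) {
    Write-Output ($_.Title + ": " + $version["APropertyName"])
  }
}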

September 29, 2015

People Picker is very slow when searching users

Filed under: Active Directory, People Picker, SP 2010 — Peter Holpar @ 22:12

The environment of one of our customers consists of several Active Directory domains; a few of them were recently migrated from former domains.

Users of the SharePoint sites complained that when they try to look up users via the People Picker, the result is displayed only after a delay of 30-40 seconds, instead of the former 3-5 seconds.

I tried to catch the problem using Wireshark, filtering for the LDAP protocol, as described in this post. However, I found no problem with the requests / responses, except for a delay of about 30 seconds during which no request was sent using this protocol. Obviously, the sender process was waiting for a response on another protocol.

Removing the LDAP filter in Wireshark, I found these retransmission attempts:

No.     Time            Source                        Destination   Protocol Length  Info
3241    44.218621000    IP of the SharePoint Server   IP of the DC    TCP    66    53607 > msft-gc [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=256 SACK_PERM=1
3360    47.217136000    IP of the SharePoint Server   IP of the DC    TCP    66    [TCP Retransmission] 53607 > msft-gc [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=256 SACK_PERM=1
3791    53.221414000    IP of the SharePoint Server   IP of the DC    TCP    62    [TCP Retransmission] 53607 > msft-gc [SYN] Seq=0 Win=8192 Len=0 MSS=1460 SACK_PERM=1

msft-gc is an LDAP-like protocol used to query the Global Catalog (GC) in Active Directory (it uses port 3268). The retransmission timeout (RTO) of packet 3360 was 3 seconds, the RTO of packet 3791 was 9 seconds, both adding delay to the user search process.

The source IP was the address of the SharePoint server; the destination IP address was the address of a former Domain Controller (DC). The server, which had acted as the DC of a domain that was already migrated, was still online, but the DC role had already been demoted on it. The IP address of the server was still registered in DNS, so the server could be pinged, but it did not respond to LDAP requests (including msft-gc) anymore.
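A quick way to verify whether a server still listens on the GC port, without firing up Wireshark, is a plain TCP connection test (a minimal sketch that works on PowerShell 2.0 as well; the host name is a placeholder):

$gcHost = "OldDC.YourDomain.com" # placeholder for the suspected DC
$tcpClient = New-Object System.Net.Sockets.TcpClient
try {
  # 3268 is the Global Catalog (msft-gc) port
  $tcpClient.Connect($gcHost, 3268)
  Write-Host "$gcHost is listening on the GC port"
}
catch {
  Write-Host "$gcHost does not respond on the GC port"
}
finally {
  $tcpClient.Close()
}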

The entries in the ULS logs provided further evidence that there is an issue with the Global Catalog in the AD forest (see the SearchFromGC method in the stack trace below):

08/06/2015 13:26:34.08     w3wp.exe (0x66BC)                           0x9670    SharePoint Foundation             General                           72e9    Medium      Error in resolving user ‘UserName‘ : System.Runtime.InteropServices.COMException (0x8007203A): The server is not operational.       at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail)     at System.DirectoryServices.DirectoryEntry.Bind()     at System.DirectoryServices.DirectoryEntry.get_AdsObject()     at System.DirectoryServices.DirectorySearcher.FindAll(Boolean findMoreThanOne)     at Microsoft.SharePoint.WebControls.PeopleEditor.SearchFromGC(SPActiveDirectoryDomain domain, String strFilter, String[] rgstrProp, Int32 nTimeout, Int32 nSizeLimit, SPUserCollection spUsers, ArrayList& rgResults)     at Microsoft.SharePoint.Utilities.SPUserUtility.ResolveAgainstAD(String input, Boolean inputIsEmailOnly, SPActiveDirectoryDomain globalCatalog, SPPrincipalType scopes, SPUserCo…    04482a74-c00f-4005-9cd3-11f765eca7a0
08/06/2015 13:26:34.08*    w3wp.exe (0x66BC)                           0x9670    SharePoint Foundation             General                           72e9    Medium      …llection usersContainer, TimeSpan searchTimeout, String customFilter)     at Microsoft.SharePoint.Utilities.SPActiveDirectoryPrincipalResolver.ResolvePrincipal(String input, Boolean inputIsEmailOnly, SPPrincipalType scopes, SPPrincipalSource sources, SPUserCollection usersContainer)     at Microsoft.SharePoint.Utilities.SPUtility.ResolvePrincipalInternal(SPWeb web, SPWebApplication webApp, Nullable`1 urlZone, String input, SPPrincipalType scopes, SPPrincipalSource sources, SPUserCollection usersContainer, Boolean inputIsEmailOnly, Boolean alwaysAddWindowsResolver).    04482a74-c00f-4005-9cd3-11f765eca7a0

Removing the orphaned DC entry from the AD resolved the People Picker problem as well.
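To cross-check which servers are still registered as DCs before such a cleanup, the standard nltest tool can help (YourDomain.com is a placeholder):

nltest /dclist:YourDomain.com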

August 7, 2015

How to process the output of the stsadm EnumAllWebs operation

Filed under: PowerShell, SP 2010, Upgrade — Peter Holpar @ 10:48

Recently I wrote about how to process the output of the Test-SPContentDatabase PowerShell cmdlet.

If you use the EnumAllWebs operation of the stsadm command, you have similar options. However, in this case the output is in XML format, so we should use XPath expressions to access the information we need.

First, save the result into a text file:

stsadm -o EnumAllWebs -DatabaseName YourContentDB -IncludeFeatures -IncludeWebParts >> C:\Output.xml

then load its content into an XML object:

$reportXml = [Xml] (Get-Content C:\Output.xml)
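As a quick sanity check (assuming the same $reportXml object), you can count the webs included in the report:

Select-Xml -Xml $reportXml -XPath '//Web' | Measure-Object | % { $_.Count }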

To list the missing features:

Select-Xml -Xml $reportXml -XPath '//Site' | % { $siteId = $_.Node.Id; Select-Xml -Xml $_.Node -XPath 'Features/Feature[@Status="Missing"]/@Id' | % { (Get-SPFeature -Site $siteId -Identity $_.Node.Value).DisplayName } }

Note that site templates created by saving an existing site as a site template from the UI might be reported as missing features, as explained here.

To get an overview of the site templates:

Select-Xml -Xml $reportXml -XPath '//Web/@TemplateName' | % { $_.Node.Value } | Group { $_ }


To get an overview of the missing web-scoped features:

Select-Xml -Xml $reportXml -XPath '//Web/Features/Feature[@Status="Missing"]/@Id' | % { $_.Node.Value } | Group { $_ } | % { Write-Host $_.Name "- Count is" $_.Count }

Finally, to get an overview of the missing web parts:

Select-Xml -Xml $reportXml -XPath '//Web/WebParts/WebPart[@Status="Missing"]/@Id' | % { $_.Node.Value } | Group { $_ } | % { Write-Host $_.Name "- Count is" $_.Count }

Computing Hash of SharePoint Files

Filed under: PowerShell, SP 2010 — Peter Holpar @ 10:46

Recently we had to compare the master pages of hundreds of sites to find out which ones contain customizations. In practice, comparing the content of files typically means comparing hash values calculated from the file content.

But how to compute the hash of a file stored in SharePoint?

If you had the SharePoint sites mapped to the Windows file system via WebDAV, you could use the Get-FileHash cmdlet from PowerShell 4.0 (or above). In PowerShell 5.0 the Get-FileHash cmdlet is able to compute the hash of input streams as well, so we could use the SPFile.OpenBinaryStream method to access the file content and then compute its hash value.
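A minimal sketch of that PowerShell 5.0 variant (assuming $web is an already opened SPWeb; the file URL is a placeholder):

$file = $web.GetFile("http://YourSharePointSite/_catalogs/masterpage/v4.master")
$stream = $file.OpenBinaryStream()
# Get-FileHash accepts a stream via -InputStream in PowerShell 5.0
Get-FileHash -InputStream $stream -Algorithm MD5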

Since I have only PowerShell 2.0 on my SharePoint server, I had to create my own hashing solution, the Get-SPFileHash function:

function Get-SPFileHash($fileUrl)
{
  $site = New-Object Microsoft.SharePoint.SPSite $fileUrl
  $web = $site.OpenWeb()
  $file = $web.GetFile($fileUrl)
  $bytes = $file.OpenBinary()
  $md5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
  $hash = [System.BitConverter]::ToString($md5.ComputeHash($bytes))
  # dispose the SharePoint objects we opened
  $web.Dispose()
  $site.Dispose()
  return $hash
}

Having this method, it is easy to create a summary of all sites, their master pages, and the hash values of the master page content:

$siteUrl = "http://YourSharePointSite"

$site = Get-SPSite $siteUrl

$site.AllWebs | % {
  $obj = New-Object PSObject
  $obj | Add-Member -MemberType NoteProperty -Name Url -Value $_.Url
  $obj | Add-Member -MemberType NoteProperty -Name MasterUrl -Value $_.MasterUrl
  $obj | Add-Member -MemberType NoteProperty -Name FileHash -Value (Get-SPFileHash ($siteUrl + $_.MasterUrl))
  Write-Output $obj
} | Export-CSV "C:\data\MasterPageHash.csv"

You can import the resulting CSV file into Excel and process its content as you wish.

Note: if you simply double-click the CSV file created by the former script in Windows Explorer, it is not opened in the format you probably wish (values separated into columns); instead, the first column contains the entire line. You should first prepare the file: open it in Notepad, optionally remove the first header line, and save the file again, changing the encoding from ANSI to Unicode. Next, start Excel and open the CSV file from Excel, setting the separator character to Comma on the second page of the Text Import Wizard.

August 2, 2015

PowerShell Scripts around the SID

Filed under: Active Directory, Migration, PowerShell, SP 2010 — Peter Holpar @ 23:38

If you have ever migrated SharePoint users, you should be familiar either with the Move-SPUser cmdlet or with its predecessor, the migrateuser stsadm operation:

$sourceURL = ""
$web = Get-SPWeb $sourceURL
$user = $web.SiteUsers["domain\jdoe"]
Move-SPUser -Identity $user -NewAlias "newDomain\john.doe" –IgnoreSID


stsadm -o migrateuser -oldlogin "domain\jdoe" -newlogin "newDomain\john.doe" -ignoresidhistory

As you see, both methods rely on the SID (or on ignoring it), but what is this SID, and how can we read its value for our SharePoint or Active Directory users?

Each user in Active Directory (AD) has a security identifier (SID), a unique, immutable identifier (for example, S-1-5-21-3623811015-3361044348-30300820-1013), which allows the user to be renamed without affecting its other properties.

Reading the SID of a SharePoint user from PowerShell is as simple as:

$web = Get-SPWeb "http://YourSharePointSite"
$user = $web.AllUsers["domain\LoginName"]
$user.Sid

To be able to work with Active Directory from PowerShell, you need of course the Active Directory cmdlets. If your machine has no role in AD, you should install this PowerShell module using the steps described in this post.

Once you have this module installed and imported via "Import-Module ActiveDirectory", you can read the SID of a user in AD:

$user = Get-ADUser UserLoginNameWithoutDomain -Server YourDC
$user.SID

Where UserLoginNameWithoutDomain is the login name of the user without the domain name (like jdoe in case of domain\jdoe), and YourDC is the domain controller responsible for the domain of your user.

If you need the SID history from AD as well, it’s a bit more complicated. In this case I suggest you read this writing as well.

$ADQuery = Get-ADObject -Server YourDC `
        -LDAPFilter "(samAccountName=UserLoginNameWithoutDomain)" `
        -Property objectClass, samAccountName, DisplayName, `
        objectSid, sIDHistory, distinguishedname, description, whenCreated |
        Select-Object * -ExpandProperty sIDHistory
$ADQuery | % {
  Write-Host $_.samAccountName
  Write-Host Domain $_.AccountDomainSid.Value
  Write-Host SID History
  # list each SID stored in the sIDHistory attribute
  $_.sIDHistory | % {
    Write-Host $_.Value
  }
  Write-Host "--------------------"
}

Tasks regarding MySites Migration and Automating them via PowerShell

Filed under: Migration, PowerShell, SP 2010 — Peter Holpar @ 22:40

Recently we performed a domain migration for a customer, where we had to migrate the MySites of the users as well. In this blog post I share the relevant PowerShell scripts we used to support the migration. In our case it was a SharePoint 2010 farm; however, you should have the same tasks for SharePoint 2013 as well, so hopefully you find the scripts useful.

The user naming convention was changed during the migration: for example, a user John Doe who had the login name jdoe in the source domain (let’s call it simply domain) got the new login name john.doe in the target domain (let’s call it newDomain).

As you know, each MySite is a separate site collection under a common site collection root (like http://YourMySiteHost/personal/); the last part of the site collection URL is built from the login name (for example, it was originally http://YourMySiteHost/personal/jdoe). Of course, the customer wanted the MySite URLs to reflect the changes in the login name naming convention (it should be changed to http://YourMySiteHost/personal/john.doe).
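Deriving the new URL from the new login name is straightforward; a minimal sketch (the MySite host URL is a placeholder):

$mySiteHostUrl = "http://YourMySiteHost/personal/" # placeholder
$newLogin = "newDomain\john.doe"
# append the account part of the login name to the MySite root
$targetURL = $mySiteHostUrl + $newLogin.Split('\')[1]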

First, we had to migrate the SharePoint user and its permissions using the Move-SPUser cmdlet:

$sourceURL = ""
$web = Get-SPWeb $sourceURL
$user = $web.SiteUsers["domain\jdoe"]
Move-SPUser -Identity $user -NewAlias "newDomain\john.doe" –IgnoreSID

We cannot simply change the URL of the site collection. We have to back it up and restore it using the new URL, as described in this post and illustrated here:

$sourceURL = ""
$targetURL = ""

# Location for the backup file
$backupPath = "E:\data\mysite.bak"

   # Set the Error Action
   $ErrorActionPreference = "Stop"

  Write-Host "Backing up the Source Site Collection…"-ForegroundColor DarkGreen
  Backup-SPSite $sourceURL -Path $backupPath -force
  Write-Host "Backup Completed!`n"

  # Delete source Site Collection
  Write-Host "Deleting the Source Site Collection…"
  Remove-SPSite -Identity $sourceURL -Confirm:$false
  Write-Host "Source Site Deleted!`n"

  # Restore Site Collection to new URL
  Write-Host "Restoring to Target Site Collection…"
  Restore-SPSite $targetURL -Path $backupPath -Confirm:$false
  Write-Host "Site Restored to Target!`n"

  # Remove backup files
  Remove-Item $backupPath
  Write-Host "Operation Failed. Find the Error Message below:" -ForegroundColor Red
  Write-Host $_.Exception.Message -ForegroundColor Red
   # Reset the Error Action to Default
   $ErrorActionPreference = "Continue"
Write-host "Process Completed!"

Of course, we have to change the MySite URL in the user profile properties as well, as described here. We used the following script:

$waUrl = ""
$wa = Get-SPWebApplication -Identity $waUrl

# Create Service Context for User Profile Manager
$context = Get-SPServiceContext $wa.Sites[0]

# Get User Profile Manager instance
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)

# Get the user profile for owner of the personal site
$up = $upm.GetUserProfile("newDomain\john.doe")
$up["PersonalSpace"].Value = "/personal/john.doe"

Each user is by default the primary site collection administrator of his own MySite. In my former posts I already discussed how we can change the primary site collection administrator with or without elevated permissions. See those posts for reference on how to change the account to the one from the new domain.

For example, the simplest version:

$targetURL = ""
$siteAdmin = New-Object Microsoft.SharePoint.Administration.SPSiteAdministration($targetURL )
$siteAdmin.OwnerLoginName = "newDomain\john.doe"

July 23, 2015

Managing Project Server Views via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 07:17

If you would like to manage Project Server views from code, you will find very few helpful resources (if any) on the web. The object models simply do not include classes related to views (neither on the server side nor on the client side). Although the PSI contains a View service, it is intended for internal use. Of course, that intention cannot stop us from using the service at our own risk. Below I give you some useful code samples to illustrate the usage of the View service.

First of all, we create the proxy assembly, load the required Microsoft.Office.Project.Server.Library assembly into the process as well, and define some shortcuts to make it easier to reference enum and property values later on.

$pwaUrl = "http://YourProjectServer/pwa"
$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/PSI/View.asmx?wsdl") -UseDefaultCredential
$ViewConstants = [Microsoft.Office.Project.Server.Library.ViewConstants]
$ViewType = [Microsoft.Office.Project.Server.Library.ViewConstants+ViewType]

If you know the unique ID of your view, it is easy to display all of the fields and security categories associated with the view:

$viewId = [Guid]"63d3499e-df27-401c-af58-ebb9607beae8"
$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields | % { $_.CONV_STRING }
$view.SecurityCategoryObjects | % { $_.WSEC_CAT_NAME }

If the view ID is unknown, you can get it based on the name and type of the view:

$viewName = "Your Report"
$viewType = $ViewType::PORTFOLIO

$views = $svcPSProxy.ReadViewSummaries()
$viewId = ($views.ViewReports | ? { $_.WVIEW_NAME -eq $viewName -and $_.WVIEW_TYPE -eq $viewType }).WVIEW_UID

You can list all of the views:

$views = $svcPSProxy.ReadViewSummaries()
$views.ViewReports | % {
  Write-Host $_.WVIEW_NAME ($_.WVIEW_TYPE -as $ViewType)
}

To change the order of the first two fields in the view:

$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields[0].WVIEW_FIELD_ORDER = 1
$view.ViewReportFields[1].WVIEW_FIELD_ORDER = 0
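Note that ReadView only returns a local copy of the view’s dataset; your modifications are not persisted until the dataset is sent back to the View service. A hedged sketch of that call follows (the method name is an assumption based on the WSDL-generated proxy; since the service is undocumented and unsupported, verify it against your own proxy). The same call persists the swap, removal, append, and inject operations shown below:

$svcPSProxy.UpdateView($view)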

To change the order of two arbitrary fields (based on their name) in the view:

$fieldName1 = "Finish"
$fieldName2 = "Owner"
$view = $svcPSProxy.ReadView($viewId)
$field1 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName1 }
$field2 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName2 }
$field1Order = $field1.WVIEW_FIELD_ORDER
$field2Order = $field2.WVIEW_FIELD_ORDER
$field1.WVIEW_FIELD_ORDER = $field2Order
$field2.WVIEW_FIELD_ORDER = $field1Order

To remove a field from a view:

$fieldToRemoveName = "Ende"
$view = $svcPSProxy.ReadView($viewId)
$fieldToRemove = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldToRemoveName }

To delete the view itself (assuming the DeleteViewReports method exposed by the generated proxy):

$svcPSProxy.DeleteViewReports($viewId)

To create a new view using an existing view as a template:

$newViewName = "New View"
[Void]$svcPSProxy.CopyViewReports($viewId, $newViewName)
$newView = $svcPSProxy.ReadViewSummaries().ViewReports | ? { $_.WVIEW_NAME -eq $newViewName -and $_.WVIEW_TYPE -eq $viewType }

To list all of the fields available in a given type (in this case, for tasks):

$svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | % { $_.CONV_STRING }

To append a new field at the end of the fields in the view:

$fieldToAppendName = "% Work Complete"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }
$view = $svcPSProxy.ReadView($viewId)
$maxFieldOrder = ($view.ViewReportFields | % { $_.WVIEW_FIELD_ORDER } | measure -Maximum).Maximum

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WTABLE_UID = $fieldToAppend.WTABLE_UID
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $maxFieldOrder + 1
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

# add the new row to the dataset, then persist it as shown above
$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)


To inject a new field in the view before another field having a specified name:

$fieldInjectBeforeName = "% Complete"
$fieldToInjectName = "% Work Complete"

$fieldToInject = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToInjectName }

$view = $svcPSProxy.ReadView($viewId)

$fieldInjectBeforeOrder = ($view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldInjectBeforeName }).WVIEW_FIELD_ORDER

$view.ViewReportFields | ? { $_.WVIEW_FIELD_ORDER -ge $fieldInjectBeforeOrder } | % { $_.WVIEW_FIELD_ORDER++ }

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToInject.WFIELD_UID
$newField.CONV_STRING = $fieldToInject.CONV_STRING
$newField.WTABLE_UID = $fieldToInject.WTABLE_UID
$newField.WFIELD_NAME_SQL = $fieldToInject.WFIELD_NAME_SQL
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $fieldInjectBeforeOrder
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

# add the new row to the dataset, then persist it as shown above
$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)


The last code sample shows how to create a new Gantt view from scratch, appending a single field and a single security category to it:

$viewRepDS = New-Object PSIProxy.PWAViewReportsDataSet
$newView = $viewRepDS.ViewReports.NewViewReportsRow()
$newView.WVIEW_UID = [Guid]::NewGuid()
$newView.WVIEW_NAME = "New Report 2"
$newView.WVIEW_DESCRIPTION = "Test report description"

$fieldToAppendName = "% Arbeit abgeschlossen"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }

$newField = $viewRepDS.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WVIEW_UID = $newView.WVIEW_UID
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value

$newSecCat = $viewRepDS.SecurityCategoryObjects.NewSecurityCategoryObjectsRow()
$newSecCat.WSEC_CAT_UID = [Microsoft.Office.Project.Server.Library.PSSecurityCategory]::MyProjects
$newSecCat.WSEC_OBJ_TYPE_UID = [Microsoft.Office.Project.Server.Library.PSSecurityObjectType]::View
$newSecCat.WSEC_OBJ_UID = $newView.WVIEW_UID

$newView.WVIEW_TYPE = $ViewType::PORTFOLIO
$newView.WGANTT_SCHEME_UID =  $ViewConstants::GanttSchemeUidProjectCenter
#  Group by (see [pub].[MSP_WEB_GROUP_SCHEMES] table in Project DB for possible values)
$newView.WGROUP_SCHEME_UID = [Guid]::Empty

# add the rows to their tables, then create the view
# (CreateViewReports is assumed from the generated proxy; verify its exact signature)
$viewRepDS.ViewReports.AddViewReportsRow($newView)
$viewRepDS.ViewReportFields.AddViewReportFieldsRow($newField)
$viewRepDS.SecurityCategoryObjects.AddSecurityCategoryObjectsRow($newSecCat)
$svcPSProxy.CreateViewReports($viewRepDS)


July 22, 2015

Create Project Server Enterprise Custom Fields via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 22:38

Last year I already wrote about how one can manage the Project Server Enterprise Custom Fields via the Managed Client Object Model. We could transfer the code samples of that post from C# to PowerShell, but because of the limitations of the Managed Client Object Model I use the PSI interface instead in this case. What are those limitations? Not all of the properties available in PSI are exposed by the Client OM; see for example the MD_PROP_SUMM_GRAPHICAL_INDICATOR field, which we can use to set the rules of graphical indicators defined for the fields. I’ll show you an example of getting and setting the indicator rules in a later post; in the current one I only show you the technique we can use to create the Enterprise Custom Fields via PSI.

One can find an existing description with code samples in Steps 3 and 4 of this post that achieves the same goal. However, I don’t like that approach, for example because it generates the proxy assembly from the WSDL within the code itself. Instead, I find the following code much simpler:


$pwaUrl = "http://YourProjectServer/pwa"

# create shortcuts
$PSDataType = [Microsoft.Office.Project.Server.Library.PSDataType]
$Entities = [Microsoft.Office.Project.Server.Library.EntityCollection]::Entities

$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/psi/CustomFields.asmx?wsdl") -UseDefaultCredential

$customFieldDataSet = New-Object PSIProxy.CustomFieldDataSet 

$customFieldRow = $customFieldDataSet.CustomFields.NewCustomFieldsRow()   
$customFieldRow.MD_PROP_UID = [Guid]::NewGuid()
$customFieldRow.MD_PROP_NAME = "Custom Project Field"
$customFieldRow.MD_PROP_TYPE_ENUM = $PSDataType::STRING
$customFieldRow.MD_ENT_TYPE_UID = $Entities.ProjectEntity.UniqueId
$customFieldRow.MD_PROP_IS_REQUIRED = $false
$customFieldRow.MD_PROP_IS_LEAF_NODE_ONLY = $false
$customFieldRow.MD_PROP_DESCRIPTION = "Test Field Desc."

$svcPSProxy.CreateCustomFields($customFieldDataSet, $false, $true)

If you have casting issues when using the Namespace parameter of the New-WebServiceProxy cmdlet, you should read this post.

Creating a PowerShell-based Monitoring and Alerting System for Project Server

Filed under: PowerShell, Project Server — Peter Holpar @ 22:08

A few months ago I published a post about how to find the jobs in the Project Server queue programmatically. In the current post I will show you how you can use PowerShell to track the number of jobs in the queue and send an e-mail alert if the count remains higher than a predefined limit for a longer period. Although the example in this post is Project Server specific, you can use the same technique to create other types of alerts as well.

Since the PowerShell script is run by the Windows Task Scheduler (for example, on a 5-minute schedule), an important question was how to solve the communication between the runs. For example, how can the current session find out since when the counter has been higher than the limit? Of course, if the limit is reached and we have already sent a mail, we would not like to send further mails on every run while the counter stays above the limit. But how can the current session inform the forthcoming sessions that we have already sent a mail? There are many possible solutions for this problem: we could use a database or a file (either XML or any custom format) to persist the information between the sessions. I’ve chosen an even simpler approach: I create empty files (QueueLimitReached.txt and MailSent.txt) and check their existence and / or creation date to determine when the limit was reached and whether the alert mail has already been sent. If the counter goes below the limit again, I simply delete these semaphore files.

Having this background, the script itself should be already straightforward.

Add-PSSnapin "Microsoft.SharePoint.PowerShell"

$folderPath = "D:\ScheduledTasks\"
$limitReachedFileName = "QueueLimitReached.txt"
$mailSentFileName = "MailSent.txt"
$ageOfFileLimit = 15 # in minutes
$counterValueLimit = 50

$emailTo = "admin@yourcompany.com" # placeholder address
$emailCc = "teamlead@yourcompany.com;support@yourcompany.com" # placeholder addresses, separated by ;
$emailSubject = "Project Server Queue Alert"
$emailBody = @"
Hi,

the count of the jobs in the Project Server Queue is very high. Please, fix the issue!

Regards,
The PowerShell Monitor
"@

$limitReachedFilePath = $folderPath + $limitReachedFileName
$mailSentFilePath = $folderPath + $mailSentFileName

function HasAlertState()
{
  $counter = Get-Counter -Counter "\ProjectServer:QueueGeneral(_Total)\Current Unprocessed Jobs"
  $counterValue = $counter.CounterSamples[0].CookedValue
  return ($counterValue -gt $counterValueLimit)
}

function SendAlert()
{
  $globalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin

  $smtpMail = New-Object Net.Mail.MailMessage
  $smtpMail.From = $globalAdmin.MailFromAddress
  $smtpMail.Subject = $emailSubject
  $smtpMail.Body = $emailBody
  $emailTo.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.To.Add($mailAddr) }
  $emailCc.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.Cc.Add($mailAddr) }
  $smtpMail.ReplyTo = New-Object Net.Mail.MailAddress($globalAdmin.MailReplyToAddress)
  $smtpMail.BodyEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)
  $smtpMail.SubjectEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)

  $smtpClient = New-Object Net.Mail.SmtpClient($globalAdmin.OutboundSmtpServer)
  $smtpClient.Send($smtpMail)
}

$alertCondition = HasAlertState

If ($alertCondition)
{
  If (Test-Path $limitReachedFilePath)
  {
    $creationTime = (Get-ChildItem $limitReachedFilePath).CreationTime
    # use TotalMinutes instead of Minutes, so ages above an hour are handled correctly
    $ageOfFile = ([DateTime]::Now - $creationTime).TotalMinutes
    Write-Host $ageOfFile
    If ($ageOfFile -gt $ageOfFileLimit)
    {
      Write-Host "Limit reached"
      If (-not (Test-Path $mailSentFilePath))
      {
        Write-Host "Mail has not yet been sent. Send it now."
        SendAlert
        # suppress return value via casting it to null
        [void] (New-Item -Name $mailSentFileName -Path $folderPath -ItemType File)
      }
    }
  }
  # create a new file, if no former one exists
  Else
  {
    If (-not (Test-Path $limitReachedFilePath))
    {
      # suppress return value via casting it to null
      [void] (New-Item -Name $limitReachedFileName -Path $folderPath -ItemType File)
    }
  }
}
# delete the former files, if they exist
Else
{
  If (Test-Path $limitReachedFilePath)
  {
    Remove-Item $limitReachedFilePath
  }
  If (Test-Path $mailSentFilePath)
  {
    Remove-Item $mailSentFilePath
  }
}

In the sample we check the value of the Current Unprocessed Jobs counter of Project Server. You can easily change the job count limit (50) and the time period (15 minutes) in the code, or customize the addressees, subject, and body of the mail. If you would like to create other types of alerts, you should simply implement your own version of the HasAlertState method.
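Finally, to run the monitor on the 5-minute schedule mentioned above, you can register the script with the Windows Task Scheduler; a minimal sketch (the task name and script path are placeholders, and the task must run under an account with the appropriate farm permissions):

schtasks /Create /TN "ProjectServerQueueMonitor" /SC MINUTE /MO 5 /TR "powershell.exe -File D:\ScheduledTasks\Monitor.ps1"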
