Second Life of a Hungarian SharePoint Geek

August 7, 2015

How to process the output of the stsadm EnumAllWebs operation

Filed under: PowerShell, SP 2010, Upgrade — Peter Holpar @ 10:48

Recently I wrote about how to process the output of the Test-SPContentDatabase PowerShell cmdlet.

If you use the EnumAllWebs operation of the stsadm command, you have similar options. In this case, however, the output is in XML format, so we can use XPath expressions to access the information we need.

First, save the result into a text file:

stsadm -o EnumAllWebs -DatabaseName YourContentDB -IncludeFeatures -IncludeWebParts > C:\Output.xml

then load its content into an XML object:

$reportXml = [Xml] (Get-Content C:\Output.xml)

To list the missing features:

Select-Xml -Xml $reportXml -XPath '//Site' | % { $siteId = $_.Node.Id; Select-Xml -Xml $_.Node -XPath 'Features/Feature[@Status="Missing"]/@Id' | % { (Get-SPFeature -Site $siteId -Identity $_.Node.Value).DisplayName } }

Note that site templates created by saving an existing site as a site template from the UI might be reported as missing features, as explained here.

To get an overview of the site templates:

Select-Xml -Xml $reportXml -XPath '//Web/@TemplateName' | % { $_.Node.Value } | Group { $_ }


To get an overview of the missing web-scoped features:

Select-Xml -Xml $reportXml -XPath '//Web/Features/Feature[@Status="Missing"]/@Id' | % { $_.Node.Value } | Group { $_ } | % { Write-Host $_.Name "- Count is" $_.Count }

Finally, to get an overview of the missing web parts:

Select-Xml -Xml $reportXml -XPath '//Web/WebParts/WebPart[@Status="Missing"]/@Id' | % { $_.Node.Value } | Group { $_ } | % { Write-Host $_.Name "- Count is" $_.Count }

Computing Hash of SharePoint Files

Filed under: PowerShell, SP 2010 — Peter Holpar @ 10:46

Recently we had to compare the master pages of hundreds of sites to find out which ones contain customizations. In practice, comparing the content of files typically means comparing hash values calculated from the file content.

But how to compute the hash of a file stored in SharePoint?

If you had the SharePoint sites mapped to the Windows File System via WebDAV, you could use the Get-FileHash cmdlet from PowerShell 4.0 (or above). In PowerShell 5.0 the Get-FileHash cmdlet can compute the hash of input streams as well, so we could use the SPFile.OpenBinaryStream method to access the file content and then compute its hash value.
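On a server with PowerShell 5.0, the combination could look like this (a minimal sketch; the site URL and master page path are placeholders):

# PowerShell 5.0 sketch: hash an SPFile via its content stream
$web = Get-SPWeb "http://YourSharePointSite"
$file = $web.GetFile("/_catalogs/masterpage/v4.master")
$stream = $file.OpenBinaryStream()
$hash = (Get-FileHash -InputStream $stream -Algorithm MD5).Hash
$stream.Dispose()
$web.Dispose()
$hash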

Since I have only PowerShell 2.0 on my SharePoint server, I had to create my own hashing solution, the Get-SPFileHash function:

function Get-SPFileHash($fileUrl)
{
  $site = New-Object Microsoft.SharePoint.SPSite $fileUrl
  $web = $site.OpenWeb()
  $file = $web.GetFile($fileUrl)
  $bytes = $file.OpenBinary()
  $md5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
  $hash = [System.BitConverter]::ToString($md5.ComputeHash($bytes))
  $web.Dispose()
  $site.Dispose()
  return $hash
}

Having this method, it is easy to create a summary of all sites, their master pages, and the hash values of the master page content:

$siteUrl = "http://YourSharePointSite"

$site = Get-SPSite $siteUrl

$site.AllWebs | % {
  $obj = New-Object PSObject
  $obj | Add-Member NoteProperty Url($_.Url)
  $obj | Add-Member NoteProperty MasterUrl($_.MasterUrl)
  $obj | Add-Member NoteProperty FileHash(Get-SPFileHash ($siteUrl + $_.MasterUrl))
  Write-Output $obj
} | Export-CSV "C:\data\MasterPageHash.csv"

You can import the resulting CSV file into Excel and process its content as you wish.

Note: If you simply double-click the CSV file created by the former script in Windows Explorer, it is not opened in the format you probably expect, with the values separated into columns; instead, the first column contains the entire line. You should first prepare the file: open it in Notepad, optionally remove the first header line (the one starting with #TYPE), and save the file again, changing the encoding from ANSI to Unicode. Next, start Excel and open the CSV file from Excel, setting the separator character to Comma on the second page of the Text Import Wizard.
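Alternatively, you can spare most of this manual preparation at export time; a minimal sketch, replacing the last line of the script above (both parameters are available in PowerShell 2.0 as well):

# suppress the "#TYPE ..." header line and write Unicode instead of ANSI
} | Export-CSV "C:\data\MasterPageHash.csv" -NoTypeInformation -Encoding Unicode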

August 2, 2015

PowerShell Scripts around the SID

Filed under: Active Directory, Migration, PowerShell, SP 2010 — Peter Holpar @ 23:38

If you ever migrated SharePoint users, you should be familiar either with the Move-SPUser cmdlet or with its predecessor, the migrateuser stsadm operation:

$sourceURL = "http://mysites.company.com"
$web = Get-SPWeb $sourceURL
$user = $web.SiteUsers["domain\jdoe"]
Move-SPUser -Identity $user -NewAlias "newDomain\john.doe" -IgnoreSID

or

stsadm -o migrateuser -oldlogin "domain\jdoe" -newlogin "newDomain\john.doe" -ignoresidhistory

As you can see, both methods rely on the SID (or rather on ignoring it), but what is this SID, and how can we read its value for our SharePoint or Active Directory users?

Each user in Active Directory (AD) has a security identifier (SID), a unique, immutable identifier (a string like S-1-5-21-3623811015-3361044348-30300820-1013) that allows the user to be renamed without affecting its other properties.

Reading the SID of a SharePoint user from PowerShell is as simple as:

$web = Get-SPWeb http://YourSharePoint.com
$user = $web.AllUsers["domain\LoginName"]
$user.Sid
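As an aside, if you only have such a SID string and need the account name, the .NET SecurityIdentifier class can translate it back without any AD module (a small sketch based on the $user variable above; it works as long as the SID is resolvable from the machine):

$sid = New-Object System.Security.Principal.SecurityIdentifier($user.Sid)
$sid.Translate([System.Security.Principal.NTAccount]).Value   # e.g. domain\LoginName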

To be able to work with Active Directory from PowerShell, you of course need the Active Directory cmdlets. If your machine has no role in AD, you should install this PowerShell module using the steps described in this post.

Once you have this module installed and imported it via "Import-Module ActiveDirectory", you can read the SID of a user in AD:

$user = Get-ADUser UserLoginNameWithoutDomain -Server YourDomainController.company.com
$user.SID.Value

Where UserLoginNameWithoutDomain is the login name of the user without the domain name, like jdoe in the case of domain\jdoe, and YourDomainController.company.com is your DC responsible for the domain of your user.

If you need the SID history from AD as well, it's a bit more complicated. In this case I suggest you read this post as well.

$ADQuery = Get-ADObject -Server YourDomainController.company.com `
        -LDAPFilter "(samAccountName=UserLoginNameWithoutDomain)" `
        -Property objectClass, samAccountName, DisplayName, `
        objectSid, sIDHistory, distinguishedname, description, whenCreated |
        Select-Object * -ExpandProperty sIDHistory
$ADQuery | % {
  Write-Host $_.samAccountName
  Write-Host Domain $_.AccountDomainSid.Value
  Write-Host SID History
  $_.sIDHistory | % {
    $_.Value
  }
  Write-Host "--------------------"
}

Tasks regarding MySites Migration and Automating them via PowerShell

Filed under: Migration, PowerShell, SP 2010 — Peter Holpar @ 22:40

Recently we performed a domain migration for a customer, where we had to migrate the MySites of the users as well. In this blog post I share the relevant PowerShell scripts we used to support the migration. In our case it was a SharePoint 2010 farm; however, for SharePoint 2013 you should have the same tasks as well, so hopefully you find the scripts useful.

The user naming convention has been changed during the migration: for example, a user John Doe, who had a login name like jdoe in the source domain (let's call it simply domain), has a new login name john.doe in the target domain (let's call it newDomain).

As you know, each MySite is a separate site collection under a site collection root (like http://mysites.company.com/personal); the last part of the site collection URL is built based on the login name (for example, it was originally http://mysites.company.com/personal/jdoe). Of course, the customer wanted the MySite URLs to reflect the changes in the login name naming conventions (it should be changed to http://mysites.company.com/personal/john.doe).

First, we had to migrate the SharePoint user and its permissions using the Move-SPUser cmdlet:

$sourceURL = "http://mysites.company.com/personal/jdoe"
$web = Get-SPWeb $sourceURL
$user = $web.SiteUsers["domain\jdoe"]
Move-SPUser -Identity $user -NewAlias "newDomain\john.doe" -IgnoreSID

We cannot simply change the URL of the site collection. We have to back it up and restore it using the new URL, as described in this post and illustrated here:

$sourceURL = "http://mysites.company.com/personal/jdoe"
$targetURL = "http://mysites.company.com/personal/john.doe"

# Location for the backup file
$backupPath = "E:\data\mysite.bak"

Try
{
   # Set the Error Action
   $ErrorActionPreference = "Stop"

  Write-Host "Backing up the Source Site Collection…" -ForegroundColor DarkGreen
  Backup-SPSite $sourceURL -Path $backupPath -force
  Write-Host "Backup Completed!`n"

  # Delete source Site Collection
  Write-Host "Deleting the Source Site Collection…"
  Remove-SPSite -Identity $sourceURL -Confirm:$false
  Write-Host "Source Site Deleted!`n"

  # Restore Site Collection to new URL
  Write-Host "Restoring to Target Site Collection…"
  Restore-SPSite $targetURL -Path $backupPath -Confirm:$false
  Write-Host "Site Restored to Target!`n"

  # Remove backup files
  Remove-Item $backupPath
}
Catch
{
  Write-Host "Operation Failed. Find the Error Message below:" -ForegroundColor Red
  Write-Host $_.Exception.Message -ForegroundColor Red
}
Finally
{
   # Reset the Error Action to Default
   $ErrorActionPreference = "Continue"
}
 
Write-host "Process Completed!"

Of course, we have to change the MySite URL in the user profile properties as well, as described here. We used the following script:

$waUrl = "http://mysites.company.com"
$wa = Get-SPWebApplication -Identity $waUrl

# Create Service Context for User Profile Manager
$context = Get-SPServiceContext $wa.Sites[0]

# Get User Profile Manager instance
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)

# Get the user profile for owner of the personal site
$up = $upm.GetUserProfile("newDomain\john.doe")
$up["PersonalSpace"].Value = "/personal/john.doe"
$up.Commit()

Each user is by default the primary site collection administrator of his own MySite. In my former posts I already discussed how we can change the primary site collection administrator with or without elevated permissions. See those posts for reference to change the account to the one from the new domain.

For example, the simplest version:

$targetURL = "http://mysites.company.com/personal/john.doe"
$siteAdmin = New-Object Microsoft.SharePoint.Administration.SPSiteAdministration($targetURL)
$siteAdmin.OwnerLoginName = "newDomain\john.doe"
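To process many users in one run, you could drive all of the steps above from a simple mapping file. A sketch, assuming a hypothetical CSV (C:\data\UserMapping.csv) with OldLogin, NewLogin, OldUrl and NewUrl columns:

Import-Csv "C:\data\UserMapping.csv" | % {
  $web = Get-SPWeb $_.OldUrl
  $user = $web.SiteUsers[$_.OldLogin]
  Move-SPUser -Identity $user -NewAlias $_.NewLogin -IgnoreSID
  $web.Dispose()
  # then back up / restore the site collection, update the profile
  # and set the site collection administrator as shown above
}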

July 23, 2015

Managing Project Server Views via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 07:17

If you would like to manage Project Server views from code, you will find very few helpful resources (if any) on the web. The object models simply do not include classes related to views (neither on the server side nor on the client side). Although the PSI contains a View service, it is intended for internal use. Of course, that intention cannot stop us from using the service at our own risk. Below I give you some useful code samples to illustrate the usage of the View service.

First of all, we create the proxy assembly, load the required Microsoft.Office.Project.Server.Library assembly in the process as well, and define some shortcuts to make it easier to reference enum and property values later on.

$pwaUrl = "http://YourProjectServer/pwa"
$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/PSI/View.asmx?wsdl") -UseDefaultCredential
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Project.Server.Library")
$ViewConstants = [Microsoft.Office.Project.Server.Library.ViewConstants]
$ViewType = [Microsoft.Office.Project.Server.Library.ViewConstants+ViewType]

If you know the unique ID of your view, it is easy to display all of the fields and security categories associated with the view:

$viewId = [Guid]"63d3499e-df27-401c-af58-ebb9607beae8"
$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields | % { $_.CONV_STRING }
$view.SecurityCategoryObjects | % { $_.WSEC_CAT_NAME }

If the view ID is unknown, you can get it based on the name and type of the view:

$viewName = "Your Report"
$viewType = $ViewType::PORTFOLIO

$views = $svcPSProxy.ReadViewSummaries()
$viewId = ($views.ViewReports | ? { $_.WVIEW_NAME -eq $viewName -and $_.WVIEW_TYPE -eq $viewType }).WVIEW_UID

You can list all of the views:

$views = $svcPSProxy.ReadViewSummaries()
$views.ViewReports | % {
  Write-Host $_.WVIEW_NAME ($_.WVIEW_TYPE -as $ViewType)
}

To change the order of the first two fields in the view:

$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields[0].WVIEW_FIELD_ORDER = 1
$view.ViewReportFields[1].WVIEW_FIELD_ORDER = 0
$svcPSProxy.UpdateView($view)

To change the order of two arbitrary fields (based on their name) in the view:

$fieldName1 = "Finish"
$fieldName2 = "Owner"
$view = $svcPSProxy.ReadView($viewId)
$field1 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName1 }
$field2 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName2 }
$field1Order = $field1.WVIEW_FIELD_ORDER
$field2Order = $field2.WVIEW_FIELD_ORDER
$field1.WVIEW_FIELD_ORDER = $field2Order
$field2.WVIEW_FIELD_ORDER = $field1Order
$svcPSProxy.UpdateView($view)

To remove a field from a view:

$fieldToRemoveName = "Ende"
$view = $svcPSProxy.ReadView($viewId)
$fieldToRemove = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldToRemoveName }
$fieldToRemove.Delete()
$svcPSProxy.UpdateView($view)

To delete the view itself:

[Void]$svcPSProxy.DeleteViewReports($viewId)

To create a new view using an existing view as a template:

$newViewName = "New View"
[Void]$svcPSProxy.CopyViewReports($viewId, $newViewName)
$newView = $svcPSProxy.ReadViewSummaries().ViewReports | ? { $_.WVIEW_NAME -eq $newViewName -and $_.WVIEW_TYPE -eq $viewType }

To list all of the fields available in a given type (in this case, for tasks):

$svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | % { $_.CONV_STRING }

To append a new field at the end of the fields in the view:

$fieldToAppendName = "% Work Complete"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }
$view = $svcPSProxy.ReadView($viewId)
$maxFieldOrder = ($view.ViewReportFields | % { $_.WVIEW_FIELD_ORDER } | measure -Maximum).Maximum

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToAppend.WFIELD_TEXTCONV_TYPE
$newField.WTABLE_UID = $fieldToAppend.WTABLE_UID
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToAppend.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToAppend.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToAppend.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $maxFieldOrder + 1
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0

$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)

To inject a new field in the view before another field having a specified name:

$fieldInjectBeforeName = "% Complete"
$fieldToInjectName = "% Work Complete"

$fieldToInject = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToInjectName }

$view = $svcPSProxy.ReadView($viewId)

$fieldInjectBeforeOrder = ($view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldInjectBeforeName }).WVIEW_FIELD_ORDER

$view.ViewReportFields | ? { $_.WVIEW_FIELD_ORDER -ge $fieldInjectBeforeOrder } | % { $_.WVIEW_FIELD_ORDER++ }

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToInject.WFIELD_UID
$newField.CONV_STRING = $fieldToInject.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToInject.WFIELD_TEXTCONV_TYPE
$newField.WTABLE_UID = $fieldToInject.WTABLE_UID
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToInject.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToInject.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToInject.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToInject.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $fieldInjectBeforeOrder
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0

$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)

The last code sample shows how to create a new Gantt-view from scratch, appending a single field and a single security category to it:

$viewRepDS = New-Object PSIProxy.PWAViewReportsDataSet
$newView = $viewRepDS.ViewReports.NewViewReportsRow()
$newView.WVIEW_UID = [Guid]::NewGuid()
$newView.WVIEW_NAME = "New Report 2"
$newView.WVIEW_DESCRIPTION = "Test report description"

$fieldToAppendName = "% Arbeit abgeschlossen"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }

$newField = $viewRepDS.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToAppend.WFIELD_TEXTCONV_TYPE
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToAppend.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToAppend.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToAppend.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $newView.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = 0
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0
$viewRepDS.ViewReportFields.AddViewReportFieldsRow($newField)

$newSecCat = $viewRepDS.SecurityCategoryObjects.NewSecurityCategoryObjectsRow()
$newSecCat.WSEC_CAT_UID = [Microsoft.Office.Project.Server.Library.PSSecurityCategory]::MyProjects
$newSecCat.WSEC_OBJ_TYPE_UID = [Microsoft.Office.Project.Server.Library.PSSecurityObjectType]::View
$newSecCat.WSEC_OBJ_UID = $newView.WVIEW_UID
$viewRepDS.SecurityCategoryObjects.AddSecurityCategoryObjectsRow($newSecCat)

$newView.WVIEW_TYPE = $ViewType::PORTFOLIO
$newView.WVIEW_DISPLAY_TYPE = $ViewConstants::ViewDISPLAYTYPE_GANTT
$newView.WGANTT_SCHEME_UID =  $ViewConstants::GanttSchemeUidProjectCenter
$newView.WVIEW_SPLITTER_POS = 250
#  Group by (see [pub].[MSP_WEB_GROUP_SCHEMES] table in Project DB for possible values)
$newView.WGROUP_SCHEME_UID = [Guid]::Empty

$viewRepDS.ViewReports.AddViewReportsRow($newView)
$svcPSProxy.UpdateView($viewRepDS)
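To verify the result, you can read the view summaries again and check the new entry (a small sketch reusing the proxy from above):

$views = $svcPSProxy.ReadViewSummaries()
$views.ViewReports | ? { $_.WVIEW_NAME -eq "New Report 2" } | % { $_.WVIEW_UID }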

July 22, 2015

Create Project Server Enterprise Custom Fields via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 22:38

Last year I already wrote about how one can manage the Project Server Enterprise Custom Fields via the Managed Client Object Model. We could transfer the code samples of that post from C# to PowerShell, but because of the limitations of the Managed Client Object Model I use the PSI interface instead in this case. What are those limitations? Not all of the properties available in PSI are exposed by the Client OM; see for example the MD_PROP_SUMM_GRAPHICAL_INDICATOR field, which we can use to set the rules of graphical indicators defined for the fields. I'll show you an example of getting and setting the indicator rules in a later post; in the current one I only show you the technique we can use to create the Enterprise Custom Fields via PSI.

One can find an existing description with a code sample in Steps 3 and 4 of this post that achieves the same goal; however, I don't like that approach for several reasons, for example, because we have to generate the proxy assembly based on the WSDL in the code itself. Instead of that I find the following code much simpler:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Project.Server.Library")

$pwaUrl = "http://YourProjectServer/pwa"

# create shortcuts
# http://stackoverflow.com/a/1049010
$PSDataType = [Microsoft.Office.Project.Server.Library.PSDataType]
$Entities = [Microsoft.Office.Project.Server.Library.EntityCollection]::Entities

$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/psi/CustomFields.asmx?wsdl") -UseDefaultCredential

$customFieldDataSet = New-Object PSIProxy.CustomFieldDataSet 

$customFieldRow = $customFieldDataSet.CustomFields.NewCustomFieldsRow()   
$customFieldRow.MD_PROP_UID = [Guid]::NewGuid()
$customFieldRow.MD_PROP_NAME = "Custom Project Field"
$customFieldRow.MD_PROP_TYPE_ENUM = $PSDataType::STRING
$customFieldRow.MD_ENT_TYPE_UID = $Entities.ProjectEntity.UniqueId
$customFieldRow.MD_PROP_IS_REQUIRED = $false
$customFieldRow.MD_PROP_IS_LEAF_NODE_ONLY = $false
$customFieldRow.MD_PROP_DESCRIPTION = "Test Field Desc."
$customFieldRow.SetMD_LOOKUP_TABLE_UIDNull()
$customFieldRow.SetMD_PROP_DEFAULT_VALUENull()
$customFieldDataSet.CustomFields.AddCustomFieldsRow($customFieldRow)

$svcPSProxy.CreateCustomFields($customFieldDataSet, $false, $true)

If you have casting issues when using the Namespace parameter of the New-WebServiceProxy cmdlet, you should read this post.
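One possible workaround for such casting issues is not to hard-code the generated namespace at all, but to read the one actually produced for the proxy at runtime (a sketch; whether it helps depends on the concrete error):

# reuse the namespace of the generated proxy type instead of the literal "PSIProxy"
$actualNamespace = $svcPSProxy.GetType().Namespace
$customFieldDataSet = New-Object ("$actualNamespace.CustomFieldDataSet")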

Creating a PowerShell-based Monitoring and Alerting System for Project Server

Filed under: PowerShell, Project Server — Peter Holpar @ 22:08

A few months ago I published a post about how to find the jobs in the Project Server queue programmatically. In the current post I will show you how you can use PowerShell to track the number of jobs in the queue, and send an e-mail alert if the count is higher than a predefined limit for a longer period. Although the example in this post is Project Server specific, you can use the same technique to create other types of alerts as well.
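Registering the script in the Task Scheduler can be done from the command line as well; a sketch (task name, script path and the 5-minute schedule are assumptions):

schtasks /Create /TN "ProjectServerQueueMonitor" /SC MINUTE /MO 5 /RU SYSTEM /TR "powershell.exe -File D:\ScheduledTasks\QueueMonitor.ps1"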

Since the PowerShell script will be run by the Windows Task Scheduler (for example on a 5-minute schedule), it was an important question how to solve the communication between the runs. For example, how can the current session find out since when the counter has been higher than the limit? Of course, if the limit is reached and we have already sent a mail, we would not like to send further mails on every run while the counter remains above the limit. But how can the current session inform the forthcoming sessions that we have already sent a mail? There are many possible solutions for this problem: we could use a database, or a file (either XML or any custom format) to persist the information between the sessions. I've chosen an even simpler approach. I create empty files (QueueLimitReached.txt and MailSent.txt), and check their existence and / or creation date to find out when the limit was reached and whether the alert mail has already been sent. If the counter goes below the limit again, I simply delete these semaphore files.

Having this background, the script itself should already be straightforward.

Add-PSSnapin "Microsoft.SharePoint.PowerShell"

$folderPath = "D:\ScheduledTasks\"
$limitReachedFileName = "QueueLimitReached.txt"
$mailSentFileName = "MailSent.txt"
$ageOfFileLimit = 15 # in minutes
$counterValueLimit = 50

$emailTo = "admins@company.com"
$emailCc = "helpdesk@company.com;projmans@company.com"
$emailSubject = "Project Server Queue Alert"
$emailBody = @"
Hi,

the count of the jobs in the Project Server Queue is very high. Please, fix the issue!

Regards,
The PowerShell Monitor
"@

$limitReachedFilePath = $folderPath + $limitReachedFileName
$mailSentFilePath = $folderPath + $mailSentFileName

function HasAlertState()
{
  $counter = Get-Counter -Counter "\ProjectServer:QueueGeneral(_Total)\Current Unprocessed Jobs"
  $counterValue = $counter.CounterSamples[0].CookedValue
  return ($counterValue -gt $counterValueLimit)
}

function SendAlert()
{
  $globalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin

  $smtpMail = New-Object Net.Mail.MailMessage
  $smtpMail.From = $globalAdmin.MailFromAddress
  $smtpMail.Subject = $emailSubject
  $smtpMail.Body = $emailBody
  $emailTo.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.To.Add($mailAddr) }
  $emailCc.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.Cc.Add($mailAddr) }
  $smtpMail.ReplyTo = New-Object Net.Mail.MailAddress($globalAdmin.MailReplyToAddress)
  $smtpMail.BodyEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)
  $smtpMail.SubjectEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)

  $smtpClient = New-Object Net.Mail.SmtpClient($globalAdmin.OutboundSmtpServer)
  $smtpClient.Send($smtpMail)
}

$alertCondition = HasAlertState

If ($alertCondition)
{
  If (Test-Path $limitReachedFilePath)
  {
    $creationTime = (Get-ChildItem $limitReachedFilePath).CreationTime
    # use TotalMinutes instead of Minutes, so the age is correct beyond a full hour as well
    $ageOfFile = ([DateTime]::Now - $creationTime).TotalMinutes
    Write-Host $ageOfFile
    If ($ageOfFile -gt $ageOfFileLimit)
    {
      Write-Host "Limit reached"
      If (-not (Test-Path $mailSentFilePath))
      {
        Write-Host "Mail has not yet been sent. Send it now."
        SendAlert
        # suppress return value via casting it to null
        [void] (New-Item -name $mailSentFileName -path $folderPath -itemType File)
      }
    }
  }
  # create a new file, if no former one exists
  else
  {
    If (-not (Test-Path $limitReachedFilePath))
    {
      # suppress return value via casting it to null
      [void] (New-Item -name $limitReachedFileName -path $folderPath -itemType File)
    }
  }
}
# delete the former files, if they exist
Else
{
  If (Test-Path $limitReachedFilePath)
  {
    Remove-Item $limitReachedFilePath
  }
  If (Test-Path $mailSentFilePath)
  {
    Remove-Item $mailSentFilePath
  }
}

In the sample we check the value of the Current Unprocessed Jobs counter of Project Server. You can easily change the limit of the job count (50) and the time period (15 minutes) in the code, or customize the addressees, subject and body of the mail. If you would like to create other types of alerts, you should simply implement your own version of the HasAlertState method.
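For example, a hypothetical HasAlertState variant that alerts on low disk space instead of the queue length (a sketch, using WMI that is available in PowerShell 2.0 as well):

function HasAlertState()
{
  # alert if drive C: has less than 10 GB free
  $disk = Get-WmiObject Win32_LogicalDisk -Filter "DeviceID='C:'"
  return ($disk.FreeSpace -lt 10GB)
}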

July 19, 2015

How to restrict the available users in a ‘Person or Group’ field to Project Server resources?

Filed under: Managed Client OM, PowerShell, PS 2013, SP 2013 — Peter Holpar @ 21:33

Assume you have a task list on your Project Web Access (PWA) site or on one of the Project Web Sites (PWS) in your Project Server, and you would like to restrict the users available in the Assigned To field (field type 'Person or Group') to users who are specified as Enterprise Resources in Project Server, which is running in the "classical" Project Server permission mode, not in the new SharePoint Server permission mode. There is no synchronization configured between Active Directory groups and Project Server resources.

You can limit a 'Person or Group' field to a specific SharePoint group, but there is no built-in solution to sync enterprise resources to SharePoint groups. In this post I show you how to achieve that via PowerShell and the managed client object models of Project Server and SharePoint.

Note: You could get the login names of the users assigned to the enterprise resources via REST as well (http://YourProjectServer/PWA/_api/ProjectServer/EnterpriseResources?$expand=User&$select=User/LoginName), but in my sample I still use the client object model of Project Server.
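For the sake of completeness, a minimal sketch of that REST variant (assuming PowerShell 3.0 or above for Invoke-RestMethod):

$url = 'http://YourProjectServer/PWA/_api/ProjectServer/EnterpriseResources?$expand=User&$select=User/LoginName'
$result = Invoke-RestMethod -Uri $url -UseDefaultCredentials -Headers @{ Accept = "application/json;odata=verbose" }
$result.d.results | % { $_.User.LoginName }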

My goal was to create a PowerShell solution, because it makes it easy to change the code on the server without any kind of compiler. I first created a C# solution, because the language elements of C# (like extension methods, generics and LINQ) help us to write compact, effective and readable code. For example, since the language elements of PowerShell do not support LINQ expressions, you cannot simply restrict the elements and their properties returned by a client object model request, as I illustrated in my former posts here, here and here. Having the working C# source code, I included it in my PowerShell script as a literal string and built the .NET application at runtime, just as I illustrated in this post. In the C# code I utilized an extension method to help automated batching of the client object model requests. More about this solution can be read here.

The logic of the synchronization is simple: we read the list of all non-generic enterprise resources, and store the login names (if the user exists) as strings in a generic list. Then we read the members of the SharePoint group we are synchronizing and store their login names as strings in another generic list. Finally, we add the missing users to the SharePoint group and remove the extra users from the group.

The final code is included here:

$pwaUrl = "http://YourProjectServer/PWA";
$grpName = "AllResourcesGroup";

$referencedAssemblies = (
    "Microsoft.SharePoint.Client, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.SharePoint.Client.Runtime, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.ProjectServer.Client, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089")

$sourceCode = @"
using System;
using System.Linq;
using System.Collections.Generic;
using Microsoft.SharePoint.Client;
using Microsoft.ProjectServer.Client;

public static class Extensions
{
    // batching to avoid
    // Microsoft.SharePoint.Client.ServerException: The request uses too many resources
    // https://msdn.microsoft.com/en-us/library/office/jj163082.aspx
    public static void ExecuteQueryBatch<T>(this ClientContext clientContext, IEnumerable<T> itemsToProcess, Action<T> action, int batchSize)
    {
        var counter = 1;

        foreach (var itemToProcess in itemsToProcess)
        {
            action(itemToProcess);
            counter++;

            if (counter > batchSize)
            {
                clientContext.ExecuteQuery();
                counter = 1;
            }
        }

        if (counter > 1)
        {
            clientContext.ExecuteQuery();
        }
    }
}

public static class Helper
{
    public static void SyncGroupMembers(string pwaUrl, string grpName)
    {
        List<string> resLogins = new List<string>();
        List<string> grpLogins = new List<string>();
        var batchSize = 20;

        using (var projectContext = new ProjectContext(pwaUrl))
        {
            var resources = projectContext.EnterpriseResources;
            projectContext.Load(resources, rs => rs.Where(r => !r.IsGeneric).Include(r => r.User.LoginName));

            projectContext.ExecuteQuery();

            resLogins.AddRange(resources.ToList().Where(r => r.User.ServerObjectIsNull == false).Select(r => r.User.LoginName.ToLower()));
        }
        using (var clientContext = new ClientContext(pwaUrl))
        {
            var web = clientContext.Web;

            var grp = web.SiteGroups.GetByName(grpName);
            clientContext.Load(grp, g => g.Users.Include(u => u.LoginName));

            clientContext.ExecuteQuery();

            grpLogins.AddRange(grp.Users.ToList().Select(u => u.LoginName.ToLower()));

            var usersToAdd = resLogins.Where(l => !grpLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToAdd,
                new Action<string>(loginName =>
                {
                    var user = web.EnsureUser(loginName);
                    grp.Users.AddUser(user);
                }),
                batchSize);

            var usersToRemove = grpLogins.Where(l => !resLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToRemove,
                new Action<string>(loginName =>
                {
                    grp.Users.RemoveByLoginName(loginName);
                }),
                batchSize);
        }
    }
}

"@
Add-Type -ReferencedAssemblies $referencedAssemblies -TypeDefinition $sourceCode -Language CSharp;

[Helper]::SyncGroupMembers($pwaUrl, $grpName)

How to avoid ‘The request uses too many resources’ when using the client object model via automated batching of commands

Filed under: Managed Client OM, SP 2013 — Peter Holpar @ 21:32

One of the reasons I prefer the client object model of SharePoint to the REST interface is its capability of batching requests.

For example, you can add multiple users to a SharePoint group using the code below, and it is sent as a single request to the server:

using (var clientContext = new ClientContext(url))
{
    var web = clientContext.Web;
    var grp = web.SiteGroups.GetByName("YourGroup");

    var usersToAdd = new List<string>() { @"i:0#.w|domain\user1", @"i:0#.w|domain\user2" };

    foreach (var loginName in usersToAdd)
    {
        var user = web.EnsureUser(loginName);
        grp.Users.AddUser(user);
    }

    clientContext.ExecuteQuery();
}

However, as the number of users you would like to add increases, you might run into issues, as the operational requests in your batch exceed the 2 MB limit.

How could we solve the problem relatively painlessly, avoiding the error while still keeping our code readable?

The good news is that this is easy to achieve using an extension method, generics, and the Action delegate. We can extend the ClientContext with an ExecuteQueryBatch method, and pass in an IEnumerable with the parameter values to be processed, the action to be performed, and the number of items that should be processed in a single batch. The method splits the parameter values into batches, calling the ExecuteQuery method on the ClientContext for each batch.

If the action you would perform on the client objects has a single parameter (as in our case above: the login name is a single parameter of type String), the ExecuteQueryBatch method can be defined as:

public static class Extensions
{
    public static void ExecuteQueryBatch<T>(this ClientContext clientContext, IEnumerable<T> itemsToProcess, Action<T> action, int batchSize)
    {
        var counter = 1;

        foreach (var itemToProcess in itemsToProcess)
        {
            action(itemToProcess);

            counter++;
            if (counter > batchSize)
            {
                clientContext.ExecuteQuery();
                counter = 1;
            }
        }

        if (counter > 1)
        {
            clientContext.ExecuteQuery();
        }
    }
}

Having the ExecuteQueryBatch method in this form, the original code can be modified as follows:

var batchSize = 20;

using (var clientContext = new ClientContext(url))
{
    var web = clientContext.Web;
    var grp = web.SiteGroups.GetByName("YourGroup");

    var usersToAdd = new List<string>() { @"i:0#.w|domain\user1", @"i:0#.w|domain\user2" /* and a lot of other logins */ };

    clientContext.ExecuteQueryBatch<string>(usersToAdd,
        new Action<string>(loginName =>
        {
            var user = web.EnsureUser(loginName);
            grp.Users.AddUser(user);
        }),
        batchSize);

    clientContext.ExecuteQuery();
}

The batch size you can use depends on the complexity of the action: the more complex the action, the smaller the batch should be. You should find the ideal value experimentally.

Actions with multiple parameters require additional overloads of the ExecuteQueryBatch extension method.

In my next post I’ll illustrate how to utilize this extension method in a practical example.

July 16, 2015

How to Read Project Properties that are not Available in the Client Object Model?

Recently I had a development task that at first sight seemed to be trivial, but quickly turned out to be rather a challenge. I had to display some basic project information on a page in our Project Web Site (PWS), like project start and finish date, remaining work and percent complete. The web page was built using client-side technologies, like the client-side object model (CSOM) for Project 2013 and the AngularJS library, and we did not plan to change the architecture to server-side code.

If you check the properties of the PublishedProject class (either on the client side, in the namespace / assembly Microsoft.ProjectServer.Client, or on the server side, in Microsoft.ProjectServer), you see that it has properties like StartDate and FinishDate, and that it inherits its PercentComplete property from the Project base class. However, there is no property for RemainingWork or PercentWorkComplete defined, although both of these values are available as fields if you manage a Project Server view (see screenshot below). This information is not available via REST / OData either.

[Screenshot: managing a Project Server view, with Remaining Work and % Work Complete among the available fields]

You should know that in the case of Project Server, the server-side OM is simply a wrapper around the PSI; for example, the PercentComplete property in the Project class is defined as:

public int PercentComplete
{
  get
  {
    ProjectDataSet.TaskRow summaryTaskRow = this.SummaryTaskRow;
    if (summaryTaskRow != null && !summaryTaskRow.IsTASK_PCT_COMPNull())
      return summaryTaskRow.TASK_PCT_COMP;
    else
      return 0;
  }
}

Client-side OMs (either managed or ECMAScript) and REST calls invoke the server-side OM, so in the background the good old PSI is still in action.

It seems that the developers of Project Server simply did not have enough time to map all of the fields available via PSI to the object models on the server side and the client side.

You should also know that the project properties we need are stored as task properties on the project summary task of the given project. In the Project Server database the tasks of the published projects (and so the project summary tasks as well) are stored in the [pub].[MSP_TASKS] table. If you run the following query (where ProjectWebApp is the name of the database and the Guid in the [PROJ_UID] filter is the ID of your project), you find some specific field values that may help to identify the summary task record of a project:

SELECT [TASK_UID]
      ,[TASK_PARENT_UID]
      ,[TASK_ID]
      ,[TASK_OUTLINE_NUM]
      ,[TASK_OUTLINE_LEVEL]
      ,[TASK_NAME]
      ,[TASK_START_DATE]
      ,[TASK_FINISH_DATE]
      ,[TASK_PCT_COMP]
      ,[TASK_PCT_WORK_COMP]
      ,[TASK_REM_WORK]
  FROM [ProjectWebApp].[pub].[MSP_TASKS]
  WHERE [PROJ_UID] = 'd0ae5086-be7a-e411-9568-005056b45654'

The project summary task record matches, at least based on my experimental results, the following conditions:

[TASK_ID] = 0

[TASK_OUTLINE_NUM] = 0

[TASK_OUTLINE_LEVEL] = 0

[TASK_UID] = [TASK_PARENT_UID]

But as I said, we need a solution on the client side, and obviously one that does not tamper with the Project Server database. What options do we have to obtain the missing information?

The Project class has a property called SummaryTaskId, but if you have this value already and would like to query the project tasks via REST (for example: http://YourProjServer/PWA/_api/ProjectServer/Projects('d0ae5086-be7a-e411-9568-005056b45654')/Tasks('FFAE5086-BE7A-E411-9568-005056B45654')) or via the client object model, the result is empty. The description of the SummaryTaskId property says: "Gets the GUID for the hidden project summary task". Yes, it is so hidden that it is simply not included in the Tasks collection of the Project class! The Tasks property of the PublishedProject class is of type PublishedTaskCollection, and on the server side the record for the project summary task is simply filtered out when initializing the internal Dictionary used for the storage of the Task records. If you don't believe me, or need more details on that, see the constructor method of the Microsoft.ProjectServer.PublishedTaskCollection class below:

internal PublishedTaskCollection()
{
    Func<Dictionary<Guid, PublishedTask>> valueFactory = null;
    if (valueFactory == null)
    {
        valueFactory = () => base.ProjectData.Task.OfType<ProjectDataSet.TaskRow>().Where<ProjectDataSet.TaskRow>(delegate (ProjectDataSet.TaskRow r) {
            if (!r.IsTASK_PARENT_UIDNull())
            {
                return (r.TASK_PARENT_UID != r.TASK_UID);
            }
            return true;
        }).ToDictionary<ProjectDataSet.TaskRow, Guid, PublishedTask>(r => r.TASK_UID, r => this.CreateTask(r));
    }
    this._tasks = new Lazy<Dictionary<Guid, PublishedTask>>(valueFactory);
}

Of course, we get the same empty result if we would like to filter the tasks for one of the special conditions we found in the database (like [TASK_OUTLINE_LEVEL] = 0):
http://YourProjServer/PWA/_api/ProjectServer/Projects('d0ae5086-be7a-e411-9568-005056b45654')/Tasks?$filter=OutlineLevel eq 0

The project reporting data contains the project summary tasks as well, so we could invoke the ProjectData OData endpoint from the client side to query the required information. The problem with this approach is that it would require extra permissions on the reporting data, and one cannot limit this permission to the summary tasks of a specific project, to summary tasks, or just to tasks at all. If you grant your users the Access Project Server Reporting Service global permission, they can query all of the reporting data. That is surely not our goal, but you can test it if you wish.

Once you have the ID of the project summary task (for example via the SummaryTaskId property), the task is available via a query like this one:

http://YourProjServer/PWA/_api/ProjectData/Tasks(ProjektID=guid'd0ae5086-be7a-e411-9568-005056b45654',TaskID=guid'FFAE5086-BE7A-E411-9568-005056B45654')

When using PSI, we can access the required information via the TASK_REM_WORK and TASK_PCT_WORK_COMP fields of the ProjectDataSet.TaskRow, that is, via the rows of the Task property (of type ProjectDataSet.TaskDataTable) of the ProjectDataSet. The first row in the record set contains the information about the project summary task.

We could create our own extensions for the client object model (wrapping around just this piece of PSI), as I illustrated for the managed and for the ECMAScript object model as well, but it would require a lot of work, so I ignored this option for now. Instead, I've created a simple .NET console application utilizing the PSI (see the most important part of the code below). Unfortunately, I have not found a method that returns only a specific task of a specific project, so I had to call the ReadProjectEntities method to read all of the tasks of the project.

_projectClient = new SvcProject.ProjectClient(ENDPOINT_PROJECT, pwaUrl + "/_vti_bin/PSI/ProjectServer.svc");
_projectClient.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;

Guid projId = Guid.Parse("d0ae5086-be7a-e411-9568-005056b45654");
int taskEntityId = 2;

var projEntitiesDS = _projectClient.ReadProjectEntities(projId, taskEntityId, SvcProject.DataStoreEnum.PublishedStore);
var tasksTable = projEntitiesDS.Task;

foreach (SvcProject.ProjectDataSet.TaskRow task in tasksTable.Rows)
{
    Console.WriteLine(string.Format("TASK_OUTLINE_NUM: {0}; TASK_PCT_WORK_COMP: {1}; TASK_REM_WORK: {2}", task.TASK_OUTLINE_NUM, task.TASK_PCT_WORK_COMP, task.TASK_REM_WORK));
}
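(As an aside, the same PSI call could presumably be issued from PowerShell via New-WebServiceProxy, following the pattern of my earlier posts; a sketch, with the PWA URL and project ID taken from the samples above:)

$pwaUrl = "http://YourProjServer/PWA"
$svcProjectProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/PSI/Project.asmx?wsdl") -UseDefaultCredential
$projEntitiesDS = $svcProjectProxy.ReadProjectEntities([Guid]"d0ae5086-be7a-e411-9568-005056b45654", 2, "PublishedStore")
# the first row of the Task table is the project summary task
$summaryTask = $projEntitiesDS.Task.Rows[0]
Write-Host $summaryTask.TASK_PCT_WORK_COMP $summaryTask.TASK_REM_WORK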

I’ve captured the request and the response using Fiddler:

[Screenshot: the ReadProjectEntities request and response captured in Fiddler]

Then I extended my JavaScript code with methods that assemble the request in the same format, submit it to the server, and then parse the required fields out of the response.

First, I needed a helper method to format strings:

String.format = (function () {
    // The string containing the format items (e.g. "{0}")
    // will and always has to be the first argument.
    var result = arguments[0];

    // start with the second argument (i = 1)
    for (var i = 1; i < arguments.length; i++) {
        // "gm" = RegEx options for Global search (more than one instance)
        // and for Multiline search
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        result = result.replace(regEx, arguments[i]);
    }

    return result;
});

In my Angular controller I defined this function to format dates:

$scope.formatDate = function (date) {
    var formattedDate = '';
    if ((typeof date != "undefined") && (date.year() > 1)) {
        formattedDate = String.format("{0}.{1}.{2}", date.year(), date.month() + 1, date.date());
    }

    return formattedDate;
}

Next, in the controller we get the ID of the project for the current PWS, then we read the project properties that are available via the client object model, and finally the ones that are available only via PSI:

var promiseWebProps = ProjService.getWebProps($scope);
promiseWebProps.then(function (props) {
    $scope.projectId = props.projectId;

    // read the project properties that are available via the client object model
    var promiseProjProp = ProjService.getProjectProps($scope);
    promiseProjProp.then(function (props) {
        $scope.projStartDate = moment(props.projStartDate);
        $scope.projFinishDate = moment(props.projFinishDate);
        $scope.percentComp = props.percentComp;
    }, function (errorMsg) {
        console.log("Error: " + errorMsg);
    });

    // read the project properties that are available only via PSI
    var promiseProjPropEx = ProjService.getProjectPropsEx($scope);
    promiseProjPropEx.then(function (propsEx) {
        $scope.remainingWork = Math.round(propsEx.remainingWork / 600) / 100;
        $scope.percentWorkComp = propsEx.percentWorkComp;
    }, function (errorMsg) {
        console.log("Error: " + errorMsg);
    });

}, function (errorMsg) {
    console.log("Error: " + errorMsg);
});

As you can see, the value we receive in the remainingWork property should be divided by 600 and then by 100 to get the value in hours (that is, by 60,000 in total: the PSI apparently returns work values in thousandths of a minute, and the intermediate rounding keeps two decimal places).

In our custom ProjService service I’ve implemented the corresponding methods.

The project ID is stored in the property bag of the PWS in a property called MSPWAPROJUID (see this post about how to read property bags from the client object model):

this.getWebProps = function ($scope) {
    var deferred = $q.defer();

    var ctx = SP.ClientContext.get_current();

    var web = ctx.get_web();
    var props = web.get_allProperties();
    ctx.load(props);

    ctx.executeQueryAsync(
        function () {
            var allProps = props.get_fieldValues();

            deferred.resolve(
                {
                    projectId: allProps.MSPWAPROJUID
                });
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
};

Having the project ID, reading project properties via the client object model should be straightforward as well:

this.getProjectProps = function ($scope) {
    var deferred = $q.defer();

    var ctx = SP.ClientContext.get_current();

    var projContext = PS.ProjectContext.get_current();

    projContext.set_isPageUrl(ctx.get_isPageUrl);
    var proj = projContext.get_projects().getById($scope.projectId);
    projContext.load(proj, "StartDate", "FinishDate", "PercentComplete");

    projContext.executeQueryAsync(
        function () {
            deferred.resolve({
                projStartDate: proj.get_startDate(),
                projFinishDate: proj.get_finishDate(),
                percentComp: proj.get_percentComplete()
            });
        },
        function (sender, args) {
            deferred.reject('Request failed. ' + args.get_message() + '\n' + args.get_stackTrace());
        }
    );

    return deferred.promise;
};

Reading the 'extra' properties via PSI is a bit more complex. First, we assemble the request XML as we captured it with Fiddler when using the console application mentioned above, and post it to the server. Next, we process the response (see the code of the helper method buildXMLFromString further below), and parse the necessary properties out of the project summary task (that is, the Task node having rowOrder = 0) using XPath queries.

this.getProjectPropsEx = function ($scope) {
    var deferred = $q.defer();

    // assuming your PWA is located at /PWA
    var psiUrl = String.format("{0}//{1}/PWA/_vti_bin/PSI/ProjectServer.svc", window.location.protocol, window.location.host);

    $http({
        method: 'POST',
        url: psiUrl,
        data: String.format('<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"><s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><ReadProjectEntities xmlns="http://schemas.microsoft.com/office/project/server/webservices/Project/"><projectUid>{0}</projectUid><ProjectEntityType>2</ProjectEntityType><dataStore>PublishedStore</dataStore></ReadProjectEntities></s:Body></s:Envelope>', $scope.projectId),
        headers: {
            "Content-Type": 'text/xml; charset=utf-8',
            "SOAPAction": "http://schemas.microsoft.com/office/project/server/webservices/Project/ReadProjectEntities"
        }
    }).success(function (data) {
        var dataAsXml = buildXMLFromString(data);
        dataAsXml.setProperty('SelectionLanguage', 'XPath');
        dataAsXml.setProperty('SelectionNamespaces', 'xmlns:pds="http://schemas.microsoft.com/office/project/server/webservices/ProjectDataSet/" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"');
        var projSumTaskNode = dataAsXml.selectSingleNode("//pds:Task[@msdata:rowOrder=0]");
        var remainingWork = projSumTaskNode.selectSingleNode("pds:TASK_REM_WORK").nodeTypedValue;
        var percentWorkComp = projSumTaskNode.selectSingleNode("pds:TASK_PCT_WORK_COMP").nodeTypedValue;
        deferred.resolve(
            {
                remainingWork: remainingWork,
                percentWorkComp: percentWorkComp
            });
    })
    .error(function (data, status) {
        deferred.reject('Request failed. ' + data);
    });

    return deferred.promise;
}

These are the helper methods I used for processing the response text as XML:

function createMSXMLDocumentObject() {
    if (typeof (ActiveXObject) != "undefined") {
        // http://blogs.msdn.com/b/xmlteam/archive/2006/10/23/using-the-right-version-of-msxml-in-internet-explorer.aspx
        var progIDs = [
                        "Msxml2.DOMDocument.6.0",
                        "Msxml2.DOMDocument.3.0",
                        "MSXML.DOMDocument"
        ];
        for (var i = 0; i < progIDs.length; i++) {
            try {
                return new ActiveXObject(progIDs[i]);
            } catch (e) { };
        }
    }

    return null;
}

function buildXMLFromString(text) {
    var xmlDoc;

    xmlDoc = createMSXMLDocumentObject();
    if (!xmlDoc) {
        alert("Cannot create XMLDocument object");
        return null;
    }

    xmlDoc.loadXML(text);

    var errorMsg = null;
    if (xmlDoc.parseError && xmlDoc.parseError.errorCode != 0) {
        errorMsg = "XML Parsing Error: " + xmlDoc.parseError.reason
                    + " at line " + xmlDoc.parseError.line
                    + " at position " + xmlDoc.parseError.linepos;
    }
    else {
        if (xmlDoc.documentElement) {
            if (xmlDoc.documentElement.nodeName == "parsererror") {
                errorMsg = xmlDoc.documentElement.childNodes[0].nodeValue;
            }
        }
        else {
            errorMsg = "XML Parsing Error!";
        }
    }

    if (errorMsg) {
        alert(errorMsg);
        return null;
    }

    return xmlDoc;
}

Having an HTML template like this one:

<div><span>% complete:</span><span>{{percentComp}}%</span></div>
<div><span>% work complete:</span><span>{{percentWorkComp}}%</span></div>
<div><span>Remaining work:</span><span>{{remainingWork}} Hours</span></div>
<div><span>Project start:</span><span>{{formatDate(projStartDate)}}</span></div>
<div><span>Project finish:</span><span>{{formatDate(projFinishDate)}}</span></div>

the result should be displayed similar to this one:

[Screenshot: the rendered template displaying the project properties]

A drawback of this approach (not to mention the fact that it is pretty hacky) is that, due to the ReadProjectEntities method, all of the fields of all of the project tasks are downloaded to the client, although we need only a few fields of a single task, the project summary task. So it would make sense to implement some kind of caching on the client side, but that is out of the scope of this post. As long as Microsoft does not provide all the project fields in the client object model, however, I have not found any better solution that requires a comparably small effort.
