Second Life of a Hungarian SharePoint Geek

July 23, 2015

Managing Project Server Views via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 07:17

If you would like to manage Project Server views from code, you will find very few helpful resources (if any) on the web. The object models simply do not include classes related to views, neither on the server side nor on the client side. Although the PSI contains a View service, it is intended for internal use. Of course, that intention cannot stop us from using the service at our own risk. Below I give you some useful code samples to illustrate the usage of the View service.

First of all, we create the proxy assembly, load the required Microsoft.Office.Project.Server.Library assembly into the process as well, and define some shortcuts that make it easier to reference enum and property values later on.

$pwaUrl = "http://YourProjectServer/pwa"
$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/PSI/View.asmx?wsdl") -UseDefaultCredential
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Project.Server.Library")
$ViewConstants = [Microsoft.Office.Project.Server.Library.ViewConstants]
$ViewType = [Microsoft.Office.Project.Server.Library.ViewConstants+ViewType]

If you know the unique ID of your view, it is easy to display all of the fields and security categories associated with it:

$viewId = [Guid]"63d3499e-df27-401c-af58-ebb9607beae8"
$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields | % { $_.CONV_STRING }
$view.SecurityCategoryObjects | % { $_.WSEC_CAT_NAME }

If the view ID is unknown, you can get it based on the name and type of the view:

$viewName = "Your Report"
$viewType = $ViewType::PORTFOLIO

$views = $svcPSProxy.ReadViewSummaries()
$viewId = ($views.ViewReports | ? { $_.WVIEW_NAME -eq $viewName -and $_.WVIEW_TYPE -eq $viewType }).WVIEW_UID

You can list all of the views:

$views = $svcPSProxy.ReadViewSummaries()
$views.ViewReports | % {
  Write-Host $_.WVIEW_NAME ($_.WVIEW_TYPE -as $ViewType)
}

To change the order of the first two fields in the view:

$view = $svcPSProxy.ReadView($viewId)
$view.ViewReportFields[0].WVIEW_FIELD_ORDER = 1
$view.ViewReportFields[1].WVIEW_FIELD_ORDER = 0
$svcPSProxy.UpdateView($view)

To change the order of two arbitrary fields (based on their name) in the view:

$fieldName1 = "Finish"
$fieldName2 = "Owner"
$view = $svcPSProxy.ReadView($viewId)
$field1 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName1 }
$field2 = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldName2 }
$field1Order = $field1.WVIEW_FIELD_ORDER
$field2Order = $field2.WVIEW_FIELD_ORDER
$field1.WVIEW_FIELD_ORDER = $field2Order
$field2.WVIEW_FIELD_ORDER = $field1Order
$svcPSProxy.UpdateView($view)

To remove a field from a view:

$fieldToRemoveName = "Ende"
$view = $svcPSProxy.ReadView($viewId)
$fieldToRemove = $view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldToRemoveName }
$fieldToRemove.Delete()
$svcPSProxy.UpdateView($view)

To delete the view itself:

[Void]$svcPSProxy.DeleteViewReports($viewId)

To create a new view using an existing view as a template:

$newViewName = "New View"
[Void]$svcPSProxy.CopyViewReports($viewId, $newViewName)
$newView = $svcPSProxy.ReadViewSummaries().ViewReports | ? { $_.WVIEW_NAME -eq $newViewName -and $_.WVIEW_TYPE -eq $viewType }

To list all of the fields available in a given type (in this case, for tasks):

$svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | % { $_.CONV_STRING }

To append a new field at the end of the fields in the view:

$fieldToAppendName = "% Work Complete"

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }
$view = $svcPSProxy.ReadView($viewId)
$maxFieldOrder = ($view.ViewReportFields | % { $_.WVIEW_FIELD_ORDER } | measure -Maximum).Maximum

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToAppend.WFIELD_TEXTCONV_TYPE
$newField.WTABLE_UID = $fieldToAppend.WTABLE_UID
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToAppend.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToAppend.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToAppend.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $maxFieldOrder + 1
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0

$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)

To inject a new field in the view before another field having a specified name:

$fieldInjectBeforeName = "% Complete"
$fieldToInjectName = "% Work Complete"

$fieldToInject = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToInjectName }

$view = $svcPSProxy.ReadView($viewId)

$fieldInjectBeforeOrder = ($view.ViewReportFields | ? { $_.CONV_STRING -eq $fieldInjectBeforeName }).WVIEW_FIELD_ORDER

$view.ViewReportFields | ? { $_.WVIEW_FIELD_ORDER -ge $fieldInjectBeforeOrder } | % { $_.WVIEW_FIELD_ORDER++ }

$newField = $view.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToInject.WFIELD_UID
$newField.CONV_STRING = $fieldToInject.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToInject.WFIELD_TEXTCONV_TYPE
$newField.WTABLE_UID = $fieldToInject.WTABLE_UID
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToInject.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToInject.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToInject.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToInject.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $view.ViewReports.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = $fieldInjectBeforeOrder
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0

$view.ViewReportFields.AddViewReportFieldsRow($newField)
$svcPSProxy.UpdateView($view)

The last code sample shows how to create a new Gantt-view from scratch, appending a single field and a single security category to it:

$viewRepDS = New-Object PSIProxy.PWAViewReportsDataSet
$newView = $viewRepDS.ViewReports.NewViewReportsRow()
$newView.WVIEW_UID = [Guid]::NewGuid()
$newView.WVIEW_NAME = "New Report 2"
$newView.WVIEW_DESCRIPTION = "Test report description"

$fieldToAppendName = "% Arbeit abgeschlossen"
$viewType = $ViewType::PORTFOLIO

$fieldToAppend = $svcPSProxy.ReadProjectFields($ViewConstants::ViewTABLE_TASK_UID ).ViewFields | ? { $_.CONV_STRING -eq $fieldToAppendName }

$newField = $viewRepDS.ViewReportFields.NewViewReportFieldsRow()

$newField.WFIELD_UID = $fieldToAppend.WFIELD_UID
$newField.CONV_STRING = $fieldToAppend.CONV_STRING
$newField.WFIELD_TEXTCONV_TYPE = $fieldToAppend.WFIELD_TEXTCONV_TYPE
$newField.WFIELD_IS_CUSTOM_FIELD = $fieldToAppend.WFIELD_IS_CUSTOM_FIELD
$newField.WFIELD_NAME_SQL = $fieldToAppend.WFIELD_NAME_SQL
$newField.WFIELD_IS_MULTI_VALUE = $fieldToAppend.WFIELD_IS_MULTI_VALUE
$newField.WFIELD_LOOKUP_TABLE_UID = $fieldToAppend.WFIELD_LOOKUP_TABLE_UID
$newField.WVIEW_UID = $newView.WVIEW_UID
$newField.WVIEW_FIELD_ORDER = 0
$newField.WVIEW_FIELD_WIDTH = 100
$newField.WVIEW_FIELD_AUTOSIZE = 1
$newField.WVIEW_FIELD_CUSTOM_LABEL = [System.DBNull]::Value
$newField.WVIEW_FIELD_IS_READ_ONLY = 0

$newSecCat = $viewRepDS.SecurityCategoryObjects.NewSecurityCategoryObjectsRow()
$newSecCat.WSEC_CAT_UID = [Microsoft.Office.Project.Server.Library.PSSecurityCategory]::MyProjects
$newSecCat.WSEC_OBJ_TYPE_UID = [Microsoft.Office.Project.Server.Library.PSSecurityObjectType]::View
$newSecCat.WSEC_OBJ_UID = $newView.WVIEW_UID
$viewRepDS.SecurityCategoryObjects.AddSecurityCategoryObjectsRow($newSecCat)

$newView.WVIEW_TYPE = $ViewType::PORTFOLIO
$newView.WVIEW_DISPLAY_TYPE = $ViewConstants::ViewDISPLAYTYPE_GANTT
$newView.WGANTT_SCHEME_UID =  $ViewConstants::GanttSchemeUidProjectCenter
$newView.WVIEW_SPLITTER_POS = 250
#  Group by (see [pub].[MSP_WEB_GROUP_SCHEMES] table in Project DB for possible values)
$newView.WGROUP_SCHEME_UID = [Guid]::Empty

$viewRepDS.ViewReports.AddViewReportsRow($newView)
$svcPSProxy.UpdateView($viewRepDS)

July 22, 2015

Create Project Server Enterprise Custom Fields via PSI from PowerShell

Filed under: ALM, PowerShell, Project Server, PSI — Peter Holpar @ 22:38

Last year I already wrote about how one can manage the Project Server Enterprise Custom Fields via the Managed Client Object Model. We could transfer the code samples of that post from C# to PowerShell, but because of the limitations of the Managed Client Object Model I use the PSI interface instead in this case. What are those limitations? Not all of the properties available in PSI are exposed by the Client OM; see for example the MD_PROP_SUMM_GRAPHICAL_INDICATOR field, which we can use to set the rules of the graphical indicators defined for the fields. I’ll show you an example of getting and setting the indicator rules in a later post; in the current one I only show you the technique we can use to create the Enterprise Custom Fields via PSI.

One can find an existing description with code samples in Steps 3 and 4 of this post that achieves the same goal; however, I don’t like that approach for several reasons, for example, because we have to generate the proxy assembly based on the WSDL in the code itself. I find the following code much simpler:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Project.Server.Library")

$pwaUrl = "http://YourProjectServer/pwa"

# create shortcuts
# http://stackoverflow.com/a/1049010
$PSDataType = [Microsoft.Office.Project.Server.Library.PSDataType]
$Entities = [Microsoft.Office.Project.Server.Library.EntityCollection]::Entities

$svcPSProxy = New-WebServiceProxy -Namespace PSIProxy -Uri ($pwaUrl + "/_vti_bin/psi/CustomFields.asmx?wsdl") -UseDefaultCredential

$customFieldDataSet = New-Object PSIProxy.CustomFieldDataSet 

$customFieldRow = $customFieldDataSet.CustomFields.NewCustomFieldsRow()   
$customFieldRow.MD_PROP_UID = [Guid]::NewGuid()
$customFieldRow.MD_PROP_NAME = "Custom Project Field"
$customFieldRow.MD_PROP_TYPE_ENUM = $PSDataType::STRING
$customFieldRow.MD_ENT_TYPE_UID = $Entities.ProjectEntity.UniqueId
$customFieldRow.MD_PROP_IS_REQUIRED = $false
$customFieldRow.MD_PROP_IS_LEAF_NODE_ONLY = $false
$customFieldRow.MD_PROP_DESCRIPTION = "Test Field Desc."
$customFieldRow.SetMD_LOOKUP_TABLE_UIDNull()
$customFieldRow.SetMD_PROP_DEFAULT_VALUENull()
$customFieldDataSet.CustomFields.AddCustomFieldsRow($customFieldRow)

$svcPSProxy.CreateCustomFields($customFieldDataSet, $false, $true)

If you have casting issues when using the Namespace parameter of the New-WebServiceProxy cmdlet, you should read this post.

Creating a PowerShell-based Monitoring and Alerting System for Project Server

Filed under: PowerShell, Project Server — Peter Holpar @ 22:08

A few months ago I published a post about how to find the jobs in the Project Server queue programmatically. In the current post I will show you how you can use PowerShell to track the number of jobs in the queue, and send an e-mail alert if the count has been higher than a predefined limit for a longer period. Although the example in this post is Project Server specific, you can use the same technique to create other types of alerts as well.

Since the PowerShell script will be run by the Windows Task Scheduler (for example, on a 5-minute schedule), an important question was how to solve the communication between the runs. For example, how can the current session find out since when the counter has been higher than the limit? Of course, if the limit is reached and we have already sent a mail, we would not like to send further mails on every run while the counter stays above the limit. But how can the current session inform the forthcoming sessions that we have already sent a mail? There are many possible solutions for this problem. We could use a database or a file (either XML or any custom format) to persist the information between the sessions. I’ve chosen an even simpler approach: I create empty files (QueueLimitReached.txt and MailSent.txt), and check their existence and / or creation date to determine when the limit was reached and whether the alert mail has already been sent. If the counter goes below the limit again, I simply delete these semaphore files.

With this background, the script itself should be straightforward.

Add-PSSnapin "Microsoft.SharePoint.PowerShell"

$folderPath = "D:\ScheduledTasks\"
$limitReachedFileName = "QueueLimitReached.txt"
$mailSentFileName = "MailSent.txt"
$ageOfFileLimit = 15 # in minutes
$counterValueLimit = 50

$emailTo = "admins@company.com"
$emailCc = "helpdesk@company.com;projmans@company.com"
$emailSubject = "Project Server Queue Alert"
$emailBody = @"
Hi,

the count of the jobs in the Project Server Queue is very high. Please, fix the issue!

Regards,
The PowerShell Monitor
"@

$limitReachedFilePath = $folderPath + $limitReachedFileName
$mailSentFilePath = $folderPath + $mailSentFileName

function HasAlertState()
{
  $counter = Get-Counter -Counter "\ProjectServer:QueueGeneral(_Total)\Current Unprocessed Jobs"
  $counterValue = $counter.CounterSamples[0].CookedValue
  return ($counterValue -gt $counterValueLimit)
}

function SendAlert()
{
  $globalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin

  $smtpMail = New-Object Net.Mail.MailMessage
  $smtpMail.From = $globalAdmin.MailFromAddress
  $smtpMail.Subject = $emailSubject
  $smtpMail.Body = $emailBody
  $emailTo.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.To.Add($mailAddr) }
  $emailCc.Split(";") | % { $mailAddr = New-Object Net.Mail.MailAddress($_); $smtpMail.Cc.Add($mailAddr) }
  $smtpMail.ReplyTo = New-Object Net.Mail.MailAddress($globalAdmin.MailReplyToAddress)
  $smtpMail.BodyEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)
  $smtpMail.SubjectEncoding = [System.Text.Encoding]::GetEncoding($globalAdmin.MailCodePage)

  $smtpClient = New-Object Net.Mail.SmtpClient($globalAdmin.OutboundSmtpServer)
  $smtpClient.Send($smtpMail)
}

$alertCondition = HasAlertState

If ($alertCondition)
{
  If (Test-Path $limitReachedFilePath)
  {
    $creationTime = (Get-ChildItem $limitReachedFilePath).CreationTime
    # TotalMinutes instead of Minutes: the latter would return only the minute component of the age
    $ageOfFile = ([DateTime]::Now - $creationTime).TotalMinutes
    Write-Host $ageOfFile
    If ($ageOfFile -gt $ageOfFileLimit)
    {
      Write-Host Limit reached
      If (-not (Test-Path $mailSentFilePath))
      {
        Write-Host Mail has not yet been sent. Send it now.
        SendAlert
        # suppress return value via casting it to null
        [void] (New-Item -name $mailSentFileName -path $folderPath -itemType File)
      }
    }
  }
  # create a new file, if no former one exists
  else
  {
    If (-not (Test-Path $limitReachedFilePath))
    {
      # suppress return value via casting it to null
      [void] (New-Item -name $limitReachedFileName -path $folderPath -itemType File)
    }
  }
}
# delete the former files, if they exist
Else
{
  If (Test-Path $limitReachedFilePath)
  {
    Remove-Item $limitReachedFilePath
  }
  If (Test-Path $mailSentFilePath)
  {
    Remove-Item $mailSentFilePath
  }
}

In the sample we check the value of the Current Unprocessed Jobs performance counter of Project Server. You can easily change the job count limit (50) and the time period (15 minutes) in the code, or customize the addressees, subject and body of the mail. If you would like to create other types of alerts, you simply have to implement your own version of the HasAlertState function.

July 19, 2015

How to restrict the available users in a ‘Person or Group’ field to Project Server resources?

Filed under: Managed Client OM, PowerShell, PS 2013, SP 2013 — Peter Holpar @ 21:33

Assume you have a task list in your Project Web Access (PWA) site or on one of the Project Web Sites (PWS) in your Project Server farm, and you would like to restrict the users available in the Assigned To field (a field of type ‘Person or Group‘) to users who are specified as Enterprise Resources in Project Server. The server is running in the “classical” Project Server permission mode, not in the new SharePoint Server permission mode, and there is no synchronization configured between Active Directory groups and Project Server resources.

You can limit a ‘Person or Group‘ field to a specific SharePoint group (see the sketch below), but there is no built-in solution to sync enterprise resources into a SharePoint group. In this post I show you how to achieve that via PowerShell and the managed client object models of Project Server and SharePoint.

Note: You could get the login names of users assigned to the enterprise resources via REST as well (http://YourProjectServer/PWA/_api/ProjectServer/EnterpriseResources?$expand=User&$select=User/LoginName), but in my sample I still use the client object model of Project Server.

My goal was to create a PowerShell solution, because that makes it easy to change the code on the server without any kind of compiler. I first created a C# solution, because the language elements of C# (like extension methods, generics and LINQ) help us to write compact, effective and readable code. Since the language elements of PowerShell do not support LINQ expressions, for example, you cannot simply restrict the elements and their properties returned by a client object model request, as I illustrated in my former posts here, here and here. Having the working C# source code, I included it in my PowerShell script as a literal string and built the .NET assembly at runtime, just as I illustrated in this post. In the C# code I utilized an extension method to support the automated batching of the client object model requests. More about this solution can be read here.

The logic of the synchronization is simple: we read the list of all non-generic enterprise resources, and store the login names (if the user exists) as strings in a generic list. Then we read the members of the SharePoint group we are synchronizing and store their login names as strings in another generic list. Finally, we add the missing users to the SharePoint group and remove the extra users from the group.

The final code is included here:

$pwaUrl = "http://YourProjectServer/PWA";
$grpName = "AllResourcesGroup";

$referencedAssemblies = (
    "Microsoft.SharePoint.Client, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.SharePoint.Client.Runtime, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "Microsoft.ProjectServer.Client, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c",
    "System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089")

$sourceCode = @"
using System;
using System.Linq;
using System.Collections.Generic;
using Microsoft.SharePoint.Client;
using Microsoft.ProjectServer.Client;

public static class Extensions
{
    // batching to avoid
    // Microsoft.SharePoint.Client.ServerException: The request uses too many resources
    // https://msdn.microsoft.com/en-us/library/office/jj163082.aspx
    public static void ExecuteQueryBatch<T>(this ClientContext clientContext, IEnumerable<T> itemsToProcess, Action<T> action, int batchSize)
    {
        var counter = 1;

        foreach (var itemToProcess in itemsToProcess)
        {
            action(itemToProcess);
            counter++;

            if (counter > batchSize)
            {
                clientContext.ExecuteQuery();
                counter = 1;
            }
        }

        if (counter > 1)
        {
            clientContext.ExecuteQuery();
        }
    }
}

public static class Helper
{
    public static void SyncGroupMembers(string pwaUrl, string grpName)
    {
        List<string> resLogins = new List<string>();
        List<string> grpLogins = new List<string>();
        var batchSize = 20;

        using (var projectContext = new ProjectContext(pwaUrl))
        {
            var resources = projectContext.EnterpriseResources;
            projectContext.Load(resources, rs => rs.Where(r => !r.IsGeneric).Include(r => r.User.LoginName));

            projectContext.ExecuteQuery();

            resLogins.AddRange(resources.ToList().Where(r => r.User.ServerObjectIsNull == false).Select(r => r.User.LoginName.ToLower()));
        }
        using (var clientContext = new ClientContext(pwaUrl))
        {
            var web = clientContext.Web;

            var grp = web.SiteGroups.GetByName(grpName);
            clientContext.Load(grp, g => g.Users.Include(u => u.LoginName));

            clientContext.ExecuteQuery();

            grpLogins.AddRange(grp.Users.ToList().Select(u => u.LoginName.ToLower()));

            var usersToAdd = resLogins.Where(l => !grpLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToAdd,
                new Action<string>(loginName =>
                {
                    var user = web.EnsureUser(loginName);
                    grp.Users.AddUser(user);
                }),
                batchSize);

            var usersToRemove = grpLogins.Where(l => !resLogins.Contains(l));
            clientContext.ExecuteQueryBatch<string>(usersToRemove,
                new Action<string>(loginName =>
                {
                    grp.Users.RemoveByLoginName(loginName);
                }),
                batchSize);
        }
    }
}

"@
Add-Type -ReferencedAssemblies $referencedAssemblies -TypeDefinition $sourceCode -Language CSharp;

[Helper]::SyncGroupMembers($pwaUrl, $grpName)

July 16, 2015

How to process the output of the Test-SPContentDatabase cmdlet

Filed under: PowerShell, Regular expressions, SP 2013 — Peter Holpar @ 22:07

The Test-SPContentDatabase PowerShell cmdlet is a handy tool if you have to check the health of your web application, for example, if you are preparing an upgrade or migrating content DBs between servers.

Since its output can be pretty long, its typical usage is to redirect the output to a text file, like:

Test-SPContentDatabase -Name YourContentDB -WebApplication http://YourWebApp > F:\data\pholpar\TestSPContentDB.txt

The problem with this approach is that the output contains a textual representation of the result. You can browse through the text, but that is problematic if the output is huge. Furthermore, it is not suitable for direct automation, for example, if you would like to create some kind of report or statistics, or to feed a tool that should fix the issues.

I’ve tried to find the source of the information displayed by the tool in the object model, and found that the text is assembled in components very deep in the object hierarchy, just above the data access layer, so we cannot get at it via Reflection (for example, by calling private methods of internal classes) either. If you don’t want to tamper with the database (which is generally not recommended by Microsoft, to say the least), you have to find another way.

You can parse out, for example, the values you need from the text using PowerShell and regular expressions. In my post below I provide you with the scripts you can use to get the information for the four main types of errors returned by Test-SPContentDatabase: MissingFeature, MissingSetupFile, MissingAssembly and MissingWebPart.

First, instead of redirecting the output to a text file, we store all of the items returned by the Test-SPContentDatabase cmdlet (each item is of type SPContentDatabaseTestResult) in a variable:

$problems = Test-SPContentDatabase -Name YourContentDB -WebApplication http://YourWebApp

The items of category MissingFeature have the following properties.

Category        : MissingFeature
Error           : True
UpgradeBlocking : False
Message         : Database [YourContentDB] has reference(s) to a missing feature: Id = [75fa102a-b85d-4e7d-bd7f-00101b98e11e].
Remedy          : The feature with Id 75fa102a-b85d-4e7d-bd7f-00101b98e11e is referenced in the database [YourContentDB], but is not installed on the current farm. The missing feature may cause upgrade to fail. Please install any solution which contains the feature and restart upgrade if necessary.

We can use the Message property to parse out the name of the content database and the Id of the feature.

$problems | ? { $_.Category -eq "MissingFeature" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "Database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Feature -Value ($_.Message | Select-String -Pattern "missing feature: Id = \[(?<feature>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["feature"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nFeature: $_.Feature`r`n`r`n }

Similarly, the items of category MissingSetupFile have this format:

Category        : MissingSetupFile
Error           : True
UpgradeBlocking : False
Message         : File [Features\PublishingLayouts\Images\gr_logo.jpg] is referenced [1] times in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this file.
Remedy          : One or more setup files are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these files.

The script for this type of items:

$problems | ? { $_.Category -eq "MissingSetupFile" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name RefCount -Value ($_.Message | Select-String -Pattern "is referenced \[(?<refcount>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["refcount"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name File -Value ($_.Message | Select-String -Pattern "File \[(?<filename>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["filename"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nReference count: $_.RefCount`r`nFile: $_.File`r`n`r`n }

The items of category MissingAssembly have this format:

Category        : MissingAssembly
Error           : True
UpgradeBlocking : False
Message         : Assembly [YourComp.Custom.EventReceiver, Version=1.0.0.0, Culture=neutral, PublicKeyToken=8ffc7db3dc6d3b89] is referenced in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this assembly.
Remedy          : One or more assemblies are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these assemblies.

The script for this type of items:

$problems | ? { $_.Category -eq "MissingAssembly" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Assembly -Value ($_.Message | Select-String -Pattern "Assembly \[(?<assembly>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["assembly"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nAssembly: $_.Assembly`r`n`r`n }

Finally, the items of category MissingWebPart have the format:

Category        : MissingWebPart
Error           : True
UpgradeBlocking : False
Message         : WebPart class [9c7d980e-6470-94d9-db54-419b27d03880] is referenced [3] times in the database [YourContentDB], but is not installed on the current farm. Please install any feature/solution which contains this web part.
Remedy          : One or more web parts are referenced in the database [YourContentDB], but are not installed on the current farm. Please install any feature or solution which contains these web parts.

And the important attributes of these items can be parsed out via this script:

$problems | ? { $_.Category -eq "MissingWebPart" } | % {
  $obj = New-Object PSObject
  Add-Member -InputObject $obj -MemberType NoteProperty -Name ContentDb -Value ($_.Message | Select-String -Pattern "in the database \[(?<db>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["db"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name RefCount -Value ($_.Message | Select-String -Pattern "is referenced \[(?<refcount>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["refcount"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Id -Value ($_.Message | Select-String -Pattern "WebPart class \[(?<id>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["id"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Class -Value ($_.Message | Select-String -Pattern " \(class \[(?<class>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["class"].Value })
  Add-Member -InputObject $obj -MemberType NoteProperty -Name Assembly -Value ($_.Message | Select-String -Pattern "from assembly \[(?<assembly>[^\[]*)\]" | Select -Expand Matches | % {$_.Groups["assembly"].Value })
  return $obj
} | % { Write-Host ContentDb: $_.ContentDb`r`nReference count: $_.RefCount`r`nId: $_.Id`r`nClass: $_.Class`r`nAssembly: $_.Assembly`r`n`r`n }

July 9, 2015

Creating custom SharePoint permission levels via PowerShell

Filed under: Permissions, PowerShell, Security, SP 2010 — Peter Holpar @ 15:36

Today I had to create some custom permission levels via code; however, the best post I found on this theme included only C# code, and I had to use PowerShell. Although it is not very difficult to translate the code, it is not trivial either. So I thought I’d share my “translation”, hoping somebody finds it useful.

Copy a permission level:
$contributorRole = $web.RoleDefinitions.GetByType([Microsoft.SharePoint.SPRoleType]::Contributor)
$customRole = New-Object Microsoft.SharePoint.SPRoleDefinition($contributorRole)

Add a permission:
$customRole.BasePermissions = $customRole.BasePermissions -bor [Microsoft.SharePoint.SPBasePermissions]::CreateGroups

Add multiple permissions:
$customRole.BasePermissions = $customRole.BasePermissions -bor ([Microsoft.SharePoint.SPBasePermissions]::ApplyStyleSheets -bor [Microsoft.SharePoint.SPBasePermissions]::ApproveItems)

Add all permissions:
$customRole.BasePermissions = $customRole.BasePermissions -bor [Microsoft.SharePoint.SPBasePermissions]::FullMask

Remove a permission:
$customRole.BasePermissions = $customRole.BasePermissions -band -bnot [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems

Remove multiple permissions:
$customRole.BasePermissions = $customRole.BasePermissions -band -bnot ([Microsoft.SharePoint.SPBasePermissions]::DeleteVersions -bor [Microsoft.SharePoint.SPBasePermissions]::EditListItems)

Remove all permissions:
$customRole.BasePermissions = $customRole.BasePermissions -band [Microsoft.SharePoint.SPBasePermissions]::EmptyMask

Test for a permission:
$permissionTest = ($customRole.BasePermissions -band [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems) -eq [Microsoft.SharePoint.SPBasePermissions]::DeleteListItems

Save a permission level:
$customRole.Name = "Your custom permission"
$customRole.Description = "Description of your custom permission level"
$web.RoleDefinitions.Add($customRole)
$web.Update()

April 8, 2015

Automating the Deployment of a Customized Project Web Site Template via PowerShell and the Managed Client Object Model

Filed under: ALM, Managed Client OM, PowerShell, Project Server — Peter Holpar @ 21:45

Assume you have created a customized web site template for your enterprise project type in the development environment as described here, and now you would like to deploy it to the test farm. Of course, you can manually delete the former site template, upload the new one, re-configure it to be the associated web site template for your enterprise project type, and finally re-create your test project (that means checking in and deleting the existing one, and creating it again using the new template), but this procedure is boring, cumbersome and, as any human-based process, rather error-prone.

Why not automate this step as well?

I’ve created a PowerShell script that performs the steps outlined above. The first steps (deleting the former version of the site template and uploading the new one) can be done with native PowerShell cmdlets, but the remaining, Project Server related tasks require the Managed Client Object Model, so we load the necessary assemblies into the process as well.

First we get a list of all projects and a list of all enterprise project types, then query for the right ones on the “client side”.

Note: Although PowerShell does not natively support .NET extension methods (like the Where and Include methods of the client object model), we could restrict the items returned by these queries to include really only the item we need (see the solution here), and include only the properties we need (as described here). As the item count of the projects and enterprise project types is not significant, and we have to use the script on the server itself due to the SharePoint cmdlets anyway, it makes no sense in this case to limit the network traffic via these tricks.

Next, we update the web site template setting (the WorkspaceTemplateName property) of the enterprise project type. We need this step because the original value was reset to the default when we deleted the original site template before re-uploading it.

If the test project is found, we delete it (after checking it in, if it was checked out), and create it again using the updated template.

Since these last steps (project check-in, deletion, and creation) are all queue-based operations, we should use the WaitForQueue method to be sure the former operation is completed before we start the next step.

$pwaUrl = "http://YourProjectServer/PWA/"
$solutionName = "YourSiteTemplate"
$wspFileName = $solutionName + ".wsp"
$timeoutSeconds = 1000
$projName = "TestProj"

# English
$projType = "Enterprise Project"
$pwaLcid = 1033
# German
#$projType = "Enterprise-Projekt"
#$pwaLcid = 1031

# path of the folder containing the .wsp
$localRootPath = "D:\SiteTemplates\"
$wspLocalPath = $localRootPath + $wspFileName

# uninstall / remove the site template if activated / found
$solution = Get-SPUserSolution -Identity $wspFileName -Site $pwaUrl -ErrorAction SilentlyContinue
If ($solution -ne $Null) {
  If ($solution.Status -eq "Activated") {
    Write-Host Uninstalling web site template
    Uninstall-SPUserSolution -Identity $solutionName -Site $pwaUrl -Confirm:$False
  }
  Write-Host Removing web site template
  Remove-SPUserSolution -Identity $wspFileName -Site $pwaUrl -Confirm:$False
}

# upload and activate the new version
Write-Host Uploading new web site template
Add-SPUserSolution -LiteralPath $wspLocalPath -Site $pwaUrl
Write-Host Installing new web site template
$dummy = Install-SPUserSolution -Identity $solutionName -Site $pwaUrl
 
# set the path according the location of the assemblies
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.ProjectServer.Client.dll"
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$projectContext = New-Object Microsoft.ProjectServer.Client.ProjectContext($pwaUrl)

# get lists of enterprise project types and projects
$projectTypes = $projectContext.LoadQuery($projectContext.EnterpriseProjectTypes)
$projects = $projectContext.Projects
$projectList = $projectContext.LoadQuery($projectContext.Projects)

$projectContext.ExecuteQuery()

$entProjType = $projectTypes | ? { $_.Name -eq $projType }
$project = $projectList | ? { $_.Name -eq $projName }

Write-Host Updating web site template for the enterprise project type
$web = Get-SPWeb $pwaUrl
$template = $web.GetAvailableWebTemplates($pwaLcid) | ? { $_.Title -eq $solutionName }

$entProjType.WorkspaceTemplateName = $template.Name
$projectContext.EnterpriseProjectTypes.Update()
$projectContext.ExecuteQuery()

If ($project -ne $Null) {
  If ($project.IsCheckedOut) {
    Write-Host Project $projName is checked out, checking it in before deletion
    $checkInJob = $project.Draft.CheckIn($True)
    $checkInJobState = $projectContext.WaitForQueue($checkInJob, $timeoutSeconds)
    Write-Host Check-in project job status: $checkInJobState
  }
  Write-Host Deleting existing project $projName
  # we can delete the project either this way
  #$removeProjResult = $projects.Remove($project)
  #$removeJob = $projects.Update()
  # or
  $removeJob = $project.DeleteObject()
  $removeJobState = $projectContext.WaitForQueue($removeJob, $timeoutSeconds)
  Write-Host Remove project job status: $removeJobState
}

I found the set of Project Server PowerShell cmdlets limited and rather operation-based. You can use it as long as your only task is to administer Project Server instances and databases. However, when it comes to interaction with Project Server entities, you have to involve the Managed Client Object Model. Hopefully this example provides not only a reusable tool, but also helps you understand how to extend your own PowerShell library with methods borrowed from the client-side .NET libraries.

Automating Project Server development tasks via PowerShell and the Client Object Model – Customizing Project Web Site templates

Filed under: ALM, Managed Client OM, PowerShell, Project Server — Peter Holpar @ 21:35

I start with a note this time: even if you are not interested in Project Server itself at all, I suggest you read on, since most of the issues discussed below are not Project Server specific; they apply to SharePoint as well.

Recently I have been working mostly on a Project Server customization project. As I’ve learned on my former development projects, I try to automate as many repetitive tasks as possible (like automating the PWA provisioning), so that more time remains for the really interesting stuff. I plan to post my results on this blog to share the scripts and to document the experiences for myself as well.

One of the very first tasks (and probably a never-ending one) was to create a customized Project Web Site (PWS) site template. New Enterprise Projects created in the Project Web Access (PWA) should have their PWS created based on the custom site template.

The process of customizing a PWS site template is described in this post; however, there are a few issues if we apply this approach alone. Just to name a few of them:

– PWS master pages cannot be edited using SharePoint Designer by default. There is a workaround for this issue.

– If I create a custom master page for the PWA and would like a PWS to refer to the same master page, I can set it using PowerShell, for example (see the sketch after this list). However, if I create a site template from this PWS, this configuration seems to be lost in the template, and the template refers to the default seattle.master. I had a similar experience with the site image / logo: I can set one, but this setting seems not to be saved in the template.

– The standard navigation structure of a project site (and of all site templates created based on it) contains project-specific navigation nodes, like Project Details, that contain the Guid of the current project as a query string parameter. If you create a site template from this site, any project site that is created based on this template will contain this node twice: one of them is created based on the site template (with the wrong Guid, referring to the project the original site belongs to, thus with a wrong URL), and another one is created dynamically as the project web site gets provisioned.

The workflow of my web site template creation and customization process includes four main parts, and two of them – step 2 and step 4 – are automated by our script.

The first part of the process (including step 1 and step 2) is optional. If you have changed nothing in your web site prototype, you can immediately start with the manual manipulation of the extracted web site template content (step 3); otherwise, we have to get a fresh version of the template into our local system for further customization.

Step 1: Creation and customization of a SharePoint web site that serves as a prototype for the web site template.

A SharePoint web site is customized based on the requirements using the SharePoint web UI, SharePoint Designer (for PWA see this post), or via other tools, like PowerShell scripts (for example, JSLink settings). This is a “manual” task.

Step 2: Creation of the web site template based on the prototype, downloading and extracting the site template.

A site template is created (including content) based on the customized web site. If a former site template having the same name already exists, it will be deleted first.

The site template is downloaded to the local file system (former file having the same name is deleted first).

The content of the .wsp file (CAB format) is extracted into a local folder (folder having the same name is deleted first, if it exists).

Step 3: Customization of the extracted web site template artifacts.

The script is paused. In this step you have the chance to manually customize the solution files, like ONet.xml.

Step 4: Compressing the customized files into a new site template, and uploading it to SharePoint.

After a key press, the script continues.

Files having the same name as our site template and an extension of .cab or .wsp will be deleted. The content of the folder is compressed as .cab and then renamed to .wsp.

In the final step the original web site template is removed and the new version is installed.

Next, a few words about the CAB extraction and compression tools I chose for the automation. The minimal requirements were that the tool must have a command line interface, and that it should recognize the folder structure to be compressed automatically, without any helper files (like the DDF directive file in the case of makecab).

After reading a few comparisons (like this and this one) of the alternative options, I first found IZArc and its command line add-on (including IZARCC for compression and IZARCE for extraction, see their user’s manual for details) to be the best choice. However, after a short test I experienced issues with the depth of the folder path and the file name length in the case of IZARCE, so I fell back to extrac32 for the extraction.

Finally, the script itself:

$pwaUrl = "http://YourProjectServer/PWA/"
$pwsSiteTemplateSourceUrl = $pwaUrl + "YourPrototypeWeb"
$solutionName = "YourSiteTemplate"
$wspFileName = $solutionName + ".wsp"
$cabFileName = $solutionName + ".cab"
$siteTemplateTitle = $solutionName
$siteTemplateName = $solutionName
$siteTemplateDescription = "PWS Website Template"

$localRootPath = "D:\SiteTemplates\"
$wspExtractFolderName = $solutionName
$wspExtractFolder = $localRootPath + $wspExtractFolderName
$wspFilePath = $localRootPath + $wspFileName
$wspLocalPath = $localRootPath + $wspFileName
$wspUrl = $pwaUrl + "_catalogs/solutions/" + $wspFileName

$cabFilePath = $localRootPath + $cabFileName

function Using-Culture (
   [System.Globalization.CultureInfo]   $culture = (throw "USAGE: Using-Culture -Culture culture -Script {…}"),
   [ScriptBlock]
   $script = (throw "USAGE: Using-Culture -Culture culture -Script {…}"))
   {
     $OldCulture = [Threading.Thread]::CurrentThread.CurrentCulture
     $OldUICulture = [Threading.Thread]::CurrentThread.CurrentUICulture
         try {
                 [Threading.Thread]::CurrentThread.CurrentCulture = $culture
                 [Threading.Thread]::CurrentThread.CurrentUICulture = $culture
                 Invoke-Command $script
         }
         finally {
                 [Threading.Thread]::CurrentThread.CurrentCulture = $OldCulture
                 [Threading.Thread]::CurrentThread.CurrentUICulture = $OldUICulture
         }
   }

function Remove-SiteTemplate-IfExists($solutionName, $wspFileName, $pwaUrl) 
{
  $us = Get-SPUserSolution -Identity $solutionName -Site $pwaUrl -ErrorAction SilentlyContinue
  if ($us -ne $Null)
  {
    Write-Host Former version of site template found on the server. It will be removed…
    Uninstall-SPUserSolution -Identity $solutionName -Site $pwaUrl -Confirm:$False
    Remove-SPUserSolution -Identity $wspFileName -Site $pwaUrl -Confirm:$False
  }
}

function Remove-File-IfExists($path)
{
  If (Test-Path $path)
  {
    If (Test-Path $path -PathType Container)
    {
      Write-Host Deleting folder: $path
      Remove-Item $path -Force -Recurse
    }
    Else
    {
      Write-Host Deleting file: $path
      Remove-Item $path -Force
    }
  }
}

Do { $downloadNewTemplate = Read-Host "Would you like to get a new local version of the site template to edit? (y/n)" }
Until ("y","n" -contains $downloadNewTemplate )

If ($downloadNewTemplate -eq "y")
{

    Remove-SiteTemplate-IfExists $solutionName $wspFileName $pwaUrl

    Using-Culture de-DE { 
     Write-Host Saving site as site template including content
     $web = Get-SPWeb $pwsSiteTemplateSourceUrl
     $web.SaveAsTemplate($siteTemplateName, $siteTemplateTitle, $siteTemplateDescription, 1)
   }

  Remove-File-IfExists $cabFilePath

  Write-Host Downloading site template
  $webClient = New-Object System.Net.WebClient
  $webClient.UseDefaultCredentials  = $True 
  $webClient.DownloadFile($wspUrl, $cabFilePath)

  # clean up former version before downloading the new one
  # be sure you do not lock the deletion, for example, by having one of the subfolders opened in File Explorer,
  # or via any file opened in an application
  Remove-File-IfExists $wspExtractFolder

  Write-Host Extracting site template into folder $wspExtractFolder
  # http://updates.boot-land.net/052/Tools/IZArc%20MANUAL.TXT
  # limited file length / folder structure depth! :-(
  #& "C:\Program Files (x86)\IZArc\IZARCE.exe" -d $cabFilePath $wspExtractFolder

  #http://researchbin.blogspot.co.at/2012/05/making-and-extracting-cab-files-in.html
  #expand $cabFilePath $wspExtractFolder -F:*.*
  extrac32 /Y /E $cabFilePath /L $wspExtractFolder
}

Write-Host "Alter the extracted content of the site template, then press any key to upload the template…"
# wait any key press without any output to the console
# http://technet.microsoft.com/en-us/library/ff730938.aspx
$dummy = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")

# clean up former version before creating the new one
# TODO rename it using a date time pattern instead of deletion!
Remove-File-IfExists $cabFilePath
Remove-File-IfExists $wspFilePath

# makecab: we cannot include multiple files directly. To do that, we would have to create a directive file called a Diamond Directive File (DDF) and include instructions in it
# http://comptb.cects.com/automate-compression-tasks-cli/
& "C:\Program Files (x86)\IZArc\IZARCC.exe" -a -r -p $cabFilePath $wspExtractFolder

Rename-Item $cabFilePath $wspFileName

# remove former solution before uploading and activating the new one
Remove-SiteTemplate-IfExists $solutionName $wspFileName $pwaUrl

Write-Host Installing the new version of the site template
Add-SPUserSolution -LiteralPath $wspFilePath -Site $pwaUrl
$dummy = Install-SPUserSolution -Identity $solutionName -Site $pwaUrl

Note: If you are working with the English version of the PWA and have an English operating system on the server, you don’t need the Using-Culture function. To learn more about it, see this post.

March 29, 2015

May Merge-SPLogFile Flood the Content Database of the Central Administration Web Application?

Filed under: Content database, PowerShell, SP 2013 — Peter Holpar @ 00:20

In recent weeks we were searching for the cause of a specific error in one of our SharePoint 2013 farms. To get detailed trace information, we often switched the log level to VerboseEx mode. A few days later the admins alerted us that the size of the Central Administration content database had increased enormously (at that time it was about 80 GB!).

Looking for the source of this unexpected amount of data, I found a document library called Diagnostics Log Files (description: “View and analyze merged log files from the all servers in the farm”) that contained 1824 items.


Although I consider myself primarily a SharePoint developer, I always try to stay up-to-date on infrastructure topics as well, but to tell the truth, I had never seen this document library before. Searching the web didn’t help either.

Having a look into the document library I found a set of folders with GUID names.


Each of the folders contained a lot of files: a numbered set for each of the servers in the farm.


The files within the folders are archived logs in .gz file format, each around 26 MB.


Based on the description of the library, I guessed that it is related to the Merge-SPLogFile cmdlet, which collects the ULS log files from all of the servers in the farm and saves the aggregation to the specified file on the local system, although I have not found any documentation on how it performs this action, nor whether it has anything to do with the content DB of the central admin site.

 

After a few hours of “reflectioning”, it was obvious how this situation came about. If you are not interested in call chains, feel free to skip the following part.

All of the classes and methods below are defined in the Microsoft.SharePoint assembly, if not specified otherwise.

The InternalProcessRecord method of the Microsoft.SharePoint.PowerShell.SPCmdletMergeLogFile class (Microsoft.SharePoint.PowerShell assembly) creates a new instance of the SPMergeLogFileUtil class based on the path of the aggregated log folder and the filter expression, and calls its Run method:

SPMergeLogFileUtil util = new SPMergeLogFileUtil(this.Path, filter);
ThreadPool.QueueUserWorkItem(new WaitCallback(util.Run));

In the Run method of the Microsoft.SharePoint.Diagnostics.SPMergeLogFileUtil class:

public void Run(object stateInfo)
{
  try
  {
    this.Progress = 0;
    // Creates the diagnostic log files document library in central admin (if does not yet exist) via the GetLogFilesList method
    // Add a new subfolder (having GUID name) to the library. If the folder already exists, deletes its content.
    // Executes child jobs (SPMergeLogFilesJobDefinition) on each farm member, see more details about it later below
    List<string> jobs = this.DispatchCollectingJobs();
    // Waits for all child jobs to be finished on the farm members
    this.MonitorJobs(jobs);
    // Merges the content of the collected files from the central admin document library into the specified local file system folder by calling the MergeFiles method
    // Finally deletes the temporary files one by one from the central admin document library and at the very end deletes their folder as well
    this.RunMerge();
  }
  catch (Exception exception)
  {
    this.Error = exception;
  }
  finally
  {
    this.Progress = 100;
  }
}

Yes, as you can see: if there is any error in the file collection process on any of the farm members, or the merging process fails, the files won’t be deleted from the Diagnostics Log Files document library.

Instead of the RunMerge method, the deletion process would probably have a better place in the finally block, or at least the catch block should check whether the files were removed successfully.

 

A few words about Microsoft.SharePoint.Diagnostics.SPMergeLogFilesJobDefinition, as promised earlier. Its Execute method calls the CollectLogFiles method, which creates a ULSLogFileProcessor instance based on the requested log filter, gets the corresponding ULS entries from the farm member the job is running on, stores the entries in a temporary file in the file system, uploads them to the current subfolder (with the GUID name) of the Diagnostics Log Files document library (file name pattern: [ServerLocalName] (1).log.gz, or [ServerLocalName].log.gz if a single file is enough to store the log entries from the server), and finally deletes the local temporary file.

 

We can read a few related text values via PowerShell as well:

The getter of the DisplayName property in the SPMergeLogFilesJobDefinition class returns:

SPResource.GetString("CollectLogsJobTitle", new object[] { base.Server.DisplayName });

Reading the value via PowerShell:

[Microsoft.SharePoint.SPResource]::GetString("CollectLogsJobTitle")

returns

Collection of log files from server |0

In the GetLogFilesList method of the SPMergeLogFileUtil class we find the title and description of the central admin document library used for log merging:

string title = SPResource.GetString(SPGlobal.ServerCulture, "DiagnosticsLogsTitle", new object[0]);
string description = SPResource.GetString(SPGlobal.ServerCulture, "DiagnosticsLogsDescription", new object[0]);

Reading the values via PowerShell:

[Microsoft.SharePoint.SPResource]::GetString("DiagnosticsLogsTitle")

returns

Diagnostics Log Files

[Microsoft.SharePoint.SPResource]::GetString("DiagnosticsLogsDescription")

returns

View and analyze merged log files from the all servers in the farm

These values support our theory that the library was created and filled by these methods.

 

Next, I used the following PowerShell script to look for the failed merge jobs in the time range in which the files in the Central Administration had been created:

$farm = Get-SPFarm
$from = "3/21/2015 12:00:00 AM"
$to = "3/21/2015 6:00:00 AM"
$farm.TimerService.JobHistoryEntries | ? {($_.StartTime -gt $from) -and ($_.StartTime -lt $to) -and ($_.Status -eq "Failed") -and ($_.JobDefinitionTitle -like "Collection of log files from server *")}

The result:

[screenshot: the failed “Collection of log files from server …” entries in the timer job history]

As you can see, in our case there were errors on both farm member servers; for example, there was not enough storage space on one of them. After the jobs had failed, the aggregated files were not deleted from the Diagnostics Log Files document library of the Central Administration.

Since even a successful execution of the Merge-SPLogFile cmdlet can temporarily increase the size of the Central Administration content database considerably, and the effect of the failed executions is not only temporary (and particularly large, if it happens several times and is combined with verbose logging), SharePoint administrators should be aware of these facts, consider them when planning database sizes, and / or take an extra maintenance step to regularly remove the remains of failed merge processes from the Diagnostics Log Files document library. As far as I can see, the issue affects SharePoint 2010 and 2013 as well.

March 26, 2015

Strange Localization Issue When Working with List and Site Templates

Filed under: MUI, PowerShell, SP 2013 — Peter Holpar @ 23:44

One of our clients runs a localized version of SharePoint 2013. The operating system is Windows 2012 R2 Server (English), and the SharePoint Server itself is English as well. The German language pack is installed, and the sites and site collections were created in German. We are working with various custom site templates. Recently one of these templates had to be extended with a task-list-based custom list (called ToDos). The users prepared the list in a test site, and we saved the list as a template. We created a new site using the site template (we will refer to this site later as the prototype), and next we created a new list based on the custom list template. Finally, we saved the altered web site as a site template, including content, using the following PowerShell commands:

$web = Get-SPWeb $siteTemplateSourceUrl
$web.SaveAsTemplate($siteTemplateName, $siteTemplateTitle, $siteTemplateDescription, 1)

We created a test site using the new site template, and everything seemed to be OK. However, after a while the users started to complain that a menu for the new list contained some English text as well. As it turned out, some of the views for the new list were created with an English title:

[screenshots: the list menu showing English view titles]

First, we verified the manifest.xml of the list template by downloading the .stp file (which has a CAB file format) and opening it using IZArc. We found that the DisplayName property of the default view (“Alle Vorgänge”, meaning “All Tasks”) and of a custom datasheet view (called “db”, which stands for “Datenblatt”) contains the title as plain text, while the DisplayName property of the other views contains a resource reference (like “$Resources:core,Late_Tasks;”).

[screenshot: the DisplayName properties in the manifest.xml of the list template]

Next, we downloaded the site template (the .wsp file also has a CAB file format and can be opened by IZArc) and verified the schema.xml of the ToDos list. We found that the original German texts (“Alle Vorgänge” and “db”) were kept; however, all other view names were “localized” to English.

[screenshot: schema.xml with English view titles]

At this point I already guessed that the problem was caused by the locale of the thread the site template exporting code was running in. To verify my assumption, I saved the prototype site from the site settings via the SharePoint web UI (which is German in our case). This time the resulting schema.xml in the new site template .wsp contained the German titles:

[screenshot: schema.xml with German view titles]

We got the same result (I mean German view titles) when we called our former PowerShell code with German specified as the culture of the running thread. See more info about the Using-Culture helper method, the SharePoint Multilingual User Interface (MUI) and PowerShell here:

Using-Culture de-DE {
  $web = Get-SPWeb $pwsSiteTemplateSourceUrl
  $web.SaveAsTemplate($siteTemplateName, $siteTemplateTitle, $siteTemplateDescription, 1)
}

We fixed the existing views via the following PowerShell code (note: using the Using-Culture helper method is important in this case as well; we have only a single level of site hierarchy here, so there is no recursion in the code):

$web = Get-SPWeb http://SharePointServer

function Rename-View-IfExists($list, $viewNameOld, $viewNameNew)
{
  $view =  $list.Views[$viewNameOld]
  If ($view -ne $Null) {
      Write-Host Renaming view $viewNameOld to $viewNameNew
      $view.Title = $viewNameNew
      $view.Update()
  }
  Else {
    Write-Host View $viewNameOld not found
  }
}

Using-Culture de-DE {
  $web.Webs | % {
    $list = $_.Lists["ToDos"]
    If ($list -ne $Null) {
      Write-Host ToDo list found in $_.Title
      Rename-View-IfExists $list "Late Tasks" "Verspätete Vorgänge"
      Rename-View-IfExists $list "Upcoming" "Anstehend"
      Rename-View-IfExists $list "Completed" "Abgeschlossen"
      Rename-View-IfExists $list "My Tasks" "Meine Aufgaben"
      Rename-View-IfExists $list "Gantt Chart" "Gantt-Diagramm"
      Rename-View-IfExists $list "Late Tasks" "Verspätete Vorgänge" 
    }
  }
}

Strangely, we had no problems with field names or other localizable texts when we worked with the English culture.
