Second Life of a Hungarian SharePoint Geek

February 27, 2018

Paging through the Entities returned by the ProjectData OData Interface of Project Server using PowerShell

Filed under: OData, PowerShell, Project Server, REST — Peter Holpar @ 21:13

About a year ago I wrote about how you can use PowerShell and the Project Server REST interfaces (ProjectServer and ProjectData) to generate reports. However, at the end of last year we had a problem with the script that queries the projects via the ProjectData interface:

$url = 'http://YourProjectServer/PWA/_api/ProjectData/%5Ben-US%5D/Projects?$select=ProjectId,ProjectName,ProjectCreatedDate'
$reportPath = "C:\Data\ProjServerReports\"

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = "application/json;odata=verbose"

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$data = $reader.ReadToEnd()

$result = ConvertFrom-Json -InputObject $data
$result.d.results | select ProjectId, ProjectName, ProjectCreatedDate | Export-Csv -Path ($reportPath + 'ProjectsFromProjectDataNoBatching.csv') -Delimiter ";" -Encoding UTF8 -NoTypeInformation

Important! You should use single quotes around the URL in this code snippet (and in the other ones below) to prevent PowerShell from removing the $select query option from the REST query. More about that here.
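To see the difference for yourself, compare how PowerShell expands the two quoting styles (a minimal sketch with a placeholder server name):

```powershell
# Double quotes: PowerShell expands $select as an (undefined) variable,
# so the query option silently disappears from the URL.
$double = "http://YourProjectServer/PWA/_api/ProjectData/Projects?$select=ProjectId"
# Single quotes: the string is taken literally, $select stays intact.
$single = 'http://YourProjectServer/PWA/_api/ProjectData/Projects?$select=ProjectId'

Write-Host $double   # ...Projects?=ProjectId
Write-Host $single   # ...Projects?$select=ProjectId
```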

A user reported that some of the projects were not included in the results. We found that the count of projects included in the report via the ProjectServer interface was greater than the count of the projects we queried via ProjectData. This is the script version that uses the ProjectServer interface:

$url = 'http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,CreatedDate'
$reportPath = "C:\Data\ProjServerReports\"

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = "application/json;odata=verbose"

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$data = $reader.ReadToEnd()

$result = ConvertFrom-Json -InputObject $data
$result.d.results | select Id, Name, CreatedDate | Export-Csv -Path ($reportPath + 'ProjectsFromProjectServer.csv') -Delimiter ";" -Encoding UTF8 -NoTypeInformation

First I thought it was a permission issue, but it was suspicious that all of the missing projects were at the end of the project list (when ordered alphabetically), and that the number of included projects was exactly 100 in the ProjectData-based report.

Then I found this statement in the ProjectData – Project OData service reference:

“There are limits to the number of entities that can be returned in one query of the ProjectData service.”

What a surprise, the default limit for projects in the on-premise version we have is exactly 100!

You can use the Set-SPProjectOdataConfiguration PowerShell cmdlet for on-premise instances of Project Server to override the default query page size limits for any specified entity set.

Set-SPProjectOdataConfiguration -EntitySetName Projects -PageSizeOverride 200

However, it is a global change on the server that affects all queries of the specific entity type, can adversely affect server performance, and as such is not recommended. Instead, the suggested solution to query all instances of a given entity type is to implement paging by sending multiple queries using the $top and $skip operators.

Instead of a fixed URL, we define a formatting pattern with placeholders for the $skip and $top parameters of the subsequent queries. More about this format is available here.

By applying $inlinecount=allpages in the query we can ensure that the result is always returned in $result.d.results, as described in this post.

In the script we assume that the default limit of 100 projects per query has not been changed. Note that we use the $firstBatch variable to decide whether we should append the data to the existing report file in the Export-Csv cmdlet or create a new one.

In the first version of the script we process the results (output them into a CSV) immediately after receiving the response.

$urlPattern = 'http://YourProjectServer/PWA/_api/ProjectData/%5Ben-US%5D/Projects?$select=ProjectId,ProjectName,ProjectCreatedDate&$skip={0}&$top={1}&$inlinecount=allpages'
$reportPath = "C:\Data\ProjServerReports\"

$firstItem = 0
$batchSize = 100
$firstBatch = $true

Do
{
  Write-Host Requesting next batch of $batchSize items, starting at $firstItem
  $url = $urlPattern -f $firstItem, $batchSize
  Write-Host $url

  $request = [System.Net.WebRequest]::Create($url)
  $request.UseDefaultCredentials = $true
  $request.Accept = "application/json;odata=verbose"

  $response = $request.GetResponse()
  $reader = New-Object System.IO.StreamReader $response.GetResponseStream()
  $data = $reader.ReadToEnd()

  $result = ConvertFrom-Json -InputObject $data
  $count = $result.d.results.Count
  Write-Host Item count is $count

  $result.d.results | select ProjectId, ProjectName, ProjectCreatedDate |
      Export-Csv -Path ($reportPath + 'ProjectsFromProjectData_20171130_03.csv') -Delimiter ";" -Encoding UTF8 -NoTypeInformation -Append:(-not $firstBatch)
  $firstBatch = $false
  $firstItem += $batchSize
}
While ($count -eq $batchSize)

In the second we aggregate the results into an array, and process the results only after receiving the last response.

$urlPattern = 'http://YourProjectServer/PWA/_api/ProjectData/%5Ben-US%5D/Projects?$select=ProjectId,ProjectName,ProjectCreatedDate&$skip={0}&$top={1}&$inlinecount=allpages'
$reportPath = "C:\Data\ProjServerReports\"

$results = @()
$firstItem = 0
$batchSize = 100

Do
{
  Write-Host Requesting next batch of $batchSize items, starting at $firstItem
  $url = $urlPattern -f $firstItem, $batchSize
  Write-Host $url

  $request = [System.Net.WebRequest]::Create($url)
  $request.UseDefaultCredentials = $true
  $request.Accept = "application/json;odata=verbose"

  $response = $request.GetResponse()
  $reader = New-Object System.IO.StreamReader $response.GetResponseStream()
  $data = $reader.ReadToEnd()

  $result = ConvertFrom-Json -InputObject $data
  $count = $result.d.results.Count
  Write-Host Item count is $count

  $results += $result.d.results
  $firstItem += $batchSize
}
While ($count -eq $batchSize)

$results | select ProjectId, ProjectName, ProjectCreatedDate |
      Export-Csv -Path ($reportPath + 'ProjectsFromProjectDataWithBatching.csv') -Delimiter ";" -Encoding UTF8 -NoTypeInformation

Although in this case both scripts produce the same result, you can use the second version if you need the complete list of the projects at a later time, for example, if you would like to extend it with data not available at the time of the query.
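As a sketch of that kind of later processing (purely illustrative, assuming the aggregated $results array from the second script above), you could for example group the projects by creation year before exporting:

```powershell
# Group the aggregated projects by the year they were created
# and print a per-year count (hypothetical post-processing step).
$results |
    Group-Object { ([DateTime]::Parse($_.ProjectCreatedDate)).Year } |
    ForEach-Object { Write-Host "$($_.Name): $($_.Count) projects" }
```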

May 12, 2017

Getting a List of Checked-Out Projects from PowerShell via REST

Filed under: PowerShell, Project Server, REST — Peter Holpar @ 05:22

We have an application running as a monthly scheduled batch process that updates enterprise fields of projects on our Project Server implementation based on values taken from various satellite systems. As a prerequisite, all affected projects should be checked-in. Projects checked-out to users won’t be updated. Of course, technically it would be possible to force check-in on those projects, but it was a business decision not to do that as we wanted to avoid data inconsistency by checking in a project that is not yet meant to be ready for that by the project manager.

Our application iterates through the projects, and if they are checked-in, it checks them out, updates the values, checks the project back in and publishes it. If the project is checked-out, it sends a warning to the owner that the project was not updated due to its state. Occasionally project owners doubt this warning, saying they are sure they checked in their projects, so I decided to create a sort-of report, running just before the updating process starts, to document the list of projects left checked-out. It is similar to what administrators already have on the Force Check-in Enterprise Objects page under PWA Settings.


Recently I wrote about how we can use PowerShell to create simple reports based on the data we query via the REST interface. This time I applied the very same technique to get the list of projects that are checked-out, including the name of the projects, the check-out description, the check-out time, and the name and e-mail address of the user who checked out the project. The key was to assemble the REST query URL, including the $expand expression for the CheckedOutBy field.

$url = 'http://YourProjectServer/PWA/_api/ProjectServer/Projects?$expand=CheckedOutBy&$select=Name,CheckOutDescription,CheckedOutDate,CheckedOutBy/Title,CheckedOutBy/Email&$filter=IsCheckedOut'

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = 'application/json;odata=verbose'

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$data = $reader.ReadToEnd()

$result = ConvertFrom-Json -InputObject $data

$result.d.results | % {
select -Input $_ -Prop `
    @{ Name='Name'; Expression={$_.Name} },
    @{ Name='User'; Expression={$_.CheckedOutBy.Title} },
    @{ Name='EMail'; Expression={$_.CheckedOutBy.Email} },
    @{ Name='Date'; Expression={[DateTime]::Parse($_.CheckedOutDate).ToString('g')} },
    @{ Name='Description'; Expression={$_.CheckOutDescription} }
    } | Export-Csv -Path CheckedOutProjects.csv -Delimiter ";" -Encoding UTF8 -NoTypeInformation

The result is a .csv file that one can easily open in Excel as well.

March 29, 2017

Working with the REST / OData Interface from PowerShell

Filed under: OData, PowerShell, REST, SP 2013 — Peter Holpar @ 20:56

If you follow my blog you might already know that I am not a big fan of the REST / OData interface. I prefer using the client object model. However, there are cases when REST provides a simple (or even the only available) solution.

For example, we are working a lot with PowerShell. If you are working with SharePoint on the client side at a customer, and you are not allowed to install / download / copy the assemblies for the managed client object model (CSOM), you have a problem.

Some possible reasons (you should know that the SharePoint Server 2013 Client Components SDK is available to download as an .msi, or you can get the assemblies directly from an on-premise SharePoint installation):

  • You might have no internet access, so you cannot download anything from the web.
  • If you happen to have internet access, you are typically not allowed to install such things without administrator permissions on the PC. It is quite rare that you or the business user you are working with has this permission.
  • You have no direct access on the SharePoint server, so you cannot copy the assemblies from it.
  • You are not allowed to use your own memory stick (or other storage device) to copy the assemblies from it.
  • Even if there is no technical barrier, company policies might still prohibit you using external software components like the CSOM assemblies.

In this case, using the REST interface is a reasonable choice. You can have a quick overview of the REST-based list operations here.

The main questions I try to answer in this post:

  • Which object should I use to send the request?
  • How to authenticate my request?
  • How to build up the payload for the request?

First of all, I suggest you read this post to learn about some possible pitfalls when working with REST URLs from PowerShell and how to avoid them with escaping.

Reading data with the SharePoint REST interface

Reading data with a GET request

Sending a GET request to a REST-based service from PowerShell is not really a challenge, you might think, and you are right: it is really straightforward in most cases. But take the following example, listing the Id and Title fields of items in a list:

$listTitle = "YourList"
$url = "http://YourSharePoint/_api/Web/Lists/GetByTitle('$listTitle')/Items?`$select=Id,Title"

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = 'application/json;odata=verbose'

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
# ConvertFrom-Json : Cannot convert the Json string because a dictionary converted from it contains duplicated keys 'Id' and 'ID'.
#$response = $reader.ReadToEnd()
$response = $reader.ReadToEnd() -creplace '"ID":', '"DummyId":'

$result = ConvertFrom-Json -InputObject $response
$result.d.results | select Id, Title

If you would use

$response = $reader.ReadToEnd()

instead of

$response = $reader.ReadToEnd() -creplace '"ID":', '"DummyId":'

then you would get this exception when trying to convert the JSON response:

ConvertFrom-Json : Cannot convert the Json string because a dictionary converted from it contains duplicated keys 'Id' and 'ID'.

The reason is that the JSON response of the server contains both the fields Id and ID. JSON is case-sensitive, but PowerShell is not, so this is an issue if you want to convert the JSON response to a PowerShell object. You can read more about it in this post, although I don't like the solution proposed there. It really helps to avoid the error, but it uses the case-insensitive replace operator instead of the case-sensitive creplace, so it converts both fields into a dummy field. PowerShell seems to have no problem with the duplicated properties.
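The problem can be reproduced without any server call; a minimal sketch of the error and the creplace-based fix (the behavior shown is what I would expect from Windows PowerShell 5.x):

```powershell
# A JSON document with keys that differ only in case, as returned by SharePoint
$json = '{ "Id": 1, "ID": 1, "Title": "Demo" }'

# This would throw:
# ConvertFrom-Json : Cannot convert the Json string because a dictionary
# converted from it contains duplicated keys 'Id' and 'ID'.
# $result = ConvertFrom-Json -InputObject $json

# Renaming only the upper-case variant with the case-sensitive creplace works:
$fixed = $json -creplace '"ID":', '"DummyId":'
$result = ConvertFrom-Json -InputObject $fixed
Write-Host $result.Title   # Demo
```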

Instead of using a System.Net.WebRequest object, we can achieve a shorter version using the Invoke-RestMethod cmdlet. Note that we don't select and display the Id property in this case, to avoid complications. See my comments about that in the next section discussing the POST request.

$listTitle = "YourList"
$url = "http://YourSharePoint/_api/Web/Lists/GetByTitle('$listTitle')/Items?`$select=Title"
$headers = @{ 'Accept' = 'application/json; odata=verbose' }
$result = Invoke-RestMethod -Uri $url -Method Get -Headers $headers -UseDefaultCredentials
$result.d.results | select Title

Reading data with a POST request

There are cases when you have to use the POST method instead of GET to read data from SharePoint, for example, if you need to filter the items via a CAML query. In the following example I show you how to query the file names of all documents in a library recursively that are older than a threshold value:

$listTitle = "YourDocuments"
$offsetDays = -30

$urlBase = "http://YourSharePointSite/"
$urlAuth = $urlBase + "_api/ContextInfo"
$url = $urlBase + "_api/Web/Lists/GetByTitle('$listTitle')/GetItems?`$select=FileLeafRef"

$viewXml = "<View Scope='Recursive'><ViewFields><FieldRef Name='Created'/><FieldRef Name='FileLeafRef'/></ViewFields><Query><Where><Lt><FieldRef Name='Created' /><Value Type='DateTime'><Today OffsetDays='$offsetDays' /></Value></Lt></Where></Query></View>"

$queryPayload = @{
                   'query' = @{
                          '__metadata' = @{ 'type' = 'SP.CamlQuery' };
                          'ViewXml' = $viewXml
                   }
                 } | ConvertTo-Json -Depth 3   # -Depth 3 keeps the nested __metadata object intact

# authentication
$auth = Invoke-RestMethod -Uri $urlAuth -Method Post -UseDefaultCredentials
$digestValue = $auth.GetContextWebInformation.FormDigestValue

# the actual request
$headers = @{ 'X-RequestDigest' = $digestValue; 'Accept' = 'application/json; odata=verbose' }
$result = Invoke-RestMethod -Uri $url -Method Post -Body $queryPayload -ContentType 'application/json; odata=verbose' -Headers $headers -UseDefaultCredentials

# displaying results
$result.d.results | select FileLeafRef

Just for the case of comparison I include the same payload in JavaScript format:

var queryPayload = {
                     'query' : {
                         '__metadata' : { 'type' : 'SP.CamlQuery' },
                         'ViewXml' : viewXml
                     }
                   };
As you can see, these are the most relevant differences in the format we need in PowerShell:

  • We use an equal sign ( = ) instead of a colon ( : ) to separate the name and its value.
  • We use a semicolon ( ; ) instead of the comma ( , ) to separate object fields.
  • We need a leading at sign ( @ ) before the curly braces ( { ).
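Putting the three rules together, the PowerShell counterpart of the JavaScript payload above can be sketched like this (with a dummy view XML standing in for the real CAML query):

```powershell
$viewXml = '<View/>'   # dummy CAML view, for illustration only

# PowerShell hashtable form of the SP.CamlQuery payload
$queryPayload = @{
    'query' = @{
        '__metadata' = @{ 'type' = 'SP.CamlQuery' };
        'ViewXml'    = $viewXml
    }
} | ConvertTo-Json -Depth 3   # -Depth 3 keeps the nested __metadata object intact
```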

The Invoke-RestMethod cmdlet tries to automatically convert the response to the corresponding object based on the content type of the response. If it is an XML response (see the authentication part above), the result will be an XmlDocument. If it is a JSON response, the result will be a PSCustomObject representing the structure of the response. However, if the response cannot be converted, it remains a single String.

For example, if we don’t limit the fields we need in response via the $select query option:

$url = $urlBase + "_api/Web/Lists/GetByTitle('$listTitle')/GetItems"

then the response includes the fields Id and ID again. In this case we should remove one of these fields using the technique illustrated above with the simple GET request, before we try to convert the response via the ConvertFrom-Json cmdlet.

Note: If you still use PowerShell v3.0 you get this error message when you invoke Invoke-RestMethod setting the Accept header:

Invoke-RestMethod : The ‘Accept’ header must be modified using the appropriate property or method.
Parameter name: name

So if it is possible, you should consider upgrading to PowerShell v4.0. Otherwise, you can use the workaround suggested in this forum thread, where you can read more about the issue as well.

If you are not sure, which version you have, you can use $PSVersionTable.PSVersion to query the version number, or another option as suggested here.

Creating objects

In this case we send a request with the POST method to the server. The following code snippet shows how you can create a new custom list:

$listTitle = "YourList"

$urlBase = "http://YourSharePoint/"
$urlAuth = $urlBase + "_api/ContextInfo"
$url = $urlBase + "_api/Web/Lists"

$queryPayload = @{
                    '__metadata' = @{ 'type' = 'SP.List' }; 'AllowContentTypes' = $true; 'BaseTemplate' = 100;
                    'ContentTypesEnabled' = $true; 'Description' = 'Your list description'; 'Title' = $listTitle
    } | ConvertTo-Json

$auth = Invoke-RestMethod -Uri $urlAuth -Method Post -UseDefaultCredentials
$digestValue = $auth.GetContextWebInformation.FormDigestValue

$headers = @{ 'X-RequestDigest' = $digestValue; 'Accept' = 'application/json; odata=verbose' }

$result = Invoke-RestMethod -Uri $url -Method Post -Body $queryPayload -ContentType 'application/json; odata=verbose' -Headers $headers -UseDefaultCredentials

The response we receive in the $result variable contains the properties of the list we just created. For example, the Id (GUID) of the list is available as $result.d.Id.

Updating objects

In this case we send a request with the POST method to the server and set the X-HTTP-Method header to MERGE. The following code snippet shows how to change the title of the list we created in the previous step:

$listTitle = "YourList"

$urlBase = "http://YourSharePoint/"
$urlAuth = $urlBase + "_api/ContextInfo"
$url = $urlBase + "_api/Web/Lists/GetByTitle('$listTitle')"

$queryPayload = @{
                    '__metadata' = @{ 'type' = 'SP.List' }; 'Title' = 'YourListNewTitle'
    } | ConvertTo-Json

$auth = Invoke-RestMethod -Uri $urlAuth -Method Post -UseDefaultCredentials
$digestValue = $auth.GetContextWebInformation.FormDigestValue

$headers = @{ 'X-RequestDigest' = $digestValue; 'Accept' = 'application/json; odata=verbose'; 'IF-MATCH' = '*'; 'X-HTTP-Method' = 'MERGE' }

$result = Invoke-RestMethod -Uri $url -Method Post -Body $queryPayload -ContentType 'application/json; odata=verbose' -Headers $headers -UseDefaultCredentials

Deleting objects

In this case we send a request with the POST method to the server and set the X-HTTP-Method header to DELETE. The following code snippet shows how you can delete a list item:

$listTitle = "YourList"

$urlBase = "http://YourSharePoint/"
$urlAuth = $urlBase + "_api/ContextInfo"
$url = $urlBase + "_api/Web/Lists/GetByTitle('$listTitle')/Items(1)"

# authentication
$auth = Invoke-RestMethod -Uri $urlAuth -Method Post -UseDefaultCredentials
$digestValue = $auth.GetContextWebInformation.FormDigestValue

# the actual request
$headers = @{ 'X-RequestDigest' = $digestValue; 'IF-MATCH' = '*'; 'X-HTTP-Method' = 'DELETE' }
$result = Invoke-RestMethod -Uri $url -Method Post -Headers $headers -UseDefaultCredentials

Note: Although the documentation states that “in the case of recyclable objects, such as lists, files, and list items, this results in a Recycle operation”, based on my tests this is not true, as the objects really got deleted.
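If you do want the deleted item to land in the recycle bin, list items expose a Recycle() REST method that you can call explicitly instead; a sketch, reusing the $urlBase, $listTitle and $digestValue values from the snippet above:

```powershell
# Recycle (instead of permanently delete) list item 1
$url = $urlBase + "_api/Web/Lists/GetByTitle('$listTitle')/Items(1)/Recycle()"
$headers = @{ 'X-RequestDigest' = $digestValue }
$result = Invoke-RestMethod -Uri $url -Method Post -Headers $headers -UseDefaultCredentials
```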

Final Note: This one applies to all of the operations discussed in the post. If the SharePoint site you are working with is available via HTTPS and there is an issue with the certificate, you can turn off certificate validation, although it is not recommended in a production environment. Include this line in your code before making any web requests:

[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }

March 4, 2017

Using PowerShell and REST with Project Server (or SharePoint) for Reporting

Filed under: OData, Project Server, REST — Peter Holpar @ 21:43

If you are working with Project Server or SharePoint Server, you should not ignore the potential provided by PowerShell and the REST (OData) interface to create simple reports. At the same time, however, you should be aware of a few pitfalls of this combination as well.

Let’s see the next code example first. Its goal is to output the list of projects to the screen, including their Id, Name and ProjectSiteUrl properties:

$url = "http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,ProjectSiteUrl"

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = "application/json;odata=verbose"

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$data = $reader.ReadToEnd()

$result = ConvertFrom-Json -InputObject $data
$result.d.results | select Id, Name, ProjectSiteUrl

If you test the URL http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,ProjectSiteUrl from the browser, you see that all three of these properties are returned. However, if you run the above script from the console, you find that the ProjectSiteUrl column is empty for all of the projects.

If you use the ProjectData OData endpoint instead of the ProjectServer endpoint and select the corresponding properties, none of the properties will be omitted by the script:

$url = "http://YourProjectServer/PWA/_api/ProjectData/Projects?$select=ProjectId,ProjectName,ProjectWorkspaceInternalUrl"

$request = [System.Net.WebRequest]::Create($url)
$request.UseDefaultCredentials = $true
$request.Accept = "application/json;odata=verbose"

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader $response.GetResponseStream()
$data = $reader.ReadToEnd()

$result = ConvertFrom-Json -InputObject $data
$result.d.results | select ProjectId, ProjectName, ProjectWorkspaceInternalUrl

Note: If you have a localized version of Project Server, you can either use an OData query including the localized entity and property names, like:


or switch back to the English version by injecting the [en-US] segment after the ProjectData endpoint (as in http://YourProjectServer/PWA/_api/ProjectData/[en-US]/Projects):


Of course, in the first case you should change the property names used in the select statement in the PowerShell script to match the names used in the REST query.

Let’s see another example. In the next case, our goal is to create a .csv file that one can easily import into Excel, including the name and the RBS (resource breakdown structure) value of the resources.

$baseUrl = "http://YourProjectServer/PWA/_api/ProjectServer"
$rbsUrl = $baseUrl + "/LookupTables?$filter=Name eq 'RBS'&$expand=Entries&$select=Entries/InternalName,Entries/Value"
$resourceUrl = $baseUrl + "/EnterpriseResources?$select=Name,Custom_000039b78bbe4ceb82c4fa8c0c400284"

#rbs
$rbsRequest = [System.Net.WebRequest]::Create($rbsUrl)
$rbsRequest.UseDefaultCredentials = $true
$rbsRequest.Accept = "application/json;odata=verbose"

$rbsResponse = $rbsRequest.GetResponse()
$rbsReader = New-Object System.IO.StreamReader $rbsResponse.GetResponseStream()
$rbsData = $rbsReader.ReadToEnd()

$rbsResult = ConvertFrom-Json -InputObject $rbsData
$rsbEntries = $rbsResult.d.results.Entries.results

#resources
$resRequest = [System.Net.WebRequest]::Create($resourceUrl)
$resRequest.UseDefaultCredentials = $true
$resRequest.Accept = "application/json;odata=verbose"

$resResponse = $resRequest.GetResponse()
$resReader = New-Object System.IO.StreamReader $resResponse.GetResponseStream()
$resData = $resReader.ReadToEnd()

$resResult = ConvertFrom-Json -InputObject $resData

$resResult.d.results | % {
select -Input $_ -Prop `
    @{ Name='Name'; Expression={$_.Name} },
    @{ Name='RBS'; Expression={$rbs = $_.Custom_x005f_000039b78bbe4ceb82c4fa8c0c400284; If ($rbs.results -is [System.Object[]]) {$rsbEntries | ? { $_.InternalName -eq $rbs.results[0] } | % { $_.Value } } Else {''} } }
    } | Export-Csv -Path ResourceRBS.csv -Delimiter ";" -Encoding UTF8 -NoTypeInformation

Note: The –NoTypeInformation switch of Export-Csv ensures that no type information would be emitted as header into the .csv file. The -Delimiter ";" and the -Encoding UTF8 settings help to produce a .csv file in a format and encoding that can be opened in Excel simply by clicking on the file.

The symptoms are similar to the first case: only the resource name is included in the file, but the RBS value is not.

I’ve included this last code sample in a code block not just because it is a bit longer than the former ones, but because I hope that the highlighting helps you to understand the base problem with our scripts, even if you did not catch it at the first example. Have you noticed that the query options ($filter, $select and $expand) have a different color than the rest of the query text? Actually, they have the very same color as the variable names (like $baseUrl or $resRequest) in the code. That is because they really are handled as variable names. We used double quotes in the code to define the string literals for the URLs, which means PowerShell parses the string and replaces possible variable names with the values of those variables. As we didn’t define variables like $filter, $select or $expand, they are simply removed from the string (replaced by an empty string). See this short explanation for details.

Instead of double quotation marks we should use single quotation marks to leave the query options intact, but in this case we have to escape the single quotes used in the REST query itself (by doubling them).

For example, instead of:

$url = "http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,ProjectSiteUrl"

we should simply use:

$url = 'http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,ProjectSiteUrl'

and instead of:

$rbsUrl = $baseUrl + "/LookupTables?$filter=Name eq 'RBS'&$expand=Entries&$select=Entries/InternalName,Entries/Value"

we should use:

$rbsUrl = $baseUrl + '/LookupTables?$filter=Name eq ''RBS''&$expand=Entries&$select=Entries/InternalName,Entries/Value'

Note that the value RBS is enclosed by two single quotation marks on each side, not by a double quotation mark!

Alternatively, you can use double quotation marks to define the strings for the REST queries (for example, if you still would like PowerShell to parse them for some reason), but in this case you should escape the dollar sign in the query options with a backtick to prevent them from being parsed out of the string.

For example, instead of:

$url = "http://YourProjectServer/PWA/_api/ProjectServer/Projects?$select=Id,Name,ProjectSiteUrl"

we should simply use:

$url = "http://YourProjectServer/PWA/_api/ProjectServer/Projects?`$select=Id,Name,ProjectSiteUrl"

and instead of:

$rbsUrl = $baseUrl + "/LookupTables?$filter=Name eq 'RBS'&$expand=Entries&$select=Entries/InternalName,Entries/Value"

we should use:

$rbsUrl = $baseUrl + "/LookupTables?`$filter=Name eq 'RBS'&`$expand=Entries&`$select=Entries/InternalName,Entries/Value"

See this description for more details about PowerShell string parsing and escaping methods.

If you compare our first two examples (the one with the ProjectServer and the other one with the ProjectData endpoint), the results are different because in the first case the ProjectSiteUrl property is not part of the standard set of properties returned by default for projects via the ProjectServer endpoint, while ProjectData returns all properties, including ProjectWorkspaceInternalUrl, even if they are not specified in a $select query option.

In the third case, our query should have returned the entries of the RBS lookup table, but since the query options got lost, it simply returned an overview of all lookup tables.

August 29, 2016

Permission-based Rendering Templates, Part 2: The Synchronous Solution

Filed under: CSR, JavaScript, jQuery, REST, SP 2013 — Peter Holpar @ 22:14

In my recent post I illustrated how you can implement a permission-based custom rendering template using the JavaScript client object model (JSCOM) and jQuery. That rendering template was implemented using the standard asynchronous JavaScript pattern, via a callback method, so as not to block the UI thread of the browser. In a fast network (in a LAN, for example), however, a synchronous implementation can work as well. Although there are some unsupported methods to make a JSCOM request synchronously, the JavaScript client object model was designed for asynchronous usage (see its executeQueryAsync method). To send our requests synchronously, we utilize the REST / OData interface in this post and send the requests via the ajax function of jQuery.

To understand the original requirements and the configuration (field and list names, etc.), I suggest reading the first part first.

To enable using jQuery selectors containing the dollar sign ($), we use the same escapeForJQuery helper function that we created for the first part.

var restrictedValues1 = ['Approved', 'Rejected'];
var restrictedValues2 = ['Resubmit'];

var custom = custom || {};

custom.controlId = null;

var adminGroup = "MyGroup";

custom.escapeForJQuery = function (value) {
    var newValue = value.replace(/\$/g, "\\$");
    return newValue;
}

Instead of simply wrapping the standard display template of choice fields (SPFieldChoice_Edit), the editFieldMethod function first gets the HTML content of the field control as it would be rendered without the customization, by invoking the SPFieldChoice_Edit function; then we determine the group membership of the user by calling the synchronous isCurrentUserMemberOfGroup function (more about that a bit later); finally we alter the HTML content by hiding the adequate options via the hideOptions function (see it later as well).

custom.editFieldMethod = function (ctx) {
    var fieldSchema = ctx.CurrentFieldSchema;
    custom.controlId = fieldSchema.Name + '_' + fieldSchema.Id + '_$DropDownChoice';
    var html = SPFieldChoice_Edit(ctx);

    var isCurrentUserInGroup = custom.isCurrentUserMemberOfGroup(adminGroup);
    if (isCurrentUserInGroup) {
        html = custom.hideOptions(html, custom.controlId, restrictedValues1);
    }
    else {
        html = custom.hideOptions(html, custom.controlId, restrictedValues2);
    }

    return html;
}

The hideOptions function loads the HTML source of the control into the DOM and removes the options that should be hidden for the given group. Finally it returns the HTML source of the altered control:

custom.hideOptions = function (html, ctrlId, restrictedValues) {
    var parsedHtml = $(html);
    restrictedValues.forEach(function (rv) {
        var selector = "#" + custom.escapeForJQuery(ctrlId) + " option[value='" + custom.escapeForJQuery(rv) + "']";
        $(parsedHtml).find(selector).remove();
    });
    var result = $(parsedHtml).html();

    return result;
}

The isCurrentUserMemberOfGroup function sends a synchronous REST request via the ajax function of jQuery to determine the group membership of the current user:

var serverUrl = String.format("{0}//{1}", window.location.protocol, window.location.host);

custom.isCurrentUserMemberOfGroup = function (groupName) {
    var isMember = false;

    $.ajax({
        url: serverUrl + "/_api/Web/CurrentUser/Groups?$select=LoginName",
        type: "GET",
        async: false,
        contentType: "application/json;odata=verbose",
        headers: {
            "Accept": "application/json;odata=verbose",
            "X-RequestDigest": $("#__REQUESTDIGEST").val()
        },
        complete: function (result) {
            var response = JSON.parse(result.responseText);
            if (response.error) {
                console.log(String.format("Error: {0}\n{1}", response.error.code, response.error.message.value));
            }
            else {
                var groups = response.d.results;
                groups.forEach(function (group) {
                    var loginName = group.LoginName;
                    console.log(String.format("Group name: {0}", loginName));
                    if (groupName == loginName) {
                        isMember = true;
                    }
                });
            }
        }
    });

    return isMember;
}

In this case we simply register the editFieldMethod for both the 'EditForm' and the 'NewForm' mode of the Status field; there is no need for an OnPostRender method:

var customOverrides = {};
customOverrides.Templates = {};

customOverrides.Templates.Fields = {
    'Status': {
        'EditForm': custom.editFieldMethod,
        'NewForm': custom.editFieldMethod
    }
};

SPClientTemplates.TemplateManager.RegisterTemplateOverrides(customOverrides);

The full source code of the rendering template introduced in this post:

'use strict';

(function () {

    var restrictedValues1 = ['Approved', 'Rejected'];
    var restrictedValues2 = ['Resubmit'];

    var custom = custom || {};

    custom.controlId = null;

    var adminGroup = "MyGroup";

    custom.escapeForJQuery = function (value) {
        var newValue = value.replace(/\$/g, "\\$");
        return newValue;
    }

    custom.hideOptions = function (html, ctrlId, restrictedValues) {
        var parsedHtml = $(html);
        restrictedValues.forEach(function (rv) {
            var selector = "#" + custom.escapeForJQuery(ctrlId) + " option[value='" + custom.escapeForJQuery(rv) + "']";
            $(parsedHtml).find(selector).remove();
        });
        var result = $(parsedHtml).html();

        return result;
    }

    custom.editFieldMethod = function (ctx) {
        var fieldSchema = ctx.CurrentFieldSchema;
        custom.controlId = fieldSchema.Name + '_' + fieldSchema.Id + '_$DropDownChoice';
        var html = SPFieldChoice_Edit(ctx);

        var isCurrentUserInGroup = custom.isCurrentUserMemberOfGroup(adminGroup);
        if (isCurrentUserInGroup) {
            html = custom.hideOptions(html, custom.controlId, restrictedValues1);
        }
        else {
            html = custom.hideOptions(html, custom.controlId, restrictedValues2);
        }

        return html;
    }

    var serverUrl = String.format("{0}//{1}", window.location.protocol, window.location.host);

    custom.isCurrentUserMemberOfGroup = function (groupName) {
        var isMember = false;

        $.ajax({
            url: serverUrl + "/_api/Web/CurrentUser/Groups?$select=LoginName",
            type: "GET",
            async: false,
            contentType: "application/json;odata=verbose",
            headers: {
                "Accept": "application/json;odata=verbose",
                "X-RequestDigest": $("#__REQUESTDIGEST").val()
            },
            complete: function (result) {
                var response = JSON.parse(result.responseText);
                if (response.error) {
                    console.log(String.format("Error: {0}\n{1}", response.error.code, response.error.message.value));
                }
                else {
                    var groups = response.d.results;
                    groups.forEach(function (group) {
                        var loginName = group.LoginName;
                        console.log(String.format("Group name: {0}", loginName));
                        if (groupName == loginName) {
                            isMember = true;
                        }
                    });
                }
            }
        });

        return isMember;
    }

    var customOverrides = {};
    customOverrides.Templates = {};

    customOverrides.Templates.Fields = {
        'Status': {
            'EditForm': custom.editFieldMethod,
            'NewForm': custom.editFieldMethod
        }
    };

    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(customOverrides);

})();

Assuming your custom list is called PermBasedField, and both jQuery (in my case it is jquery-1.9.1.min.js) and our custom JavaScript rendering template (in my case it’s called permissionBasedFieldTemplate2.js) are stored in the root of the Site Assets library of the root web, you can register the template using the following PowerShell script:

$web = Get-SPWeb http://YourSharePointSite
$list = $web.Lists["PermBasedField"]

$field = $list.Fields.GetFieldByInternalName("Status")
$field.JSLink = "~sitecollection/SiteAssets/jquery-1.9.1.min.js|~sitecollection/SiteAssets/permissionBasedFieldTemplate2.js"
$field.Update()

Note that, in contrast to the script introduced in the first part of this post, there is no need for the JSCOM JavaScript files (sp.runtime.js and sp.js) in this case.

July 26, 2016

Creating Project Server Enterprise Resources via REST

Filed under: Enterprise Resources, JavaScript, PS 2013, REST — Peter Holpar @ 21:27

Recently I’ve got a question on this blog about how to create Project Server enterprise resources using the REST interface. Although I consider this task to be a rather common requirement, there is really very little (if any) practical information about it on the web.

In this post I show you how to perform this operation using C# and JavaScript as well.

In the C# example I assume you have the following using statements in your code (the helper methods below need System.IO, System.Linq, System.Text and System.Xml.Linq in addition to System.Net):

using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Xml.Linq;

Furthermore, it is assumed you have a string field called baseUrl in your class that contains the URL of your PWA site.

private readonly string baseUrl = "http://YourProjectServer/PWA";

I’m using some helper methods, a few of them are borrowed from the code of this SharePoint StackExchange forum thread.

First we need to get a form digest value that we will use in our later POST requests. This task is performed by the GetFormDigestValue method.

private string GetFormDigestValue()
{
    string digest = null;

    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(this.baseUrl + "/_api/contextinfo");
    request.Method = "POST";
    request.ContentType = "application/json;odata=verbose";
    request.UseDefaultCredentials = true;
    var postBody = string.Empty;
    request.ContentLength = postBody.Length;

    byte[] postData = Encoding.ASCII.GetBytes(postBody);

    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(postData, 0, postData.Length);
        requestStream.Close();

        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        using (Stream responseStream = response.GetResponseStream())
        {
            using (var reader = new StreamReader(responseStream, Encoding.ASCII))
            {
                // parse the ContextInfo response
                var resultXml = XDocument.Parse(reader.ReadToEnd());

                // get the form digest value
                var d = from e in resultXml.Descendants()
                        where e.Name == XName.Get("FormDigestValue", "http://schemas.microsoft.com/ado/2007/08/dataservices")
                        select e;
                digest = d.First().Value;
            }
        }
    }

    return digest;
}
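The LINQ query above looks for the FormDigestValue element in the response of the contextinfo endpoint. For illustration, here is a sketch in JavaScript that extracts the digest from a hypothetical response fragment; the digest value is made up, and only the element the code above queries is shown:

```javascript
// Hypothetical contextinfo response fragment (digest value is made up);
// the real payload contains more elements around the one shown here.
var sampleXml =
    '<d:GetContextWebInformation xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">' +
    '<d:FormDigestValue>0x0AB1C2D3,27 Feb 2018 20:00:00 -0000</d:FormDigestValue>' +
    '</d:GetContextWebInformation>';

// Pulls the digest out regardless of the namespace prefix in use
function extractFormDigest(xml) {
    var match = /<(?:\w+:)?FormDigestValue>([^<]*)<\/(?:\w+:)?FormDigestValue>/.exec(xml);
    return match ? match[1] : null;
}

console.log(extractFormDigest(sampleXml));
```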

The POST requests will be prepared by the PreparePostRequest method before execution:

private HttpWebRequest PreparePostRequest(string requestPath, string formDigestValue, string postBody)
{
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(this.baseUrl + requestPath);
    request.Method = "POST";
    request.ContentType = "application/json;odata=verbose";
    request.Accept = "application/json;odata=verbose";
    request.Headers.Add("X-RequestDigest", formDigestValue);
    request.UseDefaultCredentials = true;
    request.ContentLength = postBody.Length;

    byte[] postData = Encoding.ASCII.GetBytes(postBody);

    System.IO.Stream requestStream = request.GetRequestStream();
    requestStream.Write(postData, 0, postData.Length);
    requestStream.Close();

    return request;
}

The ExecuteAuthorizedPostRequest method gets the form digest and prepares the request using the two methods above, then sends the request and reads the response returned by the server.

private HttpWebResponse ExecuteAuthorizedPostRequest(string requestPath, string postBody)
{
    string formDigestValue = GetFormDigestValue();
    HttpWebRequest request = PreparePostRequest(requestPath, formDigestValue, postBody);

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    return response;
}

Using the methods above, we can concentrate on the actual "mission": creating a new enterprise resource. In the code below I’ve included three different versions (the first two of them are commented out).

The first one creates an enterprise resource by setting its Name property only. By default, new resources are created with EnterpriseResourceType.Work as their ResourceType property.

In the second example we create an enterprise resource by setting its Name property and its ResourceType property to EnterpriseResourceType.Material.

In the third case we create an enterprise resource by setting its Name property and its CostCenter property.

public void CreateEnterpriseResource()
{
    string requestPath = "/_api/ProjectServer/EnterpriseResources";
    // creating a new resource setting only its name
    //string postBody = "{ '__metadata': { 'type': 'PS.EnterpriseResource' }, 'Name':'New Resource' }";
    // creating a new resource setting its name and type ('Material' in this case)
    //string postBody = string.Format("{{ '__metadata': {{ 'type': 'PS.EnterpriseResource' }}, 'Name':'New Material Resource', 'ResourceType': '{0}' }}", (int)EnterpriseResourceType.Material);
    // creating a new resource setting its name and cost center
    string postBody = string.Format("{{ '__metadata': {{ 'type': 'PS.EnterpriseResource' }}, 'Name':'New CC Resource', 'CostCenter': '{0}' }}", "Your Cost Center");

    var resp = ExecuteAuthorizedPostRequest(requestPath, postBody);
}

Note that the type of the object we use in the examples is PS.EnterpriseResource, and not the PS.EnterpriseResourceCreationInformation one would use to create a new resource via the client object model. Since EnterpriseResourceCreationInformation has only a limited subset of the properties provided by the EnterpriseResource object, we can set many more properties when creating the resource than we could using the client object model. For example, in the last example above we set the cost center of the enterprise resource on creation; that would not be possible via the EnterpriseResourceCreationInformation object.
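Since the request body is plain JSON, assembling it with JSON.stringify instead of hand-written string concatenation avoids quoting and escaping mistakes. A sketch of a small helper (the property names are the ones used in the examples above; the function name is made up):

```javascript
// Builds the POST body for creating a PS.EnterpriseResource;
// extraProps can carry additional properties such as ResourceType or CostCenter.
function buildResourceBody(name, extraProps) {
    var body = { __metadata: { type: 'PS.EnterpriseResource' }, Name: name };
    for (var key in (extraProps || {})) {
        body[key] = extraProps[key];
    }
    return JSON.stringify(body);
}

console.log(buildResourceBody('New CC Resource', { CostCenter: 'Your Cost Center' }));
```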

If you need to create the resources from JavaScript on a web page instead of C#, it is possible as well.

First, we define a String.format helper function:

String.format = (function () {
    // The string containing the format items (e.g. "{0}")
    // will and always has to be the first argument.
    var theString = arguments[0];

    // start with the second argument (i = 1)
    for (var i = 1; i < arguments.length; i++) {
        // "gm" = RegEx options for Global search (more than one instance)
        // and for Multiline search
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        theString = theString.replace(regEx, arguments[i]);
    }

    return theString;
});
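A quick usage check of the helper (the values below are made up):

```javascript
// Condensed copy of the String.format helper above
String.format = function () {
    var theString = arguments[0];
    for (var i = 1; i < arguments.length; i++) {
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        theString = theString.replace(regEx, arguments[i]);
    }
    return theString;
};

console.log(String.format("{0}//{1}", "http:", "YourProjectServer")); // http://YourProjectServer
```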

The JavaScript equivalent (using jQuery) of the three examples we saw above in the C# code:

var serverUrl = String.format("{0}//{1}", window.location.protocol, window.location.host);
var pwaServerRelUrl = "/PWA";

function startScript() {
    $.ajax({
        url: serverUrl + pwaServerRelUrl + "/_api/ProjectServer/EnterpriseResources",
        type: "POST",
        contentType: "application/json;odata=verbose",
        //data: JSON.stringify(item),
        // creating a new resource setting only its name
        //data: "{ '__metadata': { 'type': 'PS.EnterpriseResource' }, 'Name':'New Resource JS' }",
        // creating a new resource setting its name and type ('Material' in this case)
        //data: "{ '__metadata': { 'type': 'PS.EnterpriseResource' }, 'Name':'New Material Resource JS', 'ResourceType': '2' }",
        // creating a new resource setting its name and cost center
        data: "{ '__metadata': { 'type': 'PS.EnterpriseResource' }, 'Name':'New CC Resource JS', 'CostCenter': 'Your Cost Center' }",
        headers: {
            "Accept": "application/json;odata=verbose",
            "X-RequestDigest": $("#__REQUESTDIGEST").val()
        },
        complete: function (result) {
            var response = JSON.parse(result.responseText);
            if (response.error) {
                console.log(String.format("Error: {0}\n{1}", response.error.code, response.error.message.value));
            }
            else {
                console.log(String.format("Resource created with ID: {0}", response.d.Id));
            }
        }
    });
}

$(document).ready(startScript);

Assuming this script is saved in the file CreateResource.js in the Site Assets library of the PWA site, and the jquery-1.9.1.min.js can be found in that library as well, you can add a Script Editor web part to your page, and include these two lines in the web part to inject the resource creation functionality into the page (the paths below assume the Site Assets library is available at /PWA/SiteAssets):

<script type="text/javascript" src="/PWA/SiteAssets/jquery-1.9.1.min.js"></script>
<script type="text/javascript" src="/PWA/SiteAssets/CreateResource.js"></script>
Of course, you typically don’t want such automatic resource creation on page load in a real-world scenario. Instead, you should probably add some text fields for the resource properties (like name and cost center) and a button to the page, to submit the data and create the resources using the parameters the user entered in the text fields. The sole goal of the example above is to show what the request for creating a new enterprise resource looks like, and how to send it to the server.

May 16, 2016

How to Copy a Document between Folders of a SharePoint Document Library using VBA and REST

Filed under: REST, SP 2013, VBA — Peter Holpar @ 15:59

One of our customers wanted to copy Office documents (.docx, .xlsx, .xlsm) from a folder of a SharePoint document library into another one using Excel macros. The Excel document that contains the macro is located in the root of the document library.

For some mysterious reason that we could not really identify, the files were created with a size of 0 bytes at the target location for these file types when they were copied using the "classical" methods (more on these methods a bit later), although other file types, like text or image files, could be copied without problem.

So what are those classical methods, most of them already available in other blogs or forum threads?

Version 1

Copy the files using the FileSystemObject.

You should add a reference to the Windows Script Host Object Model library in VBA.

Note: We convert the URL of the document library into a UNC form in the ConvertPath method. That means it converts a URL like http://YourSharePoint/DocLib into \\YourSharePoint\DocLib. However, if you have configured HTTPS for your SharePoint, you need to convert the URL into this form: \\YourSharePoint@SSL\DavWWWRoot\DocLib. In this case, you should either extend the ConvertPath method, or simply use a fixed path in your code as a quick and dirty solution.

Function ConvertPath(path) As String
  ConvertPath = Replace(path, " ", "%20")
  ConvertPath = Replace(ConvertPath, "/", "\")
  ConvertPath = Replace(ConvertPath, "http:", "")
End Function
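The HTTPS case mentioned in the note above can be handled in the same conversion step. A sketch of the extended logic, written in JavaScript so it can be tested outside of Excel (the host names are placeholders):

```javascript
// Converts a SharePoint document library URL to a UNC path;
// handles both HTTP (\\server\path) and HTTPS (\\server@SSL\DavWWWRoot\path).
function convertPath(url) {
    var match = /^(https?):\/\/([^\/]+)(\/.*)?$/.exec(url);
    if (!match) { return url; }
    var host = match[2];
    var rest = (match[3] || "").replace(/\//g, "\\").replace(/ /g, "%20");
    return match[1] === "https"
        ? "\\\\" + host + "@SSL\\DavWWWRoot" + rest
        : "\\\\" + host + rest;
}

console.log(convertPath("http://YourSharePoint/DocLib"));  // \\YourSharePoint\DocLib
console.log(convertPath("https://YourSharePoint/DocLib")); // \\YourSharePoint@SSL\DavWWWRoot\DocLib
```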

Private Sub CopyFiles1()
  Dim sDocPath As String
  Dim sDocPathConv As String
  Dim sFileName As String
  Dim sTargetPath As String
  Dim sSourcePath As String
  Dim fso As FileSystemObject

  Set fso = New FileSystemObject ' CreateObject("Scripting.FileSystemObject")
  sDocPath = ThisWorkbook.Path
  sFileName = "WorkBook.xlsx"

  sDocPathConv = ConvertPath(sDocPath)

  sSourcePath = sDocPathConv & "\Folder1\" & sFileName
  Debug.Print "Source: " & sSourcePath
  sTargetPath = sDocPathConv & "\Folder2\" & sFileName
  Debug.Print "Target: " & sTargetPath

  fso.CopyFile sSourcePath, sTargetPath, True

End Sub

Version 2

Copy the files using SharePoint document library as mapped drive using the FileSystemObject.

In addition to the Windows Script Host Object Model library, you need an additional reference to the WSHControllerLibrary as well.

See a similar sample here.

The MapNetworkDrive method seems to handle the conversion of the SharePoint doc. lib. URL into an UNC form, but you might still need to invoke the conversion method if you receive this error on mapping the drive:

800704DC - The operation being requested was not performed because the user has not been authenticated

Private Sub CopyFiles2()
  Dim sDocPath As String
  Dim sFileName As String
  Dim sTargetPath As String
  Dim sSourcePath As String
  Dim sDriveLetter As String
  Dim fso As FileSystemObject
  Dim net As WshNetwork

  ' drive letter should be available (not mapped to a share already, to avoid error '80070055 - The local device is already in use')
  sDriveLetter = "S:"
  sFileName = "WorkBook.xlsx"
  Set fso = New FileSystemObject ' CreateObject("Scripting.FileSystemObject")
  sDocPath = ThisWorkbook.Path

  ' sDocPath = ConvertPath(sDocPath)

  Set net = New WshNetwork ' CreateObject("WScript.Network")
  Debug.Print "Path to map: " & sDocPath
  net.MapNetworkDrive sDriveLetter, sDocPath

  sSourcePath = sDriveLetter & "\Folder1\" & sFileName
  Debug.Print "Source: " & sSourcePath

  sTargetPath = sDriveLetter & "\Folder2\" & sFileName
  Debug.Print "Target: " & sTargetPath

  fso.CopyFile sSourcePath, sTargetPath, True

  net.RemoveNetworkDrive sDriveLetter

  Set net = Nothing
  Set fso = Nothing

End Sub

Version 3

We could have downloaded the file and uploaded it via web service calls, but I felt this second part was simply far too complex.

Version 4

We used a method to create temporary file names:

Private Declare Function GetTempPath Lib "kernel32" Alias "GetTempPathA" (ByVal nBufferLength As Long, ByVal lpBuffer As String) As Long

Private Declare Function GetTempFileName Lib "kernel32" Alias "GetTempFileNameA" _
  (ByVal lpszPath As String, _
  ByVal lpPrefixString As String, _
  ByVal wUnique As Long, _
  ByVal lpTempFileName As String) As Long

Public Function Get_Temp_File_Name( _
  Optional sPrefix As String = "VBA", _
  Optional sExtension As String = "") As String

  Dim sTmpPath As String * 512
  Dim sTmpName As String * 576
  Dim nRet As Long
  Dim F As String

  nRet = GetTempPath(512, sTmpPath)
  If (nRet > 0 And nRet < 512) Then
    nRet = GetTempFileName(sTmpPath, sPrefix, 0, sTmpName)
    If nRet <> 0 Then F = Left$(sTmpName, InStr(sTmpName, vbNullChar) - 1)
    If sExtension > "" Then
      Kill F
      If Right(F, 4) = ".tmp" Then F = Left(F, Len(F) - 4)
      F = F & sExtension
    End If

    Get_Temp_File_Name = F
  End If

End Function

Then, instead of copying directly between the document library folders as in the earlier versions, we copied the file first from the source folder to a local temporary file in the file system, then from the temporary file to the target folder.

That means, instead of

fso.CopyFile sSourcePath, sTargetPath, True

we used this:

Dim tempPath As String

tempPath = Get_Temp_File_Name
Debug.Print "Temp path: " & tempPath

fso.CopyFile sSourcePath, tempPath, True
fso.CopyFile tempPath, sTargetPath, True
fso.DeleteFile tempPath

However, it did not remedy our problem with the empty Office files created during the copy operation.

Version 5

This is our last sample code, and it is the one that works at the customer without any problem with the file size or content. In this case we utilize the getfilebyserverrelativeurl and copyto methods of the files and folders REST API.

Using this method has a further benefit that, depending on the file size, might even be a significant one. In contrast to the other methods described earlier, this one does not download or upload the file content. It only sends a command to the server to copy the file, after we get the form digest in the GetDigest method.

For this example to work, you need a reference to the Microsoft XML, v6.0 library in VBA.

Private Function GetDigest(url As String) As String
  Dim http As MSXML2.XMLHTTP

  Set http = New MSXML2.XMLHTTP

  http.Open "POST", url + "/_api/contextinfo", False
  http.setRequestHeader "Content-Type", "application/json;odata=verbose"
  http.send ""

  GetDigest = http.responseXML.SelectSingleNode("//d:FormDigestValue").nodeTypedValue
  Set http = Nothing

End Function

Private Sub CopyFiles3()
  Dim webAppUrl As String
  Dim serverRelUrlOfSite As String
  Dim siteUrl As String
  Dim docLibName As String
  Dim serverRelUrlOfDocLib As String
  Dim sFileName As String
  Dim sourcePath As String
  Dim targetPath As String

  Dim http As MSXML2.XMLHTTP
  Dim digest As String
  Dim url As String

  webAppUrl = "http://YourSharePoint"
  serverRelUrlOfSite = "/subsite1/subsite1.2"
  docLibName = "YourDocLib"
  sFileName = "WorkBook.xlsx"

  siteUrl = webAppUrl & serverRelUrlOfSite
  serverRelUrlOfDocLib = serverRelUrlOfSite & "/" & docLibName

  sourcePath = "/Folder1/" & sFileName
  Debug.Print "Source: " & sourcePath

  targetPath = "/Folder2/" & sFileName
  Debug.Print "Target: " & targetPath

  ' get the authentication digest
  digest = GetDigest(siteUrl)
  Set http = New MSXML2.XMLHTTP

  url = siteUrl & "/_api/web/getfilebyserverrelativeurl('" & serverRelUrlOfDocLib & sourcePath & "')/copyto(strnewurl='" & serverRelUrlOfDocLib & targetPath & "',boverwrite=true)"

  http.Open "POST", url, False
  http.setRequestHeader "X-RequestDigest", digest

  http.send ""
  Set http = Nothing

End Sub

If you need to move the files instead of copying, you should simply use the MoveTo method instead of the CopyTo method.
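The endpoint URL assembled in CopyFiles3 can also be expressed as a small helper; the sketch below (in JavaScript, function name made up) mirrors the VBA string concatenation, leaving out URL-encoding of special characters in file names:

```javascript
// Builds the REST URL for the copyto call (use moveto to move instead of copy)
function buildCopyUrl(siteUrl, sourceServerRelUrl, targetServerRelUrl, overwrite) {
    return siteUrl +
        "/_api/web/getfilebyserverrelativeurl('" + sourceServerRelUrl + "')" +
        "/copyto(strnewurl='" + targetServerRelUrl + "',boverwrite=" + (overwrite ? "true" : "false") + ")";
}

console.log(buildCopyUrl(
    "http://YourSharePoint/subsite1/subsite1.2",
    "/subsite1/subsite1.2/YourDocLib/Folder1/WorkBook.xlsx",
    "/subsite1/subsite1.2/YourDocLib/Folder2/WorkBook.xlsx",
    true));
```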

Note: As you see, this code does not contain any error handling, so please extend it if you would like to use it in production.

March 24, 2015

How to Read the Values of Fields bound to Lookup Tables via REST

Filed under: JavaScript, Project Server, REST — Peter Holpar @ 22:51

In my recent post I’ve illustrated how to read the values of Enterprise Custom Fields (ECF) that are bound to Lookup Tables. I suggest you read that post first, as it can help you better understand the relations between the custom field values and the internal names of the lookup table entries.

In this post I show you how to read such values using the REST interface. Instead of C#, I use JavaScript in this example. The sample code uses version 3.0.3-Beta4 of the LINQ for JavaScript library (the version is important, as this version contains lower-case function names in contrast to the former stable releases!) and version 1.8.3 of jQuery.

Assuming that these scripts are all located in the PSRESTTest/js subfolder in the Layouts folder, we can inject them via a Script Editor Web Part using this HTML code:

<script type="text/ecmascript" src="/_layouts/15/PSRESTTest/js/jquery-1.8.3.min.js"></script>
<script type="text/ecmascript" src="/_layouts/15/PSRESTTest/js/linq.min.js"></script>
<script type="text/ecmascript" src="/_layouts/15/PSRESTTest/js/GetCustFieldREST.js"></script>

In our GetCustFieldREST.js script we define the String.format helper function first:

String.format = (function () {
    // The string containing the format items (e.g. "{0}")
    // will and always has to be the first argument.
    var theString = arguments[0];

    // start with the second argument (i = 1)
    for (var i = 1; i < arguments.length; i++) {
        // "gm" = RegEx options for Global search (more than one instance)
        // and for Multiline search
        var regEx = new RegExp("\\{" + (i - 1) + "\\}", "gm");
        theString = theString.replace(regEx, arguments[i]);
    }

    return theString;
});

Another helper function supports sending fault-tolerant REST-queries:

function sendRESTQuery(queryUrl, onSuccess, retryCount) {

    var retryWaitTime = 1000; // 1 sec.
    var retryCountMax = 3;
    // use a default value of 0 if no value for retryCount passed
    var retryCount = (retryCount != undefined) ? retryCount : 0;

    //$.support.cors = true; // enable cross-domain query
    $.ajax({
        //beforeSend: function (request) {
        //    request.withCredentials = false;
        //},
        type: 'GET',
        //xhrFields: { withCredentials: false },
        contentType: 'application/json;odata=verbose',
        url: baseUrl + queryUrl,
        headers: {
            'X-RequestDigest': $('#__REQUESTDIGEST').val(),
            "Accept": "application/json; odata=verbose"
        },
        dataType: "json",
        complete: function (result) {
            var response = JSON.parse(result.responseText);
            if (response.error) {
                if (retryCount <= retryCountMax) {
                    window.setTimeout(function () { sendRESTQuery(queryUrl, onSuccess, retryCount + 1) }, retryWaitTime);
                }
                else {
                    alert("Error: " + response.error.code + "\n" + response.error.message.value);
                }
            }
            else {
                onSuccess(response.d);
            }
        }
    });
}


The baseUrl variable holds the root of the REST endpoint for Project Server. I assume your PWA site is provisioned to the PWA managed path.

var baseUrl = String.format("{0}//{1}/PWA/_api/ProjectServer/", window.location.protocol, window.location.host);

We call the getResCustProp method when the page is loaded:

$(document).ready(getResCustProp);
In the getResCustProp method I first query the InternalName of the custom field, as well as the InternalName and the FullValue properties of the lookup entries of the corresponding lookup table. In a second query I read the custom field value for the specified enterprise resource, and compare the value (or values) stored in the field with the InternalName property of the lookup table entries from the first query. Note that we should escape the underscore in the InternalName property, and should use the eval JavaScript function, as we don't know the name of the custom field (that is, the name of the property) at design time.

function getResCustProp() {
    var resourceName = "Brian Cox";
    var fieldName = "ResField";
    var fieldQueryUrl = String.format("CustomFields?$expand=LookupTable/Entries&$filter=Name eq '{0}'&$select=InternalName,LookupTable/Entries/InternalName,LookupTable/Entries/FullValue", fieldName);
    sendRESTQuery(fieldQueryUrl, function (fieldResponseData) {
        var field = fieldResponseData.results[0];
        var lookupTableEntries = Enumerable.from(field.LookupTable.Entries.results);
        var resourceQuery = String.format("EnterpriseResources?$filter=Name eq '{0}'&$select={1}", resourceName, field.InternalName);
        sendRESTQuery(resourceQuery, function (resourceResponseData) {
            var resource = resourceResponseData.results[0];
            var encodedInternalName = field.InternalName.replace('_', '_x005f_');
            var fieldValue = eval("resource." + encodedInternalName);
            Enumerable.from(fieldValue.results).forEach(function (fv) {
                var entry = lookupTableEntries.first(String.format('$.InternalName == "{0}"', fv));
                alert(String.format("Value: '{0}', Entry: '{1}'", entry.FullValue, fv));
            });
        });
    });
}
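The underscore escaping and the dynamic property access can be sketched without eval as well, since bracket notation handles property names that are only known at runtime (the resource object below is a made-up stand-in for a REST response entry). Note that replace with a string pattern only substitutes the first underscore, which is why a global regex is used here:

```javascript
// Encodes underscores the way the OData interface encodes them in property names
function encodeInternalName(internalName) {
    return internalName.replace(/_/g, '_x005f_');
}

// Bracket notation instead of eval for a property name computed at runtime
function getFieldValue(resource, internalName) {
    return resource[encodeInternalName(internalName)];
}

// Made-up resource entry mimicking the shape of the REST response
var sampleResource = { 'Custom_x005f_abc123': { results: ['Entry_1'] } };
console.log(getFieldValue(sampleResource, 'Custom_abc123').results[0]); // Entry_1
```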

Of course, if you would like to use the code in production, you should add further data validation (whether any resource and custom field is returned by the queries, etc.) and error handling to the method.

February 19, 2014

Fault-tolerant REST Queries

Filed under: jQuery, REST, SP 2010, SP 2013 — Peter Holpar @ 22:15

Recently I faced again the problem that, when the scripts on my web pages submit REST queries after a longer pause or an IISRESET (that is, when the IIS application pools must be started), I receive one of the following errors on the first try:

'Microsoft.SharePoint.DataService.ListNameItem' does not have a property named 'FieldName'. (SharePoint 2010, HTTP 400 - Bad Request in the browser)
An error occurred while processing this request. (SharePoint 2010, HTTP 500 - Internal Server Error in the browser)
No property 'FieldName' exists in type 'Microsoft.SharePoint.Linq.DataServiceEntity' at position 0. (SharePoint 2013, HTTP 400 - Bad Request in the browser)

When one submits the query via the browser, only the HTTP 400 / HTTP 500 error codes are displayed. You can see the detailed error message only via a network traffic analyzer tool, like Fiddler.

The subsequent queries (even using the same expression as originally) usually succeed. If we submit multiple queries at the same time (for example via a script on a page), it might happen that several of these queries fail with the same error.

How can we avoid this kind of failure in a web application, where it is especially important to prevent such erratic behavior?

Using jQuery we can create a function like the one below to re-submit the query automatically and silently in case of errors. I applied a time gap of 1 second in the sample, to give the system time to wake up, and a maximum of 3 retries. If we encounter an error even after the 3rd retry, we simply display it to the user.

// alter the value to match your site
var baseUrl = 'http://yoursharepoint/yoursite/_vti_bin/listdata.svc/';

function sendRESTQuery(queryUrl, onSuccess, retryCount) {
    var retryInterval = 1000; // 1 sec.
    var retryCountMax = 3;
    // use a default value of 0 if no value for retryCount passed
    retryCount = (retryCount != undefined) ? retryCount : 0;

    $.ajax({
        type: 'GET',
        contentType: 'application/json;odata=verbose',
        url: baseUrl + queryUrl,
        headers: {
            'X-RequestDigest': $('#__REQUESTDIGEST').val(),
            "Accept": "application/json; odata=verbose"
        },
        dataType: "json",
        complete: function (result) {
            var response = JSON.parse(result.responseText);
            if (response.error) {
                // here you can include your custom logic, for example, to differentiate based on the error type
                if (retryCount < retryCountMax) {
                    // pass the incremented count explicitly; using retryCount++
                    // would resubmit with the old value and never stop retrying
                    window.setTimeout(function () { sendRESTQuery(queryUrl, onSuccess, retryCount + 1); }, retryInterval);
                }
                else {
                    alert("Error: " + response.error.code + "\n" + response.error.message.value);
                }
            }
            else {
                onSuccess(response.d);
            }
        }
    });
}

Assuming we have an HTML page with a SELECT element having id = "dropItemList", and a SharePoint list called YourList, the following code snippet demonstrates how to fill up the options of the SELECT with the list items:

var dropItemListSelector = "#dropItemList";

function initDropItemList() {
    var queryUrl = "YourList()?$select=Id,Title";
    sendRESTQuery(queryUrl, function (responseData) {
        var items = responseData.results;
        $(dropItemListSelector).find('option').remove().end().append('<option value="0">Please choose an item!</option>');
        Enumerable.from(items).orderBy("$.Title").forEach(function (x) {
            $(dropItemListSelector).append('<option value="' + x.Id + '">' + x.Title + '</option>');
        });
    });
}

In the previous example I used linq.js ver. 3.0.4-Beta5 to assemble my LINQ query. The version of the library is important!
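The version matters because the 3.x releases of linq.js renamed the API to camelCase (Enumerable.from, orderBy, forEach), while the 2.x releases used PascalCase (Enumerable.From, OrderBy, ForEach), so a snippet written for one major version fails on the other. If you want to see what the orderBy("$.Title") call actually does without the library, it can be sketched in plain JavaScript (the helper name below is illustrative):

```javascript
// What Enumerable.from(items).orderBy("$.Title") does, expressed in
// plain JavaScript: return a new array sorted by the Title property,
// leaving the input array unchanged.
function orderByTitle(items) {
    return items.slice().sort(function (a, b) {
        if (a.Title < b.Title) { return -1; }
        if (a.Title > b.Title) { return 1; }
        return 0;
    });
}
```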

July 23, 2013

Creating a Flash Video Player For SharePoint using the built-in web parts – Version 1

Filed under: CEWP, jQuery, REST, SP 2010 — Tags: , , , — Peter Holpar @ 20:51

Recently I had to provide our users a way to display a list of Flash video files (FLV) with the ability to play the selected one. I found a similar solution on CodePlex; however, I did not like the idea of installing a custom web part just to fulfill this requirement, and instead implemented my own lightweight solutions using the standard SharePoint web parts. In this post I show you the implementation based on the Content Editor Web Part (CEWP); in a later post I will publish another version based on the Content Query Web Part (CQWP).

Common to both implementations is that they utilize Flowplayer version 3.2.16. I created a folder called flowplayer on the SharePoint server at 14\TEMPLATE\LAYOUTS, and copied the following files from the downloaded flowplayer package into the new folder:

(from the example subfolder)

The Flash video files are stored in this case in a document library called "VideoLib1". I set the Title property of the files to provide a short description of the content beyond the file name (Name property).


Additionally, I needed jQuery version 1.8.3, json2.js (to support JSON.parse in the Quirks document mode of Internet Explorer, like the WSS 3.0-compatible master page of SharePoint 2010) and the LINQ for JavaScript (ver. 3.0.3-Beta4) library (the version is important due to the incompatible syntax of former versions). All of these JavaScript libraries were placed, for the sake of simplicity, in the jQuery folder in Layouts.

I’ve created a new web part page, added a new CEWP to the page, and set the source code of the web part according to the sample below:

Code Snippet
<script src="/_layouts/Flowplayer/flowplayer-3.2.12.min.js"></script>
<script src="/_layouts/jQuery/jquery-1.8.3.min.js"></script>
<script src="/_layouts/jQuery/json2.js"></script>
<script src="/_layouts/jQuery/linq.js"></script>
<script src="/_layouts/jQuery/linq.jquery.js"></script>

<link rel="stylesheet" type="text/css" href="/_layouts/Flowplayer/style.css">
<span><a style="display:block;width:520px;height:330px" class="myPlayer"></a></span>
<div class="videoList" style="text-align:left;"></div>

<script language="ecmascript">

    function startClip(src, title) {
        flowplayer().play([
            { url: src, title: title }
        ]);
    }

    $(document).ready(startScript);

    function startScript() {
        // initialize Flowplayer in the 'myPlayer' anchor (A) HTML tag
        flowplayer("a.myPlayer", "/_layouts/Flowplayer/flowplayer-3.2.16.swf");

        // ctx assumes a ListViewWebPart (it might be even hidden) on the same page
        var siteUrl = ctx.HttpRoot;

        // clear the video list DIV (videoList)
        $('.videoList').empty();

        $.ajax({
            type: 'GET',
            contentType: 'application/json;odata=verbose',
            // get the (file)Name and Title property of each .FLV file
            url: siteUrl + "/_vti_bin/listdata.svc/VideoLib1?$filter=endswith(Name,'.flv')&$select=Name,Title",
            headers: {
                "Accept": "application/json; odata=verbose"
            },
            dataType: "json",
            complete: function (result) {
                var response = JSON.parse(result.responseText);
                if (response.error) {
                    alert("Error: " + response.error.code + "\n" + response.error.message.value);
                }
                else {
                    var videos = response.d.results;
                    Enumerable.from(videos).forEach(function (x) {
                        // the video clip will be started on the click event
                        var link = '<p><a href="javascript:void(0)" title="' + x.Name.substring(0, x.Name.length - 4) + '" onclick="startClip(\'' + x.__metadata.media_src + '\', \'' + x.Title + '\')">' + x.Title + '</a></p>';
                        // append the video link to the video list DIV (videoList)
                        $('.videoList').append(link);
                    });
                }
            },
            error: function (xmlHttpRequest, textStatus, errorThrown) {
                alert(errorThrown);
            }
        });
    }
</script>


In this script we request the Name and Title properties of the .flv files from our document library through the REST API, and add the corresponding links to the play list. Alternatively, one could use the Client OM as well, but in this case I found REST to be simpler.
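For clarity, the verbose-format JSON returned by listdata.svc for this query has roughly the following shape; the field values below are made up for illustration, and only the properties the script actually reads (d.results, and per item Name, Title and __metadata.media_src) are shown:

```javascript
// Illustrative shape of the listdata.svc response for the query above;
// the real response contains further properties per entity.
var sampleResponse = {
    d: {
        results: [
            {
                __metadata: { media_src: "http://yourserver/VideoLib1/intro.flv" },
                Name: "intro.flv",
                Title: "Introduction"
            },
            {
                __metadata: { media_src: "http://yourserver/VideoLib1/demo.flv" },
                Name: "demo.flv",
                Title: "Product demo"
            }
        ]
    }
};
```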

To be able to send the REST request, we need the URL of the current SharePoint web. The JavaScript variable ctx referred to in my script is available only if there is at least one ListViewWebPart on the same page. If you don't have any, you can add a dummy one (for example, the movie document library itself), and set it to be hidden, as illustrated below:


If you don’t like this hack, you can get the URL through the Client OM as well, through an extra asynchronous call.
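A minimal sketch of that Client OM alternative, assuming sp.js is loaded on the page (the function name is illustrative):

```javascript
// Retrieve the current web URL via the JavaScript Client OM
// instead of relying on the ctx variable of a ListViewWebPart.
function getWebUrl(onDone) {
    var clientContext = SP.ClientContext.get_current();
    var web = clientContext.get_web();
    clientContext.load(web, 'Url');
    clientContext.executeQueryAsync(
        function () {
            // the Url property is populated only after the async round-trip
            onDone(web.get_url());
        },
        function (sender, args) {
            alert('Request failed: ' + args.get_message());
        }
    );
}
```

The REST call would then be issued from the onDone callback, since the URL becomes available only after the asynchronous call returns.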

The image below illustrates the end result rendered by the CEWP:


Clicking on the title of the movie will start the video playback.

You can read about a similar solution utilizing CQWP here.
