PowerShell foreach

The PowerShell foreach loop is one of the most powerful tools in any scripting toolkit. It lets you easily repeat a set of actions for every item in a collection.

Below is a classic example that highlights the power of the foreach loop:

$servers = "Srv101","Srv102","Srv103"

foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        Install-WindowsFeature BitLocker

        New-Item HKLM:\SOFTWARE\Policies\Microsoft\FVE -Force
        Set-Location HKLM:\SOFTWARE\Policies\Microsoft
        Set-ItemProperty FVE -Name UseAdvancedStartup -Value 1
        Set-ItemProperty FVE -Name EnableBDEWithNoTPM -Value 1

        Restart-Computer -Force   # reboot so the BitLocker install completes
    }
}

In the above script we install BitLocker, configure the registry to allow encryption without the use of a TPM chip, and reboot the computer to make sure the BitLocker install completes properly. On three machines, using one script!

When using the foreach loop you'll often be working with arrays, so here are some tips on using them:

A quick and dirty integer array using simple comma notation:

$myarray = 1,2,3

A more elaborate syntax achieving the same result:

$myarray = @(1,2,3)
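A few more array tricks worth knowing, all plain PowerShell you can paste into any console:

```powershell
# The range operator builds sequential integer arrays quickly
$myarray = 1..5            # 1,2,3,4,5

# Arrays have a Count property and support zero-based (and negative) indexing
$myarray.Count             # 5
$myarray[0]                # 1
$myarray[-1]               # 5 (negative index counts from the end)

# += creates a new array with the extra element appended
$myarray += 6
$myarray.Count             # 6
```

Note that += allocates a fresh array each time, so for very large collections a generic list performs better.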

If you need something more elaborate, I suggest you look into ConvertFrom-Json, which gives you a very powerful way to manage configurations and multiple arrays.

Config Values for PowerShell Scripts

In this post I will share my favourite config approach for PowerShell scripts.

Often you'll find yourself writing one set of scripts that will be used for several machines or environments. Most of the commands will be the same for each server; only a few values change. If you've used a scripted installation tool for SharePoint, such as AutoSPInstaller, then you'll be all too familiar with using a config file. AutoSPInstaller uses an XML file to store its config values. While that works, I find it becomes very hard to read and decipher. I much prefer using JSON notation instead. Here is an example to get you started:

Create a config.json file using your favourite text editor. Notice the following key syntax elements:

  • All parameter names and string values need to be in quotes; only integer values can (and must) skip the quotes.
  • The whole content needs to be wrapped in one set of curly braces {}
  • Arrays of sub-objects, as in the Disks array, use square brackets []
  • Each sub-object needs to be wrapped in curly braces {}
   {
     "domain": "DMZ",
     "Name": "SQL01",
     "Role": "SQL",
     "IP" : "",
     "DNS" : "",
     "GateWay" : "",
     "Description": "DMZ SQL Server 2016",
     "Disks": [
       {
         "ID": 0,
         "Name": "OS",
         "SizeInBytes": 274877906944,
         "Partitions": [
           {
             "ID": 0,
             "Letter": "C",
             "Size": 261872,
             "Label": "OS"
           }
         ]
       },
       {
         "ID": 1,
         "Name": "Data",
         "SizeInBytes": 549755813888,
         "Partitions": [
           {
             "ID": 0,
             "Letter": "D",
             "Size": 524288,
             "Label": "Data"
           }
         ]
       }
     ]
   }

Then you can import it into your PowerShell script (InitVM.ps1), best via a command parameter, and start addressing the values using simple dot notation:

param($configPath)

$config = Get-Content -Raw -Path $configPath | ConvertFrom-Json
New-NetIPAddress -InterfaceAlias "Ethernet" -IPAddress $config.IP -PrefixLength 24 -DefaultGateway $config.GateWay
Set-DnsClientServerAddress -InterfaceAlias "Ethernet" -ServerAddresses ($config.DNS)
Rename-Computer -NewName $config.Name   # takes effect at the next restart

foreach ($disk in $config.Disks | Where-Object {$_.ID -gt 0}) {
    Set-Disk $disk.ID -IsOffline $false
    Get-Disk $disk.ID | Initialize-Disk -PartitionStyle GPT
    foreach ($partition in $disk.Partitions) {
        $sizeInBytes = $partition.Size * 1048576   # config sizes are in MB
        New-Partition -DiskNumber $disk.ID -Size $sizeInBytes -DriveLetter $partition.Letter | Format-Volume -FileSystem NTFS -NewFileSystemLabel $partition.Label
    }
}
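A quick sanity check on the MB-to-bytes conversion used in the partition loop (1 MB = 1048576 bytes):

```powershell
# 524288 MB * 1048576 bytes/MB = 549755813888 bytes = 512 GB,
# which matches the SizeInBytes of the Data disk in the config
524288 * 1048576 -eq 549755813888    # True
```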

The above script is quite useful if you're building machines from scratch. It first sets the IP address, gateway and DNS, then renames the computer, and finally initialises the disks and formats the volumes based on your configuration.

You will need to restart the machine to make the name change stick, though. So if you plan on doing more, you might want to consider splitting tasks into multiple scripts. Thanks to your new config file approach, all you need to do is re-import the config and continue where you left off.

That's it. Now you can execute the script using a simple parameter and use a different config for each server/environment/farm:

.\InitVM.ps1 -configPath config.json
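Dot notation also works on nested objects and arrays from ConvertFrom-Json. Here is a quick demonstration you can paste into any PowerShell console (the JSON is a trimmed-down version of the config above):

```powershell
$json = '{ "Name": "SQL01", "Disks": [ { "ID": 0, "Name": "OS" }, { "ID": 1, "Name": "Data" } ] }'
$config = $json | ConvertFrom-Json

$config.Name                                          # SQL01
$config.Disks[1].Name                                 # Data
($config.Disks | Where-Object { $_.ID -gt 0 }).Name   # Data, same filter as the disk loop
```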

Ultimate SharePoint Tool Kit

Thank you to Tobias Zimmergren for his great post on SharePoint 2013 tools. I’ve taken some of his suggestions and added some of my own to this list.

Check out his original post here:


 SharePoint Provisioning Tools

Using schema definition files to provision SharePoint artefacts is becoming a dying practice. Two highly talented development teams have set out to solve the issues you face with versioning and re-deployment of SharePoint artefacts under the traditional approach. Since SP2013 supports ever richer CSOM capabilities, we can now take full control of deploying our artefacts, version-control them, handle upgrades elegantly and repackage solutions more efficiently than ever before.

Some of the issues that the following libraries solve:

  • Create site collections with multiple sub-sites in one go
  • Re-deploy changes to lists and libraries while content is present
  • Apply advanced settings to lists and libraries that are not available in the schema definitions
  • Apply security logic
  • Apply business logic while provisioning changes to a site

SharePoint PnP (Patterns and Practices)

Started as the Provisioning Engine for SharePoint 2007, which was used to create Publishing Portals, and has evolved into a fully functional templating and provisioning engine for any SharePoint artefact.



Driven by TDD, the author of this package wanted a provisioning framework that was ROBUST. Tired of SharePoint stumbling over its own feet every time it tried to redeploy a list, library or content type, they built a code-driven provisioning framework from scratch. Pretty impressive, but it comes with a steep learning curve.


 Visual Studio Enhancements

CKS Dev Tools

A must-have for rapid testing of ghosted changes. Careful: some changes require an IISReset and/or re-deployment, some don't. You'll figure it out fast enough 🙂


Debugging Assistant

Allows you to be more specific about which process you're recycling, which speeds up the IISReset cycle. And believe me, you want to speed that up.


ULS Logs Add-In

This brings the ULS logs into your IDE. I have not worked with it, but it looks worthy of a good trial.



Analyses your code on the fly and applies rules that highlight where you are breaking standards and conventions. You can add/remove rules to adapt the tool to your own standards. It also has built-in refactoring capabilities to clean up your code with a few clicks.



The SharePoint Code Analysis Framework is the SharePoint version of ReSharper: it checks for SharePoint-specific gotchas. Very useful to keep your CSOM in shape!


 Standalone tools you Must have

CAML Designer

You do not want to learn CAML inside out; you'd rather eat a frog for brekky. Using an automated tool to translate "give me all items from Shared Documents where the department is IT and created is within the last week" into CAML in a few seconds will make CAML a JOY!


SharePoint Manager

Has gotten me out of very tight spots in the past, when I accidentally deployed very bad CAML and caused endless grief. With this tool you navigate your way through the object model and can not only read values but, in some instances, modify them. Also very useful for extracting field definitions and content type schemas from your prototyping sites on the fly.


SharePoint Client Browser

SPM does not work with Office 365, as it requires the SharePoint server DLLs to be loaded in the GAC. So if you want to look behind the curtains of your online tenant, or don't have local admin access to the SP box, then try out the SP2013 Client Browser.


(you’re allowed to look, but not touch!)

SharePoint Log Viewers

This one is my favourite as it supports filtering and live logging. What it lacks is the ability to copy values from the list and filter by selection.


SharePoint Search tool

A great way to learn, by trial and error, how to build search requests against the search API and handle the response that comes back.



Once started, this tool will log ALL HTTP/S traffic originating from your computer, along with the responses. A great tool to debug responses, errors, web service calls, REST URL formats, claims issues and SAML tokens.



When you are completely stuck and need to figure out how SharePoint works behind the scenes, Reflector will show you a C# or VB representation of the SharePoint DLLs by reading the IL (Intermediate Language) from the .NET assemblies and presenting the logic in human-readable form. Super useful if you need to find a workaround for a SharePoint bug and can't wait for the hotfix.


Telnet Client

By adding/removing Windows features you can add the Telnet client to your command line. I've found this tool great for very quick connectivity tests when things don't seem to want to talk to each other.

e.g. test the connection to the corporate SMTP server on port 25, test the SQL connection on port 1433, etc.

Simply open a command prompt after you've installed it, type telnet followed by the server and port, and see if it opens a comms channel. If it does, you're good to go. If it times out, check your firewalls and configurations: maybe SQL is listening on a non-standard port, maybe your Exchange server won't accept your SharePoint server. By the way, look for EHLO on my blog for instructions on creating a test email message using Telnet. Useful to see if relay is properly configured on your main SMTP server.
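The quick checks described above look like this in practice (the host names below are examples only; substitute your own servers):

```powershell
telnet smtp.corp.local 25    # can we reach the corporate SMTP relay?
telnet sql01 1433            # is SQL Server listening on its default port?
```

A blank session window means the port is open; a timeout points at a firewall or a service listening elsewhere.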

 PowerShell Tools

Visual Studio Add-In

Planning on writing lots of PowerShell tools and want to keep them in source control? VS would be a great place to start. Sadly, it does not do PowerShell syntax highlighting natively; this tool will make writing PowerShell in VS a breeze:



Dell has now bought up PowerGUI, which is still a very nifty tool for writing and debugging PowerShell scripts.


SharePoint Installer

ALL SharePoint installations should use this tool:


 REST tools

Chrome add-in: Advanced REST client

A great tool to quickly test your REST commands, especially when you want to experiment with POST, PUT and DELETE.



I've been using this tool for several months now and it's pretty slick! It allows you to build a collection of HTTP requests that you can re-execute on a regular basis, lets you build the input with JSON highlighting, and shows the response with different highlighting options too. The feature that makes it pretty awesome is its test client, which allows you to write assert statements in JavaScript and build your own "unit" tests around the REST calls.

It also comes with Newman, the command-line test runner, which you can then execute automatically on your favourite build server (Jenkins comes to mind).



This nifty tool will intercept all SMTP traffic coming from your local SharePoint server (if you installed SMTP on the machine and configured SP to use the local SMTP instance rather than your corporate email server, that is). Great for figuring out whether SharePoint is properly configured, and also great when you're testing new workflows but don't want to bombard the users with emails, as this tool will hijack the outgoing emails for you.

 JavaScript debuggers

By now all browsers have pretty solid JS, HTML and CSS debugging capabilities. Just press F12 in your favourite browser and start experimenting with the DOM.

Some tips: you can put breakpoints into JS functions, and even into inline JS.

But the thing I use most: being able to tweak CSS styles and style attributes on the fly in the browser. Nothing better than sitting with the customer and making colour and positioning changes on the canvas in seconds, then recording those tweaks and baking them back into the solution.

SharePoint Min Roles

Why the SharePoint 2016 MinRoles make Max Sense

Microsoft introduced the MinRole concept to the SharePoint 2016 farm setup: https://technet.microsoft.com/EN-US/library/mt346114%28v=office.16%29.aspx

You had the choice of adding a server of type "FrontEnd", "Distributed Cache", "Search", or "Application".

So if you wanted to build a farm using MinRoles and ensure all components were available, you needed a minimum of four servers. So MS gave us a fifth option, "Custom", which effectively bypasses the MinRole concept and allows you to provision anything you want on that server.

That has led to many farms consisting of two servers: a FrontEnd and a Custom server hosting the rest. This tended to overload the Custom server and was less than optimal, but adding any new services to the front end would make it non-compliant!

In November last year MS introduced Feature Pack 1 for SharePoint (just a prettier name for a Service Pack), which not only fixed a whole range of cumulative bugs but also introduced new features, such as two new roles:

FrontEnd+Cache and Search+App.

Yay! So now we can build small two-server farms using the MinRole approach and balance the load better across the two servers. Plus, having the cache as close as possible to the user endpoint makes a lot of performance sense.
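Joining a server under one of the new shared roles works the same way as with the original roles, via the -LocalServerRole parameter. A sketch, to be run on the server being joined; the database server, config database name and passphrase below are placeholders:

```powershell
# Join an SP2016 Feature Pack 1 farm as a shared FrontEnd + Distributed Cache server
Connect-SPConfigurationDatabase -DatabaseServer "SQL01" `
    -DatabaseName "SP_Config" `
    -Passphrase (ConvertTo-SecureString "pass@word1" -AsPlainText -Force) `
    -LocalServerRole WebFrontEndWithDistributedCache
```

The other new shared role is exposed as the ApplicationWithSearch value of the same parameter.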


Why MinRoles?

After years of troubleshooting SharePoint farm stability issues, I can add my 2c worth:

The Distributed Cache does not like anyone stealing its RAM. This service, while crucial for operating a stable farm, is one of the most fickle and unstable components I have come across from MS. If you try co-locating it on a search server you'll have endless grief, as both will be competing for the same precious RAM. Forget about Dynamic Memory allocation. I mean it. Forget it even exists. If you want more than one server in your distributed cache, then make sure both have the same spec; any change in how memory is made available to either machine can bring down the cache.

Verdict: if you can, host the Distributed Cache on its own server(s) (this goes for 2013 farms too!). Never on the search server, and if co-locating it, then only on the front-end server using the new MinRole option.
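If you do give the cache its own server (or the FrontEnd+Cache role), it is also worth pinning the cache to a fixed memory size rather than leaving the install-time default. The standard cmdlets below do that; the 2048 MB figure is just an example, size it to your own RAM budget:

```powershell
# Gracefully stop the cache instance on this server, then resize the cache
Stop-SPDistributedCacheServiceInstance -Graceful
Update-SPDistributedCacheSize -CacheSizeInMB 2048
```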

Search will eat up all the resources you can throw at it: RAM, CPU and disk. It's the hungriest beast in the whole farm and will bring any other services you try to co-locate on the server to a grinding halt. The crawler will chew through CPU cycles and RAM when performing its crawl cycle, which on continuous crawl is about ALL THE TIME. Disk IO will be hammered by the creation and compilation of the search index, and your drives will fill up nicely depending on how many TB of content your search server is indexing.

Verdict: always host the search crawl, admin, content processing and analytics components on a separate server from the rest of the farm. You can co-locate those search topology components if your load and usage patterns allow for it, or break them out onto separate servers. Plan your disk space wisely for the server(s) hosting the index component, and feel free to co-locate the query component on the front-end servers.

That leaves us with the IIS web app components (Front End) and the remaining service applications. In SharePoint 2016 the Application role and the Front End role are very similar; they both host most of the components required to serve content to users. The main difference is that the front-end servers have been optimised for low latency and the application servers for high throughput.

Low latency means that the front-end servers benefit from hosting the service applications locally instead of being forced to connect to service apps on another machine.

With this configuration you do wonder why you need the App role at all. All it does is host the application discovery and load-balancing components and a small handful of extra services.

Verdict: with the new App+Search role you can now easily co-host the App role on the search server in small farms, eliminating a whole server for the paltry few services that the App role runs.