
PNP PowerShell: Maintain all your Termset data across tenants

The Term store manager available in SharePoint enables companies to manage their enterprise-specific taxonomy easily through new term groups and term sets. This metadata can then be referenced by users for selecting choices when filling in profile or content-related data.

Enterprise taxonomies can sometimes contain dozens of groups with too many term sets and terms to manage, update or copy manually. There are standard ways to export taxonomies into a .csv file and import them into the term store on a different tenant.

But what if you want to not only export term sets and term labels but also their other term-specific data and configuration such as:

  • Localised labels for each term
  • Synonyms
  • Navigation settings (if the term set is intended for site navigation)
  • Custom properties associated with each term
  • The exact GUID of the term

The above data may not be interesting for end users, but for administrators, content creators and developers these additional elements of a term are very important.

Fortunately, we can export the full term set configuration using the powerful and very useful PnP PowerShell cmdlets.

Thanks to the efforts of the Microsoft Patterns & Practices (PnP) community, we now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and as administrators we find we can accomplish many more tasks using CSOM.

Specifically, the cmdlet that we can use is:

Export-PnPTermGroupToXml – Enables us to export a term group and all its underlying terms’ settings to an XML-based output file

Import-PnPTermGroupFromXml – Enables us to import a term group from an XML file

Export your taxonomy

To use the cmdlet, I first need to enter credentials and connect to my SPO tenant's content type hub site collection:

Once connected, I simply need to pass an output file and the name of the term group I want to export.
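Put together, the two steps might look like this — a minimal sketch using the classic PnP PowerShell module, where the tenant URL, output path and the group name "Enterprise Taxonomy" are placeholders:

```powershell
# Connect to the content type hub site collection (prompts for credentials)
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

# Export the term group - labels, synonyms, custom properties and GUIDs included
Export-PnPTermGroupToXml -Identity "Enterprise Taxonomy" -Out "C:\Temp\TermGroup.xml"
```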

Looking at the exported XML, you can see that all the relevant term settings, including GUIDs, are now available to import into another term store.

Importing your taxonomy

The import is done in a similar manner.

Connect to the destination tenant

Pass the XML file as a parameter as seen below
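The two import steps can be sketched the same way (again with a placeholder URL and file path, assuming the classic PnP PowerShell module):

```powershell
# Connect to the destination tenant
Connect-PnPOnline -Url "https://destinationtenant.sharepoint.com" -Credentials (Get-Credential)

# Import the term group from the exported XML file
Import-PnPTermGroupFromXml -Path "C:\Temp\TermGroup.xml"
```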

That’s it!

PNP PowerShell: Managing Content Type Artefacts across a single or multiple Office 365 tenants

Creating content types in SharePoint has always been relatively easy for site and content administrators. Furthermore, with the Content Type Hub feature, custom content types can be centrally defined and pushed out to all site collections. The challenges and difficulties, however, arise when you want to make inherent changes to these site objects or want those exact site objects to be present across your DTAP (Dev, Test, Acceptance & Production) environments.

For instance,

  • I’ve created my custom content types in my dev tenant. How do I migrate the changes to production?
  • How can I update the internal name of a field within a content type and ensure that the change is reflected everywhere?

Actions like these were (and still are) generally avoided because there was no good way of accomplishing them. It is still very good practice to thoroughly prepare and review what's needed before creating custom content types. Making changes to these artefacts still requires effort, especially when there is content already using them.

Fortunately, managing existing content types has become easier, thanks to the efforts of the Microsoft Patterns & Practices (PnP) community.

We now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and as administrators we find we can accomplish many more tasks using CSOM.

You can go through the PnP Cmdlet documentation here https://github.com/SharePoint/PnP-PowerShell/tree/master/Documentation

To create content types and manage changes to these artefacts, I want to focus on the following two PnP cmdlets:

Get-PnPProvisioningTemplate: Enables you to extract a site template with all, or a partial set, of the site artefacts. The extract is an XML file which can be reviewed and updated.

Apply-PnPProvisioningTemplate: Enables you to apply an extracted site template to an existing site, essentially providing you with a means to apply changes to all sites in a tenant, or to a different tenant.

The overall process then would look like this:

Create custom artefacts in content type hub

As usual, create your fields and content types in the content type hub. I recommend that you:

  • Tag these artefacts in a custom group so they are easily identifiable
  • Decide on a naming convention for both fields and content types that helps others to see that these are custom artefacts
  • Avoid spaces in field names when initially creating them. Otherwise you end up with internal names looking like this

Here the space is replaced with the hexadecimal string “_x0020_”. This is not a critical issue, but it can be avoided and corrected.

I’ve created a content type in a unique group:

With a custom field Document Category

Extract artefacts using Get-PnPProvisioningTemplate:

Using the cmdlet, I first enter credentials and connect to my SPO tenant's content type hub site collection.

Then I extract only the Fields and Content Types using the -Handlers parameter.
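The connect-and-extract steps can be sketched as follows (placeholder URL and output path; this assumes the classic PnP PowerShell module, where the parameter is -Handlers):

```powershell
# Connect to the content type hub site collection
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

# Extract only the Fields and Content Types into an XML template
Get-PnPProvisioningTemplate -Out "C:\Temp\template.xml" -Handlers Fields,ContentTypes
```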

Make changes to your artefacts in XML

In your XML output file, you will find all the Fields and Content Types. You can find the relevant ones by searching for the group name (“Demo Only” in my case).

You can now edit field properties such as the StaticName and Name

Be sure to update the reference to the updated field name in the corresponding content types as well. In my case I had created a “Demo Content type”

Modified to

Once you're satisfied with your changes, save the XML file; you are then ready to apply the changes to the original content type hub site collection.

Apply changes using Apply-PnPProvisioningTemplate

Connect to your content type hub site collection again:

Run the Apply-PnPProvisioningTemplate with the updated xml file as an input:
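These two steps might look like this (placeholder URL and file path, assuming the classic PnP PowerShell module):

```powershell
# Connect to the content type hub site collection again
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

# Apply the updated template to the site
Apply-PnPProvisioningTemplate -Path "C:\Temp\template.xml"
```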

I changed the static name of “Document_x0020_Category” to “Document_Category”, which is now reflected when viewing the field column URL:

This was a simple demonstration of the scripting tools available to manage changes to site artefacts that were previously difficult or impossible to update.

Changes can now be pushed out to all site collections by republishing the updated content type:

Using this same technique, with a bit more preparation, you can also extract a set of custom content types from one tenant and apply them to another, thereby keeping field names, content types and their internal GUIDs all intact!

PowerShell: Publishing all files in a SharePoint Online library programmatically

One of our clients built up a library of 500+ documents. After these were modified (metadata was added and the content went through several rounds of corrections), we were asked to mass publish all files so the site could go live. That leaves us with two options: 1. manually check in, publish and approve all files, or 2. put some CSOM and PowerShell together in a file and do it programmatically. Of course, I, Mark Overdijk, chose to pursue the second option. I asked Massimo Prota to assist in getting a script ready. The first version of the script turned out rather useful, so I added some extra features and more output to, and interaction with, the user. This latest version is generic enough that it can be re-used.

Features

- No limitation on the number of files for a list/library
- Added code to filter which files should be published
- User will be prompted for password and confirmation
- Feedback to user on screen
- If a file is checked out, the script will check it in before proceeding
- If Content Approval is enabled for the list/library, the script will approve the file
- Screen output will be saved to a txt file which includes the current date/time in the filename

Prerequisites Powershell

Step 1. Gather parameters

For the script to run properly, you'll need the following parameters;

.SiteUrl: This is the full URL to the (sub)site where the list is stored for which you want to publish/approve the files

.ListName: This is the Title of the list for which you want to publish/approve the files

.UserName: This is the UserName that has enough permissions to publish/approve the files

Step 2. Run PowerShell script

Start Windows Powershell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

[code language="powershell"]
####################################
# Script: PublishFilesSPO.ps1      #
# Version: 2.0                     #
# Rapid Circle (c) 2016            #
# by Mark Overdijk & Massimo Prota #
####################################

# Clear the screen
Clear-Host

# Add Wave16 references to SharePoint client assemblies - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

# Parameters
# Specify the subsite URL where the list/library resides
$SiteUrl = "https://DOMAIN.sharepoint.com/SUBSITE"
# Title of the list/library
$ListName = "TITLE"
# Username with sufficient publish/approve permissions
$UserName = "USER@DOMAIN.com"
# User will be prompted for password

# Set transcript file name
$Now = Get-Date -UFormat %Y%m%d_%H%M%S
$File = "PublishFilesSPO_$Now.txt"
# Start transcript
Start-Transcript -Path $File | Out-Null

# Display the data to the user
Write-Host "/// Values entered for use in script ///" -ForegroundColor Cyan
Write-Host "Site: " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green
Write-Host "List name: " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green
Write-Host "Useraccount: " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green

# Prompt user for password
$SecurePassword = Read-Host -Prompt "Password" -AsSecureString
Write-Host "All files in " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green -NoNewline; Write-Host " on site " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green -NoNewline; Write-Host " will be published by " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green
Write-Host " "

# Prompt to confirm
Write-Host "Are these values correct? (Y/N) " -ForegroundColor Yellow -NoNewline; $confirmation = Read-Host

# Run script when user confirms
if ($confirmation -eq 'y') {

    # Bind to site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
    $Context.Credentials = $credentials

    # Bind to list and query all items
    $List = $Context.Web.Lists.GetByTitle($ListName)
    $collListItem = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
    $Context.Load($List)
    $Context.Load($collListItem)
    $Context.ExecuteQuery()

    # Go through the process for all items
    foreach ($ListItem in $collListItem) {
        # Adding spacer
        Write-Host " "
        Write-Host "/////////////////////////////////////////////////////////////"
        Write-Host " "
        # Write the item ID, file name and modified date for each item that will be published
        Write-Host "Working on file: " -ForegroundColor Yellow -NoNewline; Write-Host $ListItem.Id, $ListItem["FileLeafRef"], $ListItem["Modified"]

        # Un-comment the "if" below when you want to add a filter for which files will be published.
        # Fill out the details for files that should be skipped. The example skips all files last modified before 31-jan-2015.
        # if ($ListItem["Modified"] -lt "01/31/2015 00:00:00 AM"){
        #     Write-Host "This item was last modified before January 31st 2015" -ForegroundColor Red
        #     Write-Host "Skip file" -ForegroundColor Red
        #     continue
        # }

        # Check whether the file is checked out by testing if the "Checked Out By" column is not empty
        if ($ListItem["CheckoutUser"] -ne $null) {
            # File is checked out, so check it in first
            Write-Host "File: " $ListItem["FileLeafRef"] "is checked out." -ForegroundColor Cyan
            $ListItem.File.CheckIn("Auto check-in by PowerShell script", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
            Write-Host "- File Checked in" -ForegroundColor Green
        }

        # Publish the file
        Write-Host "Publishing file:" $ListItem["FileLeafRef"] -ForegroundColor Cyan
        $ListItem.File.Publish("Auto publish by PowerShell script")
        Write-Host "- File Published" -ForegroundColor Green

        # If Content Approval is enabled for the list/library, approve the file.
        # The "Approval Status" column value '0' means Approved.
        if ($List.EnableModeration -eq $true) {
            if ($ListItem["_ModerationStatus"] -ne '0') {
                # File is not approved, so the approval process is applied
                Write-Host "File:" $ListItem["FileLeafRef"] "needs approval" -ForegroundColor Cyan
                $ListItem.File.Approve("Auto approval by PowerShell script")
                Write-Host "- File Approved" -ForegroundColor Green
            }
            else {
                Write-Host "- File has already been Approved" -ForegroundColor Green
            }
        }
        $Context.Load($ListItem)
        $Context.ExecuteQuery()
    }

    # Adding footer
    Write-Host " "
    Write-Host "/////////////////////////////////////////////////////////////"
    Write-Host " "
    Write-Host "Script is done" -ForegroundColor Green
    Write-Host "Files have been published/approved" -ForegroundColor Green
    Write-Host "Thank you for using PublishFilesSPO.ps1 by Rapid Circle" -ForegroundColor Cyan
    Write-Host " "
}
# Stop script when user doesn't confirm
else {
    Write-Host " "
    Write-Host "Script cancelled by user" -ForegroundColor Red
    Write-Host " "
}
Stop-Transcript | Out-Null

##############################
# Rapid Circle               #
# http://rapidcircle.com.au  #
##############################
[/code]

PowerShell: Terminate a workflow for all items in a list on SharePoint Online

This is a follow-up to our previous post "PowerShell: Start a workflow for all items in a list on SharePoint Online". While it's great that there's now a script available to start a workflow for all items, it would also be great to have the ability to stop all workflows if necessary. So I, Mark Overdijk, got to work again with Massimo Prota to get this script in place. The script is very similar to the StartWorkflow PowerShell script; the difference is that we don't retrieve the workflow through WorkflowAssociations but have to use WorkflowInstances.

Prerequisites Powershell

Step 1. Gather parameters

For the script to run properly, you'll need the following parameters;

.SiteUrl: This is the full URL to the (sub)site where the list is stored for which you want to run the workflow

.ListName: This is the Title of the list for which you want to run the workflow

.UserName: This is the UserName that has enough permissions to run the workflow

Step 2. run PowerShell script

Start Windows Powershell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

Copy/Paste the code below in a txt file and save as an *.ps1 file (in this example "StopWorkflow.ps1"). Fill out the parameters with the gathered information and run the script.

PowerShell stop workflow

[code language="powershell"]
# Add Wave16 references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.WorkflowServices.dll"

# Specify tenant admin and site URL
$SiteUrl = "https://[TENANT].sharepoint.com/"
$ListName = "[TITLE OF THE LIST]"
$UserName = "[USERNAME]"
$SecurePassword = Read-Host -Prompt "Enter password" -AsSecureString

# Bind to site collection
$ClientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$ClientContext.Credentials = $credentials
$ClientContext.ExecuteQuery()

# Get list
$List = $ClientContext.Web.Lists.GetByTitle($ListName)
$ClientContext.Load($List)
$ClientContext.ExecuteQuery()

# Get all list items
$ListItems = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$ClientContext.Load($ListItems)
$ClientContext.ExecuteQuery()

# Create WorkflowServicesManager instance
$WorkflowServicesManager = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager($ClientContext, $ClientContext.Web)

# Connect to WorkflowSubscriptionService
$WorkflowSubscriptionService = $WorkflowServicesManager.GetWorkflowSubscriptionService()

# Connect to WorkflowInstanceService
$WorkflowInstanceService = $WorkflowServicesManager.GetWorkflowInstanceService()

$ClientContext.Load($WorkflowServicesManager)
$ClientContext.Load($WorkflowSubscriptionService)
$ClientContext.Load($WorkflowInstanceService)
$ClientContext.ExecuteQuery()

# Get workflow associations for the list
$WorkflowAssociations = $WorkflowSubscriptionService.EnumerateSubscriptionsByList($List.Id)
$ClientContext.Load($WorkflowAssociations)
$ClientContext.ExecuteQuery()

# Prepare terminate workflow payload (not used by TerminateWorkflow below)
$EmptyObject = New-Object System.Object
$Dict = New-Object 'System.Collections.Generic.Dictionary[System.String,System.Object]'

# Loop through all list items and terminate their workflow instances
For ($j = 0; $j -lt $ListItems.Count; $j++) {

    $msg = [string]::Format("Killing workflows {0} on ListItemID {1}", $WorkflowAssociations[0].Name, $ListItems[$j].Id)
    Write-Host $msg

    $itemWfInstances = $WorkflowInstanceService.EnumerateInstancesForListItem($List.Id, $ListItems[$j].Id)
    $ClientContext.Load($itemWfInstances)
    $ClientContext.ExecuteQuery()

    for ($k = 0; $k -lt $itemWfInstances.Count; $k++) {
        try {
            $WorkflowInstanceService.TerminateWorkflow($itemWfInstances[$k])
            $msg = "Workflow terminated on " + $ListItems[$j].Id
            $ClientContext.ExecuteQuery()
        }
        catch {
            $msg = "Error terminating workflow on " + $ListItems[$j].Id + " Details: $_"
        }
        Write-Host $msg
    }
}
[/code]

PowerShell: Start a workflow for all items in a list on SharePoint Online

For one of our Office 365 clients (mix of E1 and E3 licences) we created a workflow which checks the status of an item and, depending on this status, sends out e-mails and updates other columns. As the list was already in use, it was necessary to start the workflow for all existing items. But starting the workflow manually for all 477 items was not preferable. So I, Mark Overdijk, asked Massimo Prota to help me on the quest to see if it would be possible to do it via PowerShell. As there are no PowerShell commands available for SharePoint Online to access the workflow instance, we searched for CSOM solutions. We came across this script on GitHub. Thanks to Azam-A we had a base script to work from. What we changed/added was the following:

  • Referenced the new wave16 components as Office 365 is already on wave16.
  • Added feedback in the script when it runs. It'll show for each item the item ID for which the script is starting the workflow.
  • For obvious security reasons we're not storing the user's Admin password as plain text, but prompt for the password.

Prerequisites Powershell

Step 1. Gather required parameters

For the script to run properly, you'll need the following parameters;

.SiteUrl: This is the full URL to the (sub)site where the list is stored for which you want to run the workflow

.ListName: This is the Title of the list for which you want to run the workflow

.UserName: This is the UserName that has enough permissions to run the workflow

 

Step 2. Run PowerShell script

Start Windows Powershell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

Copy/Paste the code below in a txt file and save as an *.ps1 file (in this example "StartWorkflow.ps1"). Fill out the parameters with the gathered information and run the script.

StartWorkflow1 screenshot in  Powershell

[code language="powershell"]
# Add Wave16 references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll")
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll")
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.WorkflowServices.dll")

# Specify tenant admin and site URL
$SiteUrl = "https://[TENANT].sharepoint.com/"
$ListName = "[TITLE OF THE LIST]"
$UserName = "[USERNAME]"
$SecurePassword = Read-Host -Prompt "Enter password" -AsSecureString

# Connect to site
$ClientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$ClientContext.Credentials = $credentials
$ClientContext.ExecuteQuery()

# Get list and list items
$List = $ClientContext.Web.Lists.GetByTitle($ListName)
$ListItems = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$ClientContext.Load($List)
$ClientContext.Load($ListItems)
$ClientContext.ExecuteQuery()

# Retrieve WorkflowService related objects
$WorkflowServicesManager = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager($ClientContext, $ClientContext.Web)
$WorkflowSubscriptionService = $WorkflowServicesManager.GetWorkflowSubscriptionService()
$WorkflowInstanceService = $WorkflowServicesManager.GetWorkflowInstanceService()
$ClientContext.Load($WorkflowServicesManager)
$ClientContext.Load($WorkflowSubscriptionService)
$ClientContext.Load($WorkflowInstanceService)
$ClientContext.ExecuteQuery()

# Get workflow associations for the list
$WorkflowAssociations = $WorkflowSubscriptionService.EnumerateSubscriptionsByList($List.Id)
$ClientContext.Load($WorkflowAssociations)
$ClientContext.ExecuteQuery()

# Prepare start workflow payload
$Dict = New-Object 'System.Collections.Generic.Dictionary[System.String,System.Object]'

# Loop through list items and start the workflow on each
For ($j = 0; $j -lt $ListItems.Count; $j++) {
    $msg = [string]::Format("Starting workflow {0}, on ListItemId {1}", $WorkflowAssociations[0].Name, $ListItems[$j].Id)
    Write-Host $msg
    # Start workflow on list item
    $Action = $WorkflowInstanceService.StartWorkflowOnListItem($WorkflowAssociations[0], $ListItems[$j].Id, $Dict)
    $ClientContext.ExecuteQuery()
}
[/code]

If, for some reason, you want to stop/terminate all workflows, check this blogpost: PowerShell: Terminate a workflow for all items in a list on SharePoint Online

Walkthrough: Add Geolocation column to your list in Office 365

A while ago a client (with an Office 365 E3 subscription) came to us with the wish to create a map to plot locations of external contractors on. My first thoughts, as an Office 365 consultant, went towards using the tools at hand. SharePoint 2013/Online has a Geolocation column type and the list view type "Map view". The client agreed to use this feature and I went about setting up the solution. So I posed the self-fulfilling prophecy: "How hard can it be?"...

As the list with the data was already in place, I was neither keen on letting a developer create a solution which either creates a new list with the column in it (and me migrating data) nor writing a solution which adds the column programmatically once. I wanted to add the column directly through (a reusable) script and went on to do my desk research. This ended up taking way too much time as almost all information found…

  • ...were solutions for SharePoint 2013 on premise,
  • ...were articles on the end-result,
  • ...posted failing scripts,
  • ...did not offer information on the Bing Maps key,
  • ...did not offer guides/information specifically for Office 365/SharePoint Online scenarios.

Something as simple as "what to use as the Bing Maps application URL for an Office 365 tenant?" was not to be found.

It took a while, but when I finally got the settings right for a Bing Maps key and a working script, I decided on 2 things;

  1. Create a generic script, because as a consultant I'll want to use this script more than once for multiple tenants.
  2. Write a blog post as a definitive guide to add the geolocation column type in Office 365/SharePoint Online as a resource for the community

Scenario

For the walkthrough I'm using the following scenario: as a global admin for the tenant https://yourcompany.sharepoint.com, I'm adding the geolocation column type to the list "Contact" on the subsite https://yourcompany.sharepoint.com/sites/sales and naming the column "Office".

Step 1. Get a Bing Maps Key

Go to the Bing Maps Dev Center: https://www.bingmapsportal.com/

Log in with your Live account (@live.com, @outlook.com, etcetera) or create one to gain access.

Go to My account > Create or view keys

bingmapsdev1

To create a new API key follow the "Click here to create a new key" hyperlink

bingmapsdev2
bingmapsdev3

Fill out the form to create your API key

. Application name: The name you would like to use for your key. It helps you to identify the key in your overview

. Application URL: The URL of your root SharePoint portal (https://tenant.sharepoint.com)

. Key type (Trial/Basic): Choose whether you're using the key for 1) a test site (max 10,000 calls p/mth and max 90-day trial period) or 2) a live site (free for max 125,000 calls p/yr) (more info here)

. Application type: What is the application? App, site, for non-profit use, etc.

In this scenario, the admin fills it out as follows:

  • Application name: Sales Office
  • Application URL: https://yourcompany.sharepoint.com
  • Key type: Basic
  • Application Type: Public Website

After you click Create and fill out the Captcha correctly, the page refreshes and displays your new key below. You'll receive a 64-character key.

bingmapsdev4

Step 2. Gather required information

For the script to run properly, you'll need the following information;

. Site URL: URL to the site where the list is.

. Login account: at least admin permission as you're changing list settings.

. List Name: name of the list to add the geolocation column type.

. Column Name: title of the geolocation column.

. Bing Maps Key: to register the app and remove the notification in map view.

In this example, the admin has gathered the following info;

  • Site URL: https://yourcompany.sharepoint.com/sites/sales
  • Creds: Admin@YourCompany.onmicrosoft.com
  • List Name: Contact
  • Column Name: Office
  • Bing Maps Key: [PASTE KEY HERE]

Now we can run the script.

Step 3. Run script

Start SharePoint Online Management Shell as administrator

If you don't have SharePoint Online Management Shell, you can download it @ Microsoft Download Center

[code language="powershell"]
Set-ExecutionPolicy Unrestricted
Clear-Host
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')

<# Get user input #>
$SiteURL = [Microsoft.VisualBasic.Interaction]::InputBox("Enter Site URL, example: https://yourtenant.sharepoint.com/sites/yoursite", "URL", "")
$Login = [Microsoft.VisualBasic.Interaction]::InputBox("Office 365 Username, example: youradmin@yourtenant.onmicrosoft.com", "Username", "")
$ListName = [Microsoft.VisualBasic.Interaction]::InputBox("List name to add Geolocation column", "ListName", "")
$ColumnName = [Microsoft.VisualBasic.Interaction]::InputBox("Column name for the Geolocation column", "ColumnName", "")
$BingMapsKey = [Microsoft.VisualBasic.Interaction]::InputBox("Bing Maps key", "Key", "")

<# Show results #>
Write-Host "/// Values entered for use in script ///" -ForegroundColor Magenta
Write-Host "Site: " -ForegroundColor White -NoNewline; Write-Host $SiteURL -ForegroundColor Green
Write-Host "Useraccount: " -ForegroundColor White -NoNewline; Write-Host $Login -ForegroundColor Green
Write-Host "List name: " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green
Write-Host "Geolocation column name: " -ForegroundColor White -NoNewline; Write-Host $ColumnName -ForegroundColor Green
Write-Host "Bing Maps key: " -ForegroundColor White -NoNewline; Write-Host $BingMapsKey -ForegroundColor Green
Write-Host " "

<# Confirm before proceeding #>
Write-Host "Are these values correct? (Y/N) " -ForegroundColor Yellow -NoNewline; $confirmation = Read-Host
if ($confirmation -eq 'y') {
    $WebUrl = $SiteURL
    $EmailAddress = $Login

    # Bind to the site and authenticate
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($WebUrl)
    $Credentials = Get-Credential -UserName $EmailAddress -Message "Please enter your Office 365 Password"
    $Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($EmailAddress, $Credentials.Password)

    # Add the Geolocation field to the list and its default view
    $List = $Context.Web.Lists.GetByTitle("$ListName")
    $FieldXml = "<Field Type='Geolocation' DisplayName='$ColumnName'/>"
    $Option = [Microsoft.SharePoint.Client.AddFieldOptions]::AddFieldToDefaultView
    $List.Fields.AddFieldAsXml($FieldXml, $true, $Option)
    $Context.Load($List)
    $Context.ExecuteQuery()

    # Store the Bing Maps key in the web property bag
    $web = $Context.Web
    $web.AllProperties["BING_MAPS_KEY"] = $BingMapsKey
    $web.Update()
    $Context.ExecuteQuery()
    $Context.Dispose()

    Write-Host " "
    Write-Host "Done!" -ForegroundColor Green
    Write-Host " "
}
else {
    Write-Host " "
    Write-Host "Script cancelled" -ForegroundColor Red
    Write-Host " "
}
[/code]
The actual programming part of the script I modified from a script posted in a blog post by Albert Hoitingh. I wanted to remove the hardcoded values from the code so the script can be run based on user input, so I added the interface (input boxes, confirmation, Write-Hosts), replaced the hardcoded values and added comments.

When you run the script, PowerShell will ask the user to input the information we gathered in Step 2.

geoscript1
geoscript2
geoscript3
geoscript4
geoscript5

After the last values have been entered, the admin will see a confirmation screen where the values can be reviewed and confirmed (if the input is incorrect, the script can be cancelled by entering "N" to not proceed - screenshot).

geoscript6

After confirmation, the admin will be prompted to enter the password.

geoscript7

If everything was filled out correctly, the script will run and return with the "Done!" notification upon completion.

geoscript8

Return to your SharePoint Online list and you'll notice that, when creating a new view for the list "Contact", you now have the option Map View. When checking the list settings, the column "Office" of type Geolocation has been added.

mapview1

Are you missing information, do you want me to clarify anything, do you want to post a conversation starter or do you just want to say thanks? Leave a comment.

Convert security groups to mail-enabled and universal for Office 365 with PowerShell

by Thomas Verwer, Technical Consultant @ Rapid Circle

When carrying out projects for enterprise clients, I commonly face challenges with companies not meeting the system requirements for Office 365. One of the most commonly missing requirements is on the Identity and Access Management part of Office 365.

When migrating legacy Identity and Access Management infrastructures to Office 365 you quickly bump into Microsoft’s Active Directory Services (ADS). To migrate this service to Windows Azure Active Directory (which is part of every Office 365 license) you can use the Windows Azure Active Directory Sync tool, known to most IT professionals as “DirSync”, a special edition in the history of FIM.

Now back to business. To migrate legacy security groups to Windows Azure Active Directory for products such as Exchange Online, it is a requirement that the groups have a GroupScope of Universal (see image below).

Get-ADGroup-GroupScope

Since most companies still use Global security groups, these need to be converted. For this I use a PowerShell script which automates the process. For the script to work, import the ActiveDirectory module in PowerShell or run the script with the Active Directory Module for Windows PowerShell.

Clear-Host

if ((Get-Module | Where-Object {$_.Name -eq "ActiveDirectory"}) -eq $null) { Import-Module ActiveDirectory }
$scriptPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
Set-Location $scriptPath
Write-Output "Output will be stored in " (Get-Location)

$SeaBase = "DC=corp,DC=local"
$SeaVal = "CN=Mailbox_*"
$SeaScope = "Subtree"
$GrpList = "ADSecGrp.csv"
$UniGrpList = "Uni_ADSecGrp.csv"
$strLogFile = "ErrorLog.txt"
$DomainAdmin = Get-Credential

# Search for all groups that are of type Security, have scope Global and start with "Mailbox_"
$SecGrps = Get-ADGroup -SearchScope $SeaScope -SearchBase $SeaBase -Filter {GroupCategory -eq "Security" -and GroupScope -eq "Global"}

foreach ($secGrp in $SecGrps) {
    try {
        $DN = $secGrp | Where-Object {$_.DistinguishedName -like $SeaVal}
        $DN | Export-Csv $GrpList -Append
    } catch {
        throw
    }
}

(Get-Content $GrpList | Select-Object -Skip 1) | Set-Content $GrpList

Write-Output "Check $GrpList to verify all exported security groups are of type Global"
Write-Output "Press Y to continue"
$selection = Read-Host
if ($selection -eq "y" -or $selection -eq "Y") {
    Write-Output "$GrpList CSV file checked...."
    foreach ($G in Import-Csv $GrpList) {
        try {
            $D = $G.DistinguishedName
            Get-ADGroup -Identity $G.SID
            Set-ADGroup -Identity $G.SID -GroupScope Universal -Credential $DomainAdmin
        } catch {
            $ErrorMessage = $_.Exception.Message
            Write-Output "Error converting for $D ..`n Error Message : $ErrorMessage" | Add-Content $strLogFile
            throw
        }
        $DN = Get-ADGroup -Identity $G.SID
        $DN | Export-Csv $UniGrpList -Append
    }
    (Get-Content $UniGrpList | Select-Object -Skip 1) | Set-Content $UniGrpList
    Write-Output "Check $UniGrpList to verify all modified security groups are of type Universal"
} else {
    Write-Output "Script stopped by user" | Add-Content $strLogFile
    Break
}

As you can see, the script contains several variables. With these you can define the scope of OUs or the naming convention for existing security groups. When run, the script first builds a CSV file called ADSecGrp.csv. When paused, you can open and check the file to see if it contains the groups you wish to convert. If so, enter "Y" and the script proceeds, writing the converted groups to Uni_ADSecGrp.csv.

After we have successfully changed the GroupScope to Universal, we can carry on with the second PowerShell script, which mail-enables the security groups so they meet the requirements for Exchange Online. Besides mail-enabling them, it also hides the groups from the Global Address List.

Run this script on one of the legacy Exchange servers using the Exchange Management Shell.

Clear-Host

# if ((Get-Module | Where-Object {$_.Name -eq "ActiveDirectory"}) -eq $null) {
#     Import-Module ActiveDirectory
# }

# Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010
# . "$env:ExchangeInstallPath\bin\RemoteExchange.ps1"
# Connect-ExchangeServer -auto

Write-Output “Output will be stored in ” (Get-Location)

$GrpList = "Final_ADSecGrp.csv"
$strLogFile = "enableErrorLog.txt"
$Log = "AfterLog.txt"
$ErrorLog = "ErrorLog.txt"

foreach ($G in Import-Csv "Uni_ADSecGrp.csv") {

    try {
        Get-ADGroup -Identity $G.SID

        Enable-DistributionGroup -Identity $G.DistinguishedName -Alias $G.Name
        Set-DistributionGroup -Identity $G.DistinguishedName -HiddenFromAddressListsEnabled $true
        Get-DistributionGroup -Identity $G.DistinguishedName | Add-Content $Log
        $x = Get-DistributionGroup -Identity $G.DistinguishedName
        if ($x -ne $Null) {
            Write-Output $G.DistinguishedName
        } else {
            Write-Output $G.DistinguishedName | Add-Content $ErrorLog
        }
    } catch {
        $ErrorMessage = $_.Exception.Message
        Write-Output "Error Enable-DistributionGroup for $($G.DistinguishedName) .....`nError Message : $ErrorMessage" | Add-Content $strLogFile
        throw
    }

}

Once you have successfully executed the second script you can add these objects to your Windows Azure Directory Sync cycle. Please be aware that the converted group names may not contain unsupported characters, such as spaces or the & character.
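Before starting the sync cycle you could, for example, scan the generated CSV for suspect names. This is a minimal sketch, assuming the Uni_ADSecGrp.csv produced by the first script; the character set in the pattern is illustrative, not the complete set of characters directory sync rejects:

```powershell
# Sketch: flag converted groups whose names contain characters that can
# cause directory sync issues (pattern is illustrative, not exhaustive)
$invalidPattern = '[&\s@#]'
Import-Csv "Uni_ADSecGrp.csv" | ForEach-Object {
    if ($_.Name -match $invalidPattern) {
        Write-Output "Check group name before sync: $($_.Name)"
    }
}
```

Any group flagged here should be renamed before it is included in the sync scope.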

Best of luck to you all in carrying out successful Office 365 deployments. Before publishing this post, I have to share my respect for my colleague and teammate Dev Chaudhari for working on the scripting!

Original blog on: thomasverwer.com

Create a sub site based on a custom (Sandboxed) Web Template using Powershell

One quick tip when using PowerShell to create a SharePoint site structure combined with custom Web Templates (in my case sandboxed ones). The correct syntax for creating webs based on your custom (sandboxed) Web Template is:

$site = Get-SPSite http://myserver/sites/mysite
$web = New-SPWeb http://myserver/sites/mysite/subsite
$web.ApplyWebTemplate("{FeatureGUID}#MyTemplate")

In this case you cannot use the -Template parameter of the New-SPWeb cmdlet. Hopefully this post saves you from wondering why your PS script is failing.

If you are trying to figure out the correct name for the web template, you can also use this PS command to check which templates are available in your current site.

$site = Get-SPSite http://myserver/sites/mysite
$site.GetWebTemplates($lcid)

$lcid of course being the language ID.
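For example, to list all web templates available in an English site (LCID 1033), including the Feature-GUID-prefixed names of sandboxed templates, something like the following sketch should work (the site URL is a placeholder):

```powershell
# Sketch: show template names and titles for an English (1033) site
$site = Get-SPSite http://myserver/sites/mysite
$site.GetWebTemplates(1033) | Select-Object Name, Title
```

The Name column is the value to pass to ApplyWebTemplate.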

Inconvenient Recurrence Data from a SharePoint Calendar using Client OM (JS)

The CAML used to get SharePoint items from lists has a nice feature when it comes to calendar data. Using <DateRangesOverlap/> it is possible to check whether a given date falls within a range of dates, and when the ExpandRecurrence property is set it even takes the recurrence data into account. In Sandboxed Solutions we can easily set this property: SPQuery.ExpandRecurrence = true; However, in the JavaScript Client OM this property is not available.

WebServices

The only way to query SharePoint for recurrence data using the Client OM is through the GetListItems method of the Lists web service (/_vti_bin/lists.asmx). The XML passed to the web service can contain QueryOptions, and we can use this to specify that ExpandRecurrence should be true. The SOAP XML looks like this:

<soapenv:Envelope xmlns:soapenv='http://schemas.xmlsoap.org/soap/envelope/'>
  <soapenv:Body>
    <GetListItems xmlns='http://schemas.microsoft.com/sharepoint/soap/'>
      <listName>[MyListName]</listName>
      <query>
        <Query>
          <Where>
            <DateRangesOverlap>
              <FieldRef Name='EventDate' />
              <FieldRef Name='EndDate' />
              <FieldRef Name='RecurrenceID' />
              <Value Type='DateTime'>
                <Today />
              </Value>
            </DateRangesOverlap>
          </Where>
        </Query>
      </query>
      <queryOptions>
        <QueryOptions>
          <ExpandRecurrence>TRUE</ExpandRecurrence>
          <CalendarDate>
            <Today />
          </CalendarDate>
          <ViewAttributes Scope='RecursiveAll' />
        </QueryOptions>
      </queryOptions>
      <viewFields>
        <ViewFields>
          <FieldRef Name='EventDate' />
          <FieldRef Name='EndDate' />
          <FieldRef Name='fAllDayEvent' />
          <FieldRef Name='fRecurrence' />
          <FieldRef Name='Title' />
        </ViewFields>
      </viewFields>
      <rowLimit>10</rowLimit>
    </GetListItems>
  </soapenv:Body>
</soapenv:Envelope>

This example checks for events overlapping with <Today />. I found it best to work with tags like <Today/>, <Week/>, <Month/> and <Year/> instead of UTC dates. Notice the <queryOptions/> wrapper passing the options; building regular CAML XML using only the inner <QueryOptions/> tag unfortunately didn't work. Both <ExpandRecurrence/> and <CalendarDate/> are mandatory for this query to work.
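When experimenting with the envelope above, it can be handy to post it to the web service outside the browser first. A hedged PowerShell sketch of such a test call, where the site URL is a placeholder and $soapXml holds the envelope shown above:

```powershell
# Sketch: post the GetListItems SOAP envelope to lists.asmx for testing
# (site URL is a placeholder; $soapXml contains the envelope above)
$uri = "http://myserver/sites/mysite/_vti_bin/lists.asmx"
$headers = @{ "SOAPAction" = "http://schemas.microsoft.com/sharepoint/soap/GetListItems" }
$response = Invoke-WebRequest -Uri $uri -Method Post -Body $soapXml `
    -ContentType "text/xml; charset=utf-8" -Headers $headers -UseDefaultCredentials
[xml]$result = $response.Content   # inspect the returned rows, recurrences expanded
```

Once the envelope returns the expected expanded recurrences, the same XML can be posted from JavaScript.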

Tip: when using the Client OM to request SharePoint data, it's best practice to specify as many properties as possible. Specifying <ViewFields/> returns only those fields instead of every field on an item, and if you know the maximum number of results, specifying <RowLimit/> also improves the performance of your query.

Create publishing pages with PowerShell in SharePoint 2010

For a client we created an extended PowerShell script to do a one-click deployment of their intranet. This was awesome to create because it saved huge amounts of time creating sites and site collections, adding and activating solutions, and setting permissions, navigation and master pages. But we still had to add a piece of code to create pages in the Pages library. After a quick search, I came across this blog post by Brendan Newell. It helped me out greatly! It just missed one crucial element: setting the page title. So I added this to his script. They were minor changes, so all credits go to Brendan.

PowerShell Script:

# Read in list of pages from XML
[xml]$pagesXML = Get-Content "pages.xml"
if ($pagesXML -eq $null) { return }

# Get publishing web
$site = New-Object Microsoft.SharePoint.SPSite($server1)
$web = $site.RootWeb
$pWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

# Loop through each page node to extract file name and title
$pagesXML.Pages.Page | ForEach-Object {
    $fileName = [string]$_.FileName
    Write-Host "Creating $fileName"
    $titleName = [string]$_.TitleName
    Write-Host "Creating $titleName"

    # Create blank page
    $newPage = $pWeb.AddPublishingPage()
    $newPage.Update()

    # Update the file name and title to the values specified in the XML
    $newPage.ListItem["BaseName"] = $fileName
    $newPage.ListItem["Title"] = $titleName
    $newPage.ListItem.SystemUpdate()

    # Check in and publish the page
    $newPage.CheckIn("")
    $newPage.ListItem.File.Publish("")
}

# Dispose of the web
$web.Dispose()

XML Structure:

<?xml version="1.0" encoding="utf-8"?>
<Pages>
  <Page>
    <FileName>P1</FileName>
    <TitleName>Page One</TitleName>
  </Page>
  <Page>
    <FileName>P2</FileName>
    <TitleName>Page Two</TitleName>
  </Page>
</Pages>