Wednesday, October 18, 2017

Change DocumentID prefix through PowerShell script

Four and a half years ago, I wrote a post about how to change the DocumentID prefix manually for a single document.

Eventually I realised it's convenient to use the site collection path name as the DocumentID prefix. However, if users change the site collection name, we then have to refresh the DocumentID of all documents.

Here is how to do that through PowerShell, for multiple site collections.

$ver = $host | select version
if ($ver.Version.Major -gt 1)  {$Host.Runspace.ThreadOptions = "ReuseThread"}
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Add-PSSnapin Microsoft.Office.DocumentManagement -ErrorAction SilentlyContinue

Set-StrictMode -Version Latest

# Write-Log -Message 'Log message'
# Write-Log -Message 'Restarting Server.'
# Write-Log -Message 'Folder does not exist.' -Level Error
$Global:LogFile = "E:\DailyBackup\Log\ResetDocumentID." + (Get-Date).ToString("yyyyMMdd-HHmmss") + ".txt"

function Write-Log{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [ValidateSet("Error","Warn","Info","HighLight")]
        [string]$Level = "Info"
    )

    $VerbosePreference = 'Continue'
    #if (!(Test-Path $LogFile)) {
    #    Write-Verbose "Creating $LogFile."
    #    $NewLogFile = New-Item $LogFile -Force -ItemType File
    #}

    $FormattedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

    switch ($Level) {
        'Error' {
            $LevelText = 'ERROR:'
            $MessageColor = [System.ConsoleColor]::Red
        }
        'Warn' {
            $LevelText = 'WARNING:'
            $MessageColor = [System.ConsoleColor]::Yellow
        }
        'Info' {
            $LevelText = 'INFO:'
            $MessageColor = [System.ConsoleColor]::DarkGreen
        }
        'HighLight' {
            $LevelText = 'HIGHLIGHT:'
            $MessageColor = [System.ConsoleColor]::Green
        }
    }
    Write-Host $Message -ForegroundColor $MessageColor

    $MessageContent = "$FormattedDate $LevelText $Message"
    $MessageContent | Out-File -FilePath $Global:LogFile -Append
}

function GetWebAppUrlFromSiteUrl([string]$SiteUrl){
    #Write-Log -Message "GetWebAppUrlFromSiteUrl(), start......SiteUrl=$SiteUrl" -Level HighLight
    $site = Get-SPSite -Identity $SiteUrl
    $WebAppUrl = $site.WebApplication.GetResponseUri([Microsoft.SharePoint.Administration.SPUrlZone]::Default).AbsoluteUri
    $site.Dispose()   # Release the SPSite object to avoid leaking memory
    if ($WebAppUrl.EndsWith("/","CurrentCultureIgnoreCase")){
        $WebAppUrl = $WebAppUrl.Substring(0, $WebAppUrl.Length - 1)
    }
    #Write-Log -Message "GetWebAppUrlFromSiteUrl(), complete. WebAppUrl=$WebAppUrl" -Level HighLight
    return $WebAppUrl
}

function GetSiteNameFromSiteUrl([string]$SiteUrl){
    # Write-Log -Message "GetSiteNameFromSiteUrl(), start......SiteUrl=$SiteUrl"
    if ($SiteUrl.EndsWith("/","CurrentCultureIgnoreCase")){
        $SiteUrl = $SiteUrl.Substring(0, $SiteUrl.Length - 1)
    }
    $iPos = $SiteUrl.LastIndexOf('/')
    $SiteUrl = $SiteUrl.Substring($iPos + 1)

    # Write-Log -Message "GetSiteNameFromSiteUrl(), complete. SiteUrl=$SiteUrl"
    return $SiteUrl
}

function StartTimerJob([string]$WebAppUrl, [string]$JobName){
    Write-Log -Message "StartTimerJob(), start......WebAppUrl=$WebAppUrl, JobName=$JobName"
    $job = Get-SPTimerJob -WebApplication $WebAppUrl $JobName
    if (!$job){
        Write-Log -Message "StartTimerJob(), No valid timer job found, WebAppUrl=$WebAppUrl, JobName=$JobName" -Level Error
        return
    }
    $startTime = $job.LastRunTime

    Start-SPTimerJob $job
    while ($startTime -eq $job.LastRunTime){
        Write-Host -NoNewLine "."
        Start-Sleep -Seconds 2
    }
    Write-Log "Timer Job '$JobName' has completed on $WebAppUrl."
}

function ResetDocumentID([string]$startSPSiteUrl){
    Write-Log -Message "ResetDocumentID(), startSPSiteUrl=$startSPSiteUrl"
    $SiteUrlPrevious = ""
    $SiteUrl = ""
    $WebAppUrl = ""
    $WebAppUrlPrevious = ""

    $rootweb = $null
    $SiteCount = 0
    $i = 0

    $sites = @(Get-SPSite -Limit ALL | ?{$_.ServerRelativeUrl -notmatch "Office_Viewing_Service_Cache" `
        -and $_.Url.StartsWith($startSPSiteUrl, "CurrentCultureIgnoreCase") `
        -and $_.Url -notmatch "SearchCenter" `
        -and $_.Url -notmatch "IPForm" `
        -and $_.Url -notmatch "SPTest" `
        -and $_.Url -notmatch "mysite"})
    $SiteCount = $sites.Count
    if ($SiteCount -eq 0){
        Write-Log -Message "No valid SPSite found, startSPSiteUrl=$startSPSiteUrl" -Level Error
        return
    }
    Write-Log -Message "sites.count=$SiteCount"

    $progressBarTitle = "ResetDocumentID(), Scan SPSites, SiteCount=$SiteCount, startSPSiteUrl=$startSPSiteUrl"
    foreach ($site in $sites){
        $i++
        Write-Progress -Activity $progressBarTitle -PercentComplete (($i/$SiteCount)*100) -Status "Working"

        try{
            $SiteUrl = $site.Url
            Write-Log "ResetDocumentID(), SiteUrl=$SiteUrl"
            if ($site.ReadOnly){
                Write-Log "ResetDocumentID(), Site($SiteUrl) is read-only. Skip." -Level Warn
                continue
            }

            $WebAppUrl = GetWebAppUrlFromSiteUrl $SiteUrl
            if ($WebAppUrl.EndsWith(".local","CurrentCultureIgnoreCase") -eq $false){
                Write-Log -Message "ResetDocumentID(), skip web application: WebAppUrl=$WebAppUrl"
                continue
            }

            $SiteName = GetSiteNameFromSiteUrl $SiteUrl
            Write-Log "ResetDocumentID(), DocumentID=$SiteName"

            $rootweb = $site.RootWeb
            [Microsoft.Office.DocumentManagement.DocumentID]::EnableAssignment($site,$false)   # First disable, then enable DocID assignment
            $rootweb.AllProperties["docid_msft_hier_siteprefix"] = $SiteName   # This is the property holding the Document ID prefix, which we use to ensure uniqueness
            $rootweb.Update()
            [Microsoft.Office.DocumentManagement.DocumentID]::EnableAssignment($site,$true,$true,$true)   # Now we can force all Document IDs to be reissued
        }
        catch [system.exception]{
            $strTmp = [string]::Format("ResetDocumentID(), startSPSiteUrl={0}, SiteUrl={1}, ex.Message={2}", $startSPSiteUrl, $SiteUrl, $Error[0].Exception.Message)
            Write-Log $strTmp -Level Error
            Write-Log $_.Exception -Level Error
        }
        finally{
            if ($rootweb){ $rootweb.Dispose(); $rootweb = $null }
            if ($site){ $site.Dispose() }
        }

        # Start the Document ID timer jobs whenever we move on to a different web application
        if ([string]::IsNullOrEmpty($SiteUrlPrevious)){
            $SiteUrlPrevious = $SiteUrl
            $WebAppUrlPrevious = $WebAppUrl
        }
        if ($WebAppUrl.Equals($WebAppUrlPrevious, [StringComparison]::InvariantCultureIgnoreCase) -eq $false){
            StartTimerJob $WebAppUrl "DocIdEnable"
            StartTimerJob $WebAppUrl "DocIdAssignment"

            $WebAppUrlPrevious = $WebAppUrl
        }
    }

    # Start the timer jobs for the last web application processed
    StartTimerJob $WebAppUrl "DocIdEnable"
    StartTimerJob $WebAppUrl "DocIdAssignment"

    Write-Log -Message "ResetDocumentID(), completed"
}


# $_SiteNameSuffix = '2016DEV'
# $_SiteNameSuffix = '2013DEV'
$_SiteNameSuffix = ''

# $_SiteUrl = ""
$_SiteUrl = "http://team$_SiteNameSuffix.SharePointServer.local/sites/SiteCollectionName"

ResetDocumentID $_SiteUrl

Write-Log -Message "Finished! Press enter key to exit."
Read-Host

Tuesday, October 17, 2017

Shocking change of Office 365 and AAD licensing

Based on this post, for a company with 1000 users, to prevent users from creating Groups, Microsoft will charge AUD 91,700 (around USD 72,000) per year!

So, is this the extra cost of the "cloud platform"? How many more licensing changes of this kind are waiting for us?!

I am speechless now.  :-(


Azure Active Directory pricing

The Price of Office 365 Groups

"I honestly cannot come up with a justification for charging extra for the ability to prevent Groups from being created by every user in your organization."

Monday, October 16, 2017

Some thoughts about Microsoft FLOW

After watching Deep dive: Advanced workflow automation with Microsoft Flow, I have to admit that FLOW is much more powerful than I thought. It can replace SharePoint workflows in most cases!

However, as it is designed for power users, I smell something bad coming.

1. Now I start to understand why "everyone needs to learn coding". Simple coding (or drag-and-drop style software development) allows users to do much more work efficiently.

2. We will get millions of the worst "software programmers" building billions of FLOW modules. These modules may run very slowly, may consume a lot of hardware resources, and almost no one will be able to maintain them. Because these modules run in Azure, clients will pay a much higher fee than normal. (Microsoft will be very happy about that, and we cannot blame Microsoft.)

3. No user will write documentation for the FLOW functionality they build.

4. Fixing/improving those FLOW modules is not easy, and troubleshooting them would be a nightmare.

5. Who is going to test the FLOW modules built by users? A module may accidentally delete a lot of data (which may not be recoverable), or send out thousands of emails.

6. For most FLOW functionality, if a developer can build it in C# in one day, he/she may need three days to do it through "drag and drop". And such functionality is pretty hard to maintain; even minor changes take much more time.

7. Not sure how many security issues it will cause if we allow users to build their own FLOW modules.

8. If Microsoft decides to change/upgrade/obsolete some API/function, who is going to upgrade existing customized FLOW modules?

Conclusion: FLOW is too powerful, so it is not for power users but for developers. Normal power users can use it, but only for very simple functionality, especially when a system other than SharePoint is involved. Only in those cases is FLOW useful to power users and able to improve productivity.

Friday, September 29, 2017

Hashtable, export to and import from CSV file

There are built-in cmdlets, "Import-Csv" and "Export-Csv", to handle this request. But it's not as simple as it looks.

Let's run some test script first.

$csvFile = "E:\test.csv"
$HashTable1 = @{}
$HashTable1.Add("aa", "11")
$HashTable1.Add("bb", "22")
($HashTable1).GetEnumerator() | Sort-Object -Property Value `
| %{new-object psobject -Property @{Value=$_.Value;Name = $_.Name}} `
| Export-Csv $csvFile -NoTypeInformation

$HashTable2 = @{}

$HashTable2 = Import-Csv $csvFile

We can see that we didn't get a HashTable back from the "Import-Csv" command. Instead, we got an Array of objects.

$HashTable2 = @{}
$Array = Import-Csv $csvFile
$Array | %{$HashTable2.Add($_.Name,$_.Value)}

That's how we get the HashTable back.
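To avoid repeating the conversion code, the export and import steps above can be wrapped into a pair of helper functions. This is a minimal sketch; the function names `Export-HashTable` and `Import-HashTable` are my own, not built-in cmdlets:

```powershell
# Export a hashtable to CSV as Name/Value rows, sorted by value
function Export-HashTable([hashtable]$Table, [string]$Path){
    $Table.GetEnumerator() |
        Sort-Object -Property Value |
        ForEach-Object { New-Object psobject -Property @{ Name = $_.Name; Value = $_.Value } } |
        Export-Csv $Path -NoTypeInformation
}

# Import the CSV and rebuild the hashtable from the Name/Value rows
function Import-HashTable([string]$Path){
    $result = @{}
    Import-Csv $Path | ForEach-Object { $result.Add($_.Name, $_.Value) }
    return $result
}

# Usage:
# Export-HashTable $HashTable1 "E:\test.csv"
# $HashTable2 = Import-HashTable "E:\test.csv"
```

Note that all values come back as strings, since CSV has no type information.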

Tuesday, September 19, 2017

PowerShell Runbook to auto start and shut down Azure VM in a resource management group

The last post listed sample code to start and shut down an Azure VM remotely.

Here is the PowerShell runbook script which can be scheduled in Azure.

This is more stable, simpler and easier to manage.

$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID `
  -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint

Write-Output "Connection established."

$ResourceGroupName = 'MyResourceGroup'   # set this to your resource group name
$vmname = 'AZ532-test1'
$VMDetail = Get-AzureRMVM -ResourceGroupName $ResourceGroupName -Name $VmName -Status
$vmPowerstate = $VMDetail.Statuses[1].Code
Write-Verbose "vmPowerstate: $vmPowerstate"

if ($vmPowerstate -like "PowerState/running"){
    write-host "VM '$vmname' is ""$vmPowerstate"". Skip."
}
else{
    write-host "Starting VM '$vmname'"
    Start-AzureRMVM -ResourceGroupName $ResourceGroupName -Name $VmName -Verbose
}

# To shut the VM down instead:
# if ($vmPowerstate -like "PowerState/running"){
#     write-host "Stopping VM '$vmname'"
#     Stop-AzureRMVM -ResourceGroupName $ResourceGroupName -Name $VmName -Verbose -Force
# }
# else{
#     write-host "VM '$vmname' is ""$vmPowerstate"". Skip."
# }

Write-Output "VM $vmname is started."
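To actually schedule the runbook in Azure, you can create an Automation schedule and link the runbook to it from PowerShell. A sketch using the AzureRM.Automation cmdlets of that era; the account name "MyAutomationAccount", resource group "MyRG", and runbook name "StopDevVMs" are hypothetical placeholders:

```powershell
# Create a daily schedule that fires at 7:00 PM in the Automation account
New-AzureRmAutomationSchedule -AutomationAccountName "MyAutomationAccount" `
    -ResourceGroupName "MyRG" -Name "StopVMsEvening" `
    -StartTime (Get-Date "19:00") -DayInterval 1

# Link the runbook to the schedule so it runs automatically
Register-AzureRmAutomationScheduledRunbook -AutomationAccountName "MyAutomationAccount" `
    -ResourceGroupName "MyRG" -RunbookName "StopDevVMs" -ScheduleName "StopVMsEvening"
```

The same thing can also be done through the Azure portal under the runbook's "Schedules" blade.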


Azure classic VM is similar:

$ConnectionAssetName = "AzureClassicRunAsConnection"
Write-Verbose "Get connection asset: $ConnectionAssetName" -Verbose
$Conn = Get-AutomationConnection -Name $ConnectionAssetName
if ($Conn -eq $null){
    throw "Could not retrieve connection asset: $ConnectionAssetName. Make sure that this asset exists in the Automation account."
}

Write-Output "Connection established."

$CertificateAssetName = $Conn.CertificateAssetName
Write-Verbose "Getting the certificate: $CertificateAssetName" -Verbose
$AzureCert = Get-AutomationCertificate -Name $CertificateAssetName
if ($AzureCert -eq $null){
    throw "Could not retrieve certificate asset: $CertificateAssetName. Make sure that this asset exists in the Automation account."
}

Write-Verbose "Authenticating to Azure with certificate." -Verbose
Set-AzureSubscription -SubscriptionName $Conn.SubscriptionName -SubscriptionId $Conn.SubscriptionID -Certificate $AzureCert 
Select-AzureSubscription -SubscriptionId $Conn.SubscriptionID

$vmname = 'hvEF4'
$vm = Get-AzureVM | Where-Object { $_.Name -eq $vmname }
write-host "AzureVM: "
$vm | fl *

# To shut the VM down instead:
# if ($vm.PowerState -eq "Started"){
#     write-host "Stopping VM '$vmname'"
#     $vm | Stop-AzureVM -Force
# }
# else{
#     write-host "VM '$vmname' is ""$($vm.PowerState)"". Skip."
# }

if ($vm.PowerState -eq "Started"){
    write-host "VM '$vmname' is ""$($vm.PowerState)"". Skip."
}
else{
    write-host "Starting VM '$vmname'"
    $vm | Start-AzureVM
}

write-host "done."

Thursday, September 14, 2017

PowerShell script sample code to auto start and shut down Azure VM in a resource management group

There have been quite a lot of changes to the RM VM access API in the last two years. Here is some sample code which works well at the moment (2017-09-14).

Hopefully they can save you some time.

1. Install AzureRM module

script: Install-Module -Name AzureRM

2. Check PowerShell version and AzureRM module version

script: $PSVersionTable.PSVersion

Major  Minor  Build  Revision
-----  -----  -----  --------
5      1      14393  1715

script: (get-module azurerm).Version

Major  Minor  Build  Revision
-----  -----  -----  --------
4      3      1      -1

3. Import AzureRM module

script: Import-Module AzureRM

4. Log in without prompt window

First we need to create a file to store the context information.


$Global:_ContextFilePath = "c:\"
Save-AzureRmContext -Path $Global:_ContextFilePath -Force 

Then we can import the context file to avoid entering the user name and password manually.

Import-AzureRmContext -Path $Global:_ContextFilePath
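The two steps can be combined into a self-healing login block: if the saved context file does not exist yet, log in interactively once and save it for next time. A sketch; the file name "azure-context.json" is a hypothetical placeholder:

```powershell
$Global:_ContextFilePath = "c:\azure-context.json"   # hypothetical path

if (Test-Path $Global:_ContextFilePath){
    # Reuse the saved context, no login prompt
    Import-AzureRmContext -Path $Global:_ContextFilePath
}
else{
    # First run: log in interactively, then save the context for next time
    Login-AzureRmAccount
    Save-AzureRmContext -Path $Global:_ContextFilePath -Force
}
```

Keep in mind the saved context contains tokens, so the file should be protected like a credential.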

5. Get the RM VM

$vmname = 'vm name'
$VMDetail = Get-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VmName -Status | Select-Object -ExpandProperty StatusesText | convertfrom-json
$vmPowerstate = $VMDetail[1].Code

6. Start VM

if ($vmPowerstate -like "PowerState/running"){
    write-host "VM '$vmname' is ""$vmPowerstate"". Skip."
}
else{
    write-host "Starting VM '$vmname'"
    Start-AzureRMVM -ResourceGroupName $ResourceGroupName -Name $VmName -Verbose
}

7. Or, shut it down (deallocate it)

if ($vmPowerstate -like "PowerState/running"){
    write-host "Stopping VM '$vmname'"
    Stop-AzureRMVM -ResourceGroupName $ResourceGroupName -Name $VmName -Verbose -Force
}
else{
    write-host "VM '$vmname' is ""$vmPowerstate"". Skip."
}


PS: I scheduled the script in Windows Task Scheduler to shut down all dev VMs in the evening. That can save a lot of money in case I forget to shut them down manually.
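For reference, the scheduled task itself can also be created from PowerShell (on Windows 8 / Server 2012 and later, which ship the ScheduledTasks module). A sketch; the script path and task name are hypothetical:

```powershell
# Run the shutdown script every day at 7:00 PM
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-ExecutionPolicy Bypass -File C:\Scripts\StopDevVMs.ps1"   # hypothetical path
$trigger = New-ScheduledTaskTrigger -Daily -At 7:00PM
Register-ScheduledTask -TaskName "StopDevVMs" -Action $action -Trigger $trigger
```

Registering the task this way makes it easy to recreate on another machine.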

PS 2:

Below is the script to start or stop classic Azure VM.

Import-Module "C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Azure.psd1"

# Get-AzurePublishSettingsFile

$publishsettings = 'e:\EricFang\Visual Studio Ultimate with MSDN-9-16-2016-credentials.publishsettings'

write-host "AzureSubscription: "
Import-AzurePublishSettingsFile $publishsettings

Select-AzureSubscription -SubscriptionId "YOUR SUBSCRIPTION GUID STRING"

$vmname = 'hvEF4'
$vm = Get-AzureVM | Where-Object { $_.Name -eq $vmname }
write-host "AzureVM: "
$vm | fl *

if ($vm.PowerState -eq "Started"){
    write-host "VM '$vmname' is ""$($vm.PowerState)"". Skip."
}
else{
    write-host "Starting VM '$vmname'"
    $vm | Start-AzureVM
}

# To shut the VM down instead:
# $vm | Stop-AzureVM -Force

write-host "done."

Tuesday, July 4, 2017

Confusion and complaints about Office 365

I am learning Office 365. There are many brilliant ideas in this platform, but also something annoying to me.

Below are my thoughts so far. The list will be kept updated.


1. Microsoft Teams really should be part of "Skype for Business", instead of a separate product.

[update 20170919] Two weeks ago, Microsoft announced that "Skype for Business" is being merged into Microsoft Teams. Great!

2. Documents should not be allowed to be copied from SharePoint Sites to "Teams Files".

It's much better to just create a file link in Microsoft Teams.

3. Outlook Group should be renamed as "Shared Inbox".

The details are described in this post.

4. Yammer should be part of SharePoint site, instead of a separate product.

5. All documents should be forced to store in SharePoint Document Library.

In other tools (apps), the documents should be attached at a "file link".

[update, 20170722]

6. Unified API

For example, developers should be able to connect to different apps with one API command.

[update, 20170726]

7. Add "Fast List" to SharePoint Online

"Fast List" needs to be as fast as SQL table, so all Office 365 items, including conversations, calendar events, etc. can all be stored in SharePoint Online.

[update, 20170731]

8. SPFX solutions need a "site collection" level deployment scope

"For SharePoint Framework solutions, there is only one deployment scope: tenant wide. Once added, and enabled, in the App Catalog, the SharePoint Framework solution will be available to all Site Collections."