Tuesday, October 17, 2017

One way to avoid the “Term update failed because of save conflict” error when creating managed metadata terms in Sharepoint

In one of our projects we used the following PowerShell CSOM code for creating managed metadata navigation terms in Sharepoint Online:

$newTerm = $navTermSet.CreateTerm($web.Title,
    [Microsoft.SharePoint.Client.Publishing.Navigation.NavigationLinkType]::SimpleLink,
    [System.Guid]::NewGuid())
$newTerm.SimpleLinkUrl = $web.ServerRelativeUrl
$termStore.CommitAll()
$ctx.ExecuteQuery()

It worked successfully for many tenants, but for one tenant it gave the following error:

Exception calling "ExecuteQuery" with "0" argument(s): "Term update failed because of save conflict."

The error appeared randomly: for the same sub site the term was sometimes created successfully and sometimes failed with the above error. There were no other concurrent changes, so pending changes were not the reason.

In order to avoid this error we applied the following workaround:

do
{
    try
    {
        $newTerm = $navTermSet.CreateTerm($web.Title,
            [Microsoft.SharePoint.Client.Publishing.Navigation.NavigationLinkType]::SimpleLink,
            [System.Guid]::NewGuid())
        $newTerm.SimpleLinkUrl = $web.ServerRelativeUrl
        $termStore.CommitAll()
        $ctx.ExecuteQuery()
        break
    }
    catch
    {
        Write-Host "Error occurred, try one more time" -foregroundcolor yellow
    }
} while ($true)

That is, instead of a single call to CreateTerm and ExecuteQuery, we call them in a loop until the call succeeds. The log showed that with this approach all navigation terms were eventually created properly: for some sub sites it failed once, for others twice, and for some even three times before the term was created successfully, while for most sub sites there were no errors at all. Hope this engineering approach will help someone :).
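One caveat worth mentioning: the do/while loop above retries forever, so if the save conflict ever turned out to be permanent, the script would hang. Below is a bounded variant, a sketch only, using the same CSOM variables as above; $maxAttempts and the pause between attempts are my own additions, tune them for your environment:

```powershell
# Retry term creation a limited number of times instead of looping forever
$maxAttempts = 5
for ($attempt = 1; $attempt -le $maxAttempts; $attempt++)
{
    try
    {
        $newTerm = $navTermSet.CreateTerm($web.Title,
            [Microsoft.SharePoint.Client.Publishing.Navigation.NavigationLinkType]::SimpleLink,
            [System.Guid]::NewGuid())
        $newTerm.SimpleLinkUrl = $web.ServerRelativeUrl
        $termStore.CommitAll()
        $ctx.ExecuteQuery()
        break
    }
    catch
    {
        Write-Host "Attempt $attempt of $maxAttempts failed, will retry" -foregroundcolor yellow
        # Give up after the last attempt; otherwise pause briefly before retrying
        if ($attempt -eq $maxAttempts) { throw }
        Start-Sleep -Seconds 2
    }
}
```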

Wednesday, September 27, 2017

Camlex has been moved to Github

Some time ago MS announced the Codeplex shutdown in 2017. I personally liked Codeplex: all these years it was used for hosting the Camlex project, and I was quite satisfied with its services and functionality. But time goes on, and MS made this decision, which we have to live with. Regardless of the Codeplex shutdown, Camlex development will continue, and I’m glad to announce that the project has been migrated to Github. The new project home is https://github.com/sadomovalex/camlex. Earlier, when the project was hosted on Codeplex, there were 3 ways to get a ready-to-use .Net assembly:

  • download it from Codeplex
  • install it directly in VS from Nuget (the Camlex.NET.dll package for the server object model and Camlex.Client.dll for the client object model)
  • get latest source code and compile project in VS

After the migration to Github, Nuget will be the primary way of getting the binaries, and of course it will still be possible to get the source code and compile it in VS (the project will remain open source under the same Ms-PL license). Issues and discussions were migrated together with the source code and documentation (discussions were migrated to Github as closed issues with a “Discussion: ” prefix: see Migrate issues and discussions from Codeplex to Github). So let’s continue the Camlex journey in its new home and add more value to it on Github.

Tuesday, September 26, 2017

Migrate issues and discussions from Codeplex to Github

As you probably know, Codeplex will be shut down soon. I used Codeplex for many years for hosting the Camlex project, an open source library for creating dynamic CAML queries for Sharepoint with C# lambda expressions. The migration guide available on Codeplex describes how to move source code to Github, but unfortunately it doesn’t mention how to move issues and discussions. In this post I will share my experience of migrating issues and discussions from Codeplex to Github.

For migrating issues I used the Codeplex-Issues-Importer Python script, which worked quite well: it added issues with a Codeplex label and closed those issues which were closed on Codeplex. But for discussions it was not so straightforward. First of all, in Github there is no such thing as a “discussion” like in Codeplex, so I decided to move them to Github issues with a “Discussion: [Title]” prefix. To perform the migration itself, I made a fork of Codeplex-Issues-Importer and modified it so that it parses Codeplex discussions instead of issues and then saves them into Github as issues. The fork can be found here: https://github.com/sadomovalex/Codeplex-Issues-Importer. The script is not perfect, but it will probably be enough just for keeping the old discussions in the migrated project. The result of the migration can be checked here: https://github.com/sadomovalex/camlex/issues?page=2&q=is%3Aissue+is%3Aclosed.

Monday, September 4, 2017

Sliding session for Sharepoint 2013 with FBA and persistent cookies

A sliding session allows the user to keep using the site without being reauthenticated, as long as the last action was performed within the configured session lifetime. In Sharepoint 2013 FBA the following parameters of the security token service config are used for setting the session lifetime:

  • CookieLifetime
  • FormsTokenLifeTime
  • LogonTokenCacheExpirationWindow
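
These parameters live on the farm’s security token service config and can be read or changed from the SharePoint management shell. A minimal sketch (the 5-day value below is only an illustration, not a recommendation):

```powershell
# Read the current token lifetime settings
$sts = Get-SPSecurityTokenServiceConfig
$sts.CookieLifetime
$sts.FormsTokenLifetime
$sts.LogonTokenCacheExpirationWindow

# Example: extend the cookie lifetime and apply the change
$sts.CookieLifetime = New-TimeSpan -Days 5
$sts.Update()
```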

They are well described in the article SharePoint 2013 authentication lifetime settings, so I won’t repeat that here. The problem is that when you use persistent cookies (i.e. those stored on the client’s side) only CookieLifetime is actually used (to be more precise, FormsTokenLifeTime is used for setting the initial ValidTo value of the session security token). On top of that, sliding sessions don’t work by default: regardless of whether the user performed actions on the site, he will be logged out once the cookies expire. Persistent cookies can be set, for example, when the user checks the “Remember Me” checkbox on the login page:

   1: private bool AuthenticateFormsUser(Uri context, string username, string pwd,
   2:     bool rememberMe)
   3: {
   4:     if (string.IsNullOrEmpty(username) || string.IsNullOrEmpty(pwd))
   5:     {
   6:         return false;
   7:     }
   9:     try
  10:     {
  11:         var formsAuthOption = SPFormsAuthenticationOption.None;
  12:         var tokenType = SPSessionTokenWriteType.WriteSessionCookie;
  13:         if (rememberMe)
  14:         {
  15:             formsAuthOption = SPFormsAuthenticationOption.PersistentSignInRequest;
  16:             tokenType = SPSessionTokenWriteType.WritePersistentCookie;
  17:         }
  19:         var authProvider = GetAuthProvider(SPContext.Current.Site);
  20:         var securityToken = SPSecurityContext.SecurityTokenForFormsAuthentication(
  21:             context,
  22:             authProvider.MembershipProvider,
  23:             authProvider.RoleProvider,
  24:             username,
  25:             pwd,
  26:             formsAuthOption);
  28:         var fam = SPFederationAuthenticationModule.Current;
  29:         fam.SetPrincipalAndWriteSessionToken(securityToken, tokenType);
  30:         return true;
  31:     }
  32:     catch (Exception)
  33:     {
  34:         return false;
  35:     }
  36: }

Here on lines 12-16 the code checks whether the rememberMe parameter is true and, if so, uses persistent cookies.

So is it possible to have sliding expiration sessions when persistent cookies are used? The answer is yes, but to achieve that we need a custom HTTP module which renews the token on each request:

   1: public class SlidingSessionModule : IHttpModule
   2: {
   3:     public void Init(HttpApplication context)
   4:     {
   5:         FederatedAuthentication.SessionAuthenticationModule.SessionSecurityTokenReceived +=
   6:             SessionAuthenticationModule_SessionSecurityTokenReceived;
   7:     }
   9:     private void SessionAuthenticationModule_SessionSecurityTokenReceived(object sender,
  10:         SessionSecurityTokenReceivedEventArgs e)
  11:     {
  12:         try
  13:         {
  14:             if (e == null)
  15:             {
  16:                 return;
  17:             }
  18:             var sessionToken = e.SessionToken;
  19:             if (sessionToken == null)
  20:             {
  21:                 return;
  22:             }
  23:             var claimsPrincipal = sessionToken.ClaimsPrincipal;
  24:             if (claimsPrincipal == null)
  25:             {
  26:                 return;
  27:             }
  28:             TimeSpan cookieLifetime = TimeSpan.FromSeconds(0);
  29:             SPSecurity.RunWithElevatedPrivileges(
  30:                 () =>
  31:                     {
  32:                         cookieLifetime = Microsoft.SharePoint.Administration.Claims.
  33:                             SPSecurityTokenServiceManager.Local.CookieLifetime;
  34:                     });
  36:             DateTime utcNow = DateTime.UtcNow;
  37:             DateTime validFrom = utcNow;
  38:             DateTime validTo = utcNow + cookieLifetime;
  39:             var sam = FederatedAuthentication.SessionAuthenticationModule;
  40:             e.SessionToken = sam.CreateSessionSecurityToken(claimsPrincipal,
  41:                 sessionToken.Context, validFrom, validTo, sessionToken.IsPersistent);
  42:             e.ReissueCookie = true;
  43:         }
  44:         catch (Exception x)
  45:         {
  46:             // log
  47:         }
  48:     }
  50:     public void Dispose()
  51:     {
  52:     }
  53: }

In the module we subscribe to the SessionAuthenticationModule.SessionSecurityTokenReceived event (lines 5-6), and in the event handler we renew the token with extended ValidFrom and ValidTo values (lines 36-42), which are calculated from the CookieLifetime property of the security token service config (lines 29-34), so you can still configure the lifetime from PowerShell.

Then we need to install this module by adding the dll to the GAC and the following line to the <modules> section of web.config:

<modules>
  ..
  <add name="SlidingSessionModule"
    type="SlidingSessionModule.SlidingSessionModule, SlidingSessionModule, Version=, Culture=neutral, PublicKeyToken=..." />
</modules>

After that you will have sliding sessions with persistent cookies for Sharepoint FBA.

Thursday, August 24, 2017

Sharing cookies for HttpWebRequest from Sharepoint site with FBA claims authentication

If you need to make a sub request from your Sharepoint site, you can do it like this (in this post we assume that the sub requests go to the same Sharepoint site):

var request = (HttpWebRequest)WebRequest.Create(url);
request.Credentials = CredentialCache.DefaultNetworkCredentials;
var response = (HttpWebResponse)request.GetResponse();

This code works for Windows authentication: if you check SPContext.Current.Web.CurrentUser on the receiver’s side, it will be the same as on the sender’s side. But if the same code runs under an FBA zone, SPContext.Current.Web.CurrentUser will be null on the receiver’s side. In order to force Sharepoint to execute the code under the same user also in the FBA zone, we need to share the cookies:

var request = (HttpWebRequest)WebRequest.Create(url);
request.Credentials = CredentialCache.DefaultNetworkCredentials;

// web is the current SPWeb; cookies are only needed for non-default (FBA) zones
if (HttpContext.Current != null && web.Site.Zone != SPUrlZone.Default)
{
    HttpCookie authCookie = HttpContext.Current.Request.Cookies["FedAuth"];
    if (authCookie != null)
    {
        log("Before send request: set auth cookies");
        request.CookieContainer = new CookieContainer();
        request.CookieContainer.Add(new Cookie("FedAuth", authCookie.Value,
            authCookie.Path, new Uri(url).Host));
    }
}

var response = (HttpWebResponse)request.GetResponse();

In this example we assume that the site works with both Windows and FBA zones and that Windows authentication is used on the Default zone. After that, SPContext.Current.Web.CurrentUser will also be correct on the receiver’s side for the FBA zone.

Unspecified error when creating Domain local groups via DirectoryServices programmatically in .Net

In order to create an AD group programmatically, we can use the System.DirectoryServices .Net assembly. Here is the code which creates a domain global group:

string groupName = "...";
var de = new DirectoryEntry("LDAP://...");
var group = de.Children.Add("CN=" + groupName, "group");
group.Properties["samAccountName"].Value = groupName;
group.CommitChanges();
return true;

If we want to create a Domain local group, we need to set one more property on the created group: groupType. The value of this property should be built as a bitmask from the following values (see the ADS_GROUP_TYPE_ENUM enumeration):

  • ADS_GROUP_TYPE_GLOBAL_GROUP = 0x00000002
  • ADS_GROUP_TYPE_DOMAIN_LOCAL_GROUP = 0x00000004
  • ADS_GROUP_TYPE_UNIVERSAL_GROUP = 0x00000008
  • ADS_GROUP_TYPE_SECURITY_ENABLED = 0x80000000
But if we use C# and try to create the group like this:

string groupName = "...";
bool local = ...;
var de = new DirectoryEntry("LDAP://...");
DirectoryEntry group = de.Children.Add("CN=" + groupName, "group");
group.Properties["samAccountName"].Value = groupName;
group.Properties["groupType"].Value = (local ? 0x00000004 : 0x00000002) | 0x80000000;
group.CommitChanges();

we will get an exception with an Unspecified error message:

Unspecified error
   at System.DirectoryServices.Interop.UnsafeNativeMethods.IAds.PutEx(Int32 lnControlCode, String bstrName, Object vProp)
   at System.DirectoryServices.PropertyValueCollection.set_Value(Object value)

The reason is that by default C# treats the value (local ? 0x00000004 : 0x00000002) | 0x80000000 as a long (the literal 0x80000000 doesn’t fit into int, so the whole expression is promoted) and passes 2147483652 to the groupType property, which is an incorrect value here. In order to avoid this error we need to pass an int value to this property, i.e. in our code we should explicitly cast it to int; in this case the negative value -2147483644 will be passed:

string groupName = "...";
bool local = ...;
var de = new DirectoryEntry("LDAP://...");
DirectoryEntry group = de.Children.Add("CN=" + groupName, "group");
group.Properties["samAccountName"].Value = groupName;
group.Properties["groupType"].Value =
    (int)((local ? 0x00000004 : 0x00000002) | 0x80000000);
group.CommitChanges();

and the group will be created successfully.
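The integer arithmetic itself can be checked without touching AD. A small PowerShell sketch (PowerShell, unlike C#, throws on a narrowing [int] cast of an out-of-range long, so the bytes are reinterpreted via BitConverter instead):

```powershell
# ADS_GROUP_TYPE_DOMAIN_LOCAL_GROUP combined with ADS_GROUP_TYPE_SECURITY_ENABLED
$raw = 0x00000004 -bor 0x80000000
$raw   # 2147483652 - the value C# passes without the explicit cast

# The same 32 bits reinterpreted as a signed int - the value AD expects
[BitConverter]::ToInt32([BitConverter]::GetBytes([uint32]$raw), 0)   # -2147483644
```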

Thursday, August 17, 2017

List all UserCustomActions in Sharepoint site collections and sub sites via PowerShell

User custom actions (see SPSite.UserCustomActions and SPWeb.UserCustomActions) are a powerful mechanism for adding customizations to a Sharepoint site (on-premise or online) via javascript. E.g. in one of the previous posts I showed how to add a custom javascript file to all pages in your site collection without modifying the master page: see Add custom javascript file to all pages in on-premise Sharepoint site collection without modifying masterpage and Add custom javascript file to all pages in Sharepoint Online site collection. Sometimes we need to perform an inventory of all custom actions with script links. Here is a PowerShell script which iterates through all site collections of the provided web application and all their sub sites, and outputs each custom action’s ScriptSrc to a log file:

param(
    [string]$url
)

if (-not $url)
{
    Write-Host "Specify web application url in url parameter" -foregroundcolor red
    return
}

function CheckWeb($web)
{
    Write-Host "Web:" $web.Url
    foreach ($ac in $web.UserCustomActions)
    {
        ("  " + $ac.ScriptSrc) | Out-File "log.txt" -Append
    }

    $web.Webs | ForEach-Object { CheckWeb $_ }
}

function CheckSite($site)
{
    Write-Host "Site collection:" $site.Url
    ("Site collection: " + $site.Url) | Out-File "log.txt" -Append

    foreach ($ac in $site.UserCustomActions)
    {
        ("  " + $ac.ScriptSrc) | Out-File "log.txt" -Append
    }

    CheckWeb $site.RootWeb

    ("---------------------------------------") | Out-File "log.txt" -Append
}

$wa = Get-SPWebApplication $url
$wa.Sites | ForEach-Object { CheckSite $_ }

Here, in order to write the results to the log file, I used the approach described in the following post: Write output to the file in PowerShell. It is also quite straightforward to rewrite this script for Sharepoint Online (see the article mentioned above for Sharepoint Online). Hope it will help someone.
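For Sharepoint Online, a sketch of the site collection part with the client object model (assuming the Microsoft.SharePoint.Client assemblies are loaded and $ctx is an already authenticated ClientContext; these are my assumptions, not code from the original script):

```powershell
# Load the site collection scoped user custom actions via CSOM and log their ScriptSrc
$site = $ctx.Site
$ctx.Load($site.UserCustomActions)
$ctx.ExecuteQuery()

foreach ($ac in $site.UserCustomActions)
{
    ("  " + $ac.ScriptSrc) | Out-File "log.txt" -Append
}
```

The sub site traversal would follow the same pattern, loading $web.Webs and recursing as in the on-premise script above.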