Sitecore + Microsoft Teams: Accelerating remote collaboration across your digital ecosystem

The global shift to remote working poses a wide variety of challenges: delays in content delivery, reduced communication transparency, and gaps in traceability, quality, and efficiency. The Sitecore + Teams integration connects teams in the new virtual workspace with rich instant notifications, actions, and valuable insights.

This integration includes the Sitecore Connect+ Microsoft Teams app and a Sitecore SPE module, which can be installed from the links below,

This blog details the steps involved in setting up the Microsoft Teams App and Sitecore SPE Module for receiving workflow and publish notifications in Microsoft Teams Desktop/Web/Mobile Apps.

This module is built on top of Sitecore PowerShell Extensions (SPE); please make sure SPE is installed before installing this module.


  • Create a Teams Channel in Microsoft Teams with the intended content authors/administrators
  • Update the Channel Notification Settings if required
  • Navigate to the ‘Apps’ tab of Microsoft Teams
  • Search for the ‘Sitecore Connect+’ app and click ‘Add to a Team’
  • Select the new Teams Channel and click ‘Set up a Connector’
  • Copy the webhook URL from the Teams Configuration page that is launched
  • Click the ‘Save’ button to set up the connector in the Teams Channel
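Once the webhook URL is copied, it can be smoke-tested independently of Sitecore. The sketch below (the URL is a placeholder) posts a simple MessageCard — the JSON format accepted by Teams incoming webhooks — to the channel using PowerShell:

```powershell
# Hypothetical smoke test for the Teams incoming webhook; the URL is a placeholder.
$webhookUrl = "https://outlook.office.com/webhook/your-connector-id"

# Teams connector webhooks accept the MessageCard JSON format.
$payload = @{
    "@type"    = "MessageCard"
    "@context" = "https://schema.org/extensions"
    "title"    = "Webhook smoke test"
    "text"     = "If you can read this in the channel, the connector is wired up."
} | ConvertTo-Json

Invoke-RestMethod -Uri $webhookUrl -Method Post -ContentType 'application/json' -Body $payload
```

If the card appears in the channel, the webhook URL is valid and ready to be registered in the Sitecore module.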


  • Install one of the Sitecore packages mentioned above in your Sitecore instance, depending on your Sitecore version
  • Alternatively, the Docker Asset Image can be used


  • From the Sitecore Launchpad, select the ‘Microsoft Teams Integration’ tile to launch the Sitecore module configuration page
  • Click ‘Add New Teams Endpoint’ and enter the webhook URL copied from the Teams Configuration page
  • Click the ‘Test’ link to ensure that the connector is set up successfully. A TEST notification will be received in the Teams Channel.
  • From the Workflow tab, select the Workflow commands for which you would like to receive notifications in Teams
  • Select the content locations for which you would like to receive a notification in Teams when the item (or any of its subitems), or at least one of their referrers, is published
  • Click the ‘Create’ button to complete the notification settings configuration for the Teams Channel


  • Create a PageSpeed Insights API key using the link below and add it in the Settings tab to receive insights on Performance, Accessibility, SEO, etc.
  • Update the Sitemap or Sitemap Index URL in the Settings tab to enable the ‘Request Google Indexing’ feature in publish notifications
  • Select your preferred content editors (e.g., Horizon, Content Editor) in the Settings tab to render only the required actions in the notifications
  • Update the Background Image, Button Color, or Text Color in the Appearance tab to match your branding


  • Use profile pictures that are 48×48 pixels in the Sitecore User Manager
  • Ensure the TargetHostName and LanguageEmbedding properties are set for your Sitecore sites in the site config
  • Create/update the default notification templates if required at the below path,
    /sitecore/system/Modules/PowerShell/Script Library/Sitecore + Microsoft Teams Integration/Message Templates
  • Due to certain limitations, the Teams mobile app might not display some features
  • Performance scores may vary slightly between runs (server response times for individual requests won’t be exactly the same every time)

You might encounter the below ‘Stack Empty’ issue while configuring the notifications in Sitecore. If so, try again after some time, or remove the conflicting handlers mentioned in the article below to resolve the issue,

As a best practice, create individual endpoints/channels for different teams with only the intended workflow commands/publish content locations to reduce noise and improve work efficiency.

The source code for this module is available on GitHub. The module is built for the Sitecore community, doesn’t require any license, and doesn’t collect any of your information. Please check it out and let me know if you have any feedback, issues, or feature requests here. Thank you for using this module!

Happy Remote Working!

Improving SOLR Index Resilience: Preserving previously indexed data in SOLR during unexpected Indexing failures

While search platforms help sites run faster, they are also prone to stability issues; the likelihood of failure grows with the number of dependencies and customizations involved.

By default, when an exception is encountered while indexing a computed index field, the corresponding field is excluded from the indexed document, leading to invalid results on the front end. With a custom crawler, whenever an exception is encountered and not thrown, the entire index is cleared by default. Hence it is essential to improve SOLR index resiliency against these kinds of failures for a consistent user experience. This blog explains an approach to retain the previously indexed data in SOLR when the indexing operation fails or experiences an issue.

Prerequisites: Ensure a rebuild core is set up for the SOLR index to support swapping of indexes during the rebuild,

High-Level Approach:


    • Update the ‘StopOnCrawlFieldError’ setting to true
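In config-patch terms, this could look like the sketch below. Note that the fully qualified setting name here is an assumption and may differ by Sitecore version; verify it against your instance’s showconfig output:

```xml
<!-- Sketch of a config patch; verify the exact setting name for your Sitecore version. -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="ContentSearch.Solr.StopOnCrawlFieldError">
        <patch:attribute name="value">true</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```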

    • Add appropriate validations in the computed index field to ensure that the right data in the expected format is received from dependencies/external services; otherwise throw an exception. Catch any exceptions encountered and stop indexing as indicated below.

    • Create a new class for SearchIndex inheriting SwitchOnRebuildSolrSearchIndex. Resume indexing from the PerformRebuild method after the current rebuild is complete, so that the IndexingCustodian can execute the next queued job.

    • Change the type attribute of the index node in the index’s config to refer to the above created class.
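For instance, assuming the custom class is named ResilientSwitchOnRebuildSolrSearchIndex in a MyProject assembly (both names are made up for illustration), the index node would be patched along these lines:

```xml
<!-- Hypothetical example; class and assembly names are placeholders. -->
<index id="sitecore_web_index"
       type="MyProject.Search.ResilientSwitchOnRebuildSolrSearchIndex, MyProject">
  <!-- existing param and configuration children remain unchanged -->
</index>
```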


    • Sitecore handles exceptions occurring within custom crawler code (created by inheriting the FlatDataCrawler class) and stops indexing/swapping automatically. Hence it is important that exceptions are not swallowed within custom code; if an exception is caught for any reason, it must be rethrown so that Sitecore can stop indexing.
    • To stop indexing/swapping for any error that may happen while Sitecore is processing the data to prepare index documents, the stopOnError property must be set to true within the crawler config node,
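A sketch of such a crawler node follows; the crawler type is a placeholder, and the exact element name for the property should be confirmed against your crawler implementation:

```xml
<!-- Sketch: enabling stopOnError on a custom crawler node (crawler type is a placeholder). -->
<locations hint="list:AddCrawler">
  <crawler type="MyProject.Search.MyFlatDataCrawler, MyProject">
    <stopOnError>true</stopOnError>
  </crawler>
</locations>
```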

Now that the SOLR index is more resilient towards failures, it is vital to set up appropriate monitoring & alerting to get notified of the failures happening in the rebuild core to prevent stale results from being displayed on the site.

Happy Searching!

Debugging Sitecore dlls made easy with In-built Visual Studio Decompiler & Symbol Generator

Quite often, debugging Sitecore assemblies becomes essential for developers investigating an exception or unintended behavior within Sitecore assemblies (or any third-party managed code). While there are a few options currently available to debug Sitecore assemblies with tools like dotPeek, dnSpy, etc., it takes some effort to install and configure them for debugging.

Visual Studio, by partnering with ILSpy, has now made this easy by bundling the capability into Visual Studio 2019. This works great for both Docker-based and non-Docker-based implementations.

Visual Studio 2019 version 16.10.0 is required to use this feature; you may upgrade to 16.10.0 using the regular Visual Studio Installer.
Though this feature was introduced in Visual Studio 2019 version 16.5, decompiling Sitecore assemblies was failing with the below ‘Unable to decompile the module.’ issue, which was addressed in 16.10.0,


  • Navigate to Debug -> Options and uncheck the ‘Enable Just My Code’ option
  • Start debugging the application
  • Once the debugger hits a breakpoint, launch the Modules window by navigating to Debug -> Windows -> Modules (or Ctrl+Alt+U)
  • Identify the module/dll (e.g., Sitecore.Xdb.MarketingAutomation.Tracking.dll) that needs to be debugged (you can also right-click the intended function and navigate to its definition (metadata) to identify the related Sitecore dll)
  • Right-click the identified module/dll and select the ‘Decompile Source to Symbol File’ option. Decompilation may take 1–3 minutes depending on the size of the assembly.
  • Once the symbol file generation is complete, add a function breakpoint by navigating to Debug -> New Breakpoint -> Function Breakpoint (or Ctrl+K, B) and specify the Sitecore method name along with its namespace (e.g., Sitecore.Xdb.MarketingAutomation.Tracking.Extensions.ContactExtensions.GetPlanEnrollmentCache)
  • Press Continue or F5; you will now be able to step through the Sitecore or third-party code!

Decompiling Sitecore assemblies using this feature is currently possible only when the debugger is in break mode and the application is paused. Visual Studio enters break mode when it hits a breakpoint or an exception. Alternatively, Visual Studio can be forced to enter break mode by selecting Debug -> Break All after attaching the specific IIS process. There are a few known limitations, which are covered here.

For remote debugging, you will still need the Remote Debugger application in the target environment.

Happy Debugging!

Chrome Extension: Simplifies navigation from website pages (CD) to their associated content in Sitecore (CM)

This extension enables content authors to open associated content in the Sitecore Experience/Content/Horizon editors (Content Management) straight from the website (Content Delivery) in a single click.

The extension can be installed from the below link,
Sitecore Edit Assistant

To launch the associated Experience Editor, Content Editor, or Horizon links, right-click the intended website page and select ‘Sitecore Edit Assistant’ to choose your preferred editing option as shown below. You will be asked to provide the authoring/CM host URL, which will be stored in your local browser storage for subsequent requests.

IMPORTANT: For multi-site implementations, you may experience the below issues if the site does not auto-resolve as expected,

  • ‘Edit in Content Editor’ might redirect to the Sitecore login page even if the user is already authenticated.
  • ‘Edit in Experience Editor’ might redirect to a 404 Page Not Found error.

To mitigate this issue, right-click and select the ‘Additional Settings (Multi-Site)’ option to specify the associated site’s name as defined in your site config (for a non-SXA site) or in the /sitecore/content/<tenant>/<site>/Settings/Site Grouping/<site> item (for an SXA site). Alternatively, if you do not have access to the site configuration, you can preview a page item from the Sitecore Content Editor and obtain the site name from the ‘sc_site’ query string of the preview URL as shown below,

NOTE: ‘Edit in Content Editor’ can find the associated item only when you are logged in to Sitecore. If you’re not logged in, it might redirect you to the Sitecore login page; try the ‘Edit in Content Editor’ button again after logging in.

This extension also includes a few shortcuts to accelerate the authoring process,
• Ctrl+Shift+X – Edit In Experience Editor
• Ctrl+Shift+E – Edit In Content Editor
• Ctrl+Shift+H – Edit in Horizon
• Ctrl+Shift+I – View Page Insights
The above default shortcuts can be modified from chrome://extensions/shortcuts if required.

The complete source code for this extension is available on GitHub.
Help to improve this extension by sharing feature requests or by reporting any bug here.

Happy Authoring!

Launching Published Content Delivery Website URL for Page Items straight from Sitecore Content Editor

An efficient content authoring experience is crucial to the success of any Sitecore implementation. Accelerating the authoring process by automating parts of the authoring and publishing workflows improves efficiency and helps content authors realize its full value.

It is quite common for a content author to switch between the CMS and the website to view or update content. Bridging the gap between CM and CD allows seamless navigation and improves productivity.

Sitecore PowerShell Extensions makes this much easier with its Content Editor integration point and the Invoke-JavaScript cmdlet. This blog covers the steps involved in setting up this Content Editor button.

A PowerShell module (e.g., ‘Published Page Viewer’) can be created under /sitecore/system/Modules/PowerShell/Script Library using the ‘Module Wizard’ insert option,

Ensure the ‘Content Editor’ integration point is selected while creating the module,

Remove the items under the Ribbon item except ‘Publish’. Build the following tree structure under the ‘Published Page Viewer’ module item based on the templates/insert options indicated below,

  • Content Editor
    • Ribbon
      • Publish (PowerShell Script Library)
        • Publish (PowerShell Script Library)
          • View Published Page (PowerShell Script)


The PowerShell script below should be copied into the ‘Script body’ field of the ‘PowerShell Script’ item created above,
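As a minimal sketch of the idea (not the full script), the item’s absolute URL can be resolved with LinkManager and opened via SPE’s Invoke-JavaScript cmdlet. Note that the UrlOptions API shown here varies between Sitecore versions and should be adjusted for yours:

```powershell
# Minimal sketch of a 'View Published Page' script; adjust for your Sitecore version.
$item = Get-Item .

# Build an absolute URL so the link points at the CD host (targetHostName).
$options = [Sitecore.Links.UrlOptions]::DefaultOptions
$options.AlwaysIncludeServerUrl = $true
$url = [Sitecore.Links.LinkManager]::GetItemUrl($item, $options)

# Open the published page in a new browser tab from the Content Editor.
Invoke-JavaScript "window.open('$url', '_blank');"
```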

IMPORTANT: Ensure that the targetHostName attribute of the site(s) in the site config (or the Site Definition item in the case of SXA) holds the CD domain, for this script to work as expected. If targetHostName doesn’t hold the CD domain, the CD domain can still be hard-coded in this script, though that is not recommended. The script might need to be adjusted if the default site resolution flow has been customized in the implementation.
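For reference, a site definition patch with targetHostName pointing at the CD domain might look like the sketch below (the site name and host names are placeholders):

```xml
<!-- Sketch of a site definition patch; site name and host names are placeholders. -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <sites>
      <site name="website">
        <patch:attribute name="hostName">cm.example.com</patch:attribute>
        <patch:attribute name="targetHostName">www.example.com</patch:attribute>
      </site>
    </sites>
  </sitecore>
</configuration>
```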

Update the ‘Show Rule’ field in the Script Item to add ‘where the item has a layout’ rule, to ensure that the ‘View Published Page’ button appears only for page items.

Navigate to ‘PowerShell ISE’ from Sitecore LaunchPad. Select the ‘Settings’ tab and choose ‘Sync Library with Content Editor Ribbon’ from ‘Rebuild All’,

‘View Published Page’ option should now be available within the ‘Publish’ Chunk of the ‘Publish’ tab, using which Content Authors can navigate to respective CD URLs directly.

Happy Authoring!

Automating Sitemap Submissions to Google & Bing using Sitecore Powershell Extensions

Keeping search engines apprised of content updates (additions/modifications/removals) is crucial for businesses to amplify user acquisition and improve user experience. Since Google’s scheduled recrawl may take weeks to scan certain sites again, it is recommended to notify Google of essential content updates early to drive the needed traffic and gain the maximum possible value.

NOTE: If the site holds short-lived content like live events, jobs, etc., you will also be able to benefit from Google Indexing API. This blog describes how to integrate Google Indexing API with Sitecore Publish.

With Sitecore PowerShell Extensions, sitemaps can be submitted to the search engines’ ping services quickly and flexibly.

A PowerShell module (e.g., ‘SEO’) can be created under /sitecore/system/Modules/PowerShell/Script Library using the ‘Module Wizard’ insert option,

Ensure the ‘Content Editor’, ‘Tasks’, and ‘Shared Functions’ integration points are selected while creating the module,

Remove the items under the Ribbon item. Build the following tree structure under the SEO module item based on the templates/insert options indicated below,

  • Content Editor
    • Ribbon
      • SEO (PowerShell Script Library)
        • Sitemap (PowerShell Script Library)
          • Submit (PowerShell Script)
  • Functions
    • Invoke-PingService (PowerShell Script)
    • Submit-Sitemap (PowerShell Script)
  • Tasks
    • Submit Sitemap (PowerShell Script)


The PowerShell scripts below should be copied into the ‘Script body’ field of the ‘PowerShell Script’ items created above,

Be sure to update the live site’s sitemap URL(s) within the Submit-Sitemap function.
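A rough sketch of what the Invoke-PingService and Submit-Sitemap helpers might contain follows. The function names mirror the module tree above, but the body is an assumption, and the sitemap URL is a placeholder; the ping endpoints shown were the ones in common use at the time of writing:

```powershell
# Sketch of the ping helpers; the sitemap URL is a placeholder.
function Invoke-PingService {
    param([string]$PingUrl)
    try {
        $response = Invoke-WebRequest -Uri $PingUrl -UseBasicParsing
        Write-Log "Sitemap ping succeeded: $PingUrl (HTTP $($response.StatusCode))"
    }
    catch {
        Write-Log "Sitemap ping failed: $PingUrl - $($_.Exception.Message)"
    }
}

function Submit-Sitemap {
    # URL-encode the sitemap location before appending it to the ping endpoints.
    $sitemap = [System.Uri]::EscapeDataString("https://www.example.com/sitemap.xml")
    Invoke-PingService "https://www.google.com/ping?sitemap=$sitemap"
    Invoke-PingService "https://www.bing.com/ping?sitemap=$sitemap"
}
```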
Navigate to ‘PowerShell ISE’ from Sitecore LaunchPad. Select the ‘Settings’ tab and choose ‘Sync Library with Content Editor Ribbon’ from ‘Rebuild All’,

‘Submit’ option should now be available within the ‘Sitemap’ Chunk of the new ‘SEO’ tab, using which Content Authors can make on-demand Sitemap submissions to Google, Bing, Yahoo, and DuckDuckGo Search Engines.

A PowerShell Scripted Task can be created under /sitecore/system/Tasks/Schedules from the insert options to automate submissions at a scheduled frequency,

The schedule frequency can be anywhere from a few hours to a few days, depending on how often content updates happen and how frequently Google crawls the site (which can be identified from the Google Search Console). Ideally, the Google bot crawl rate shouldn’t affect site performance; if needed, the crawl rate can be optimized per this documentation.

Bing’s index powers the Yahoo and DuckDuckGo search engines, hence they do not require dedicated sitemap submissions.

Ensure that the lastmod field for the URL(s) is up to date in the sitemap file, as Google uses this field to determine whether a URL has been modified and requires crawling. It may take a few minutes to a few hours for the search engine bots to crawl the site once the request is submitted. Once a search engine starts crawling the sitemap, the ‘Last read’ value is updated with the current date in the Google Search Console,

Happy Crawling!

Request Google to crawl URLs on Sitecore Publish using Google Indexing API

Many sites hold short-lived content such as events and job postings. While reaching the intended audience for short-lived content is challenging, removing expired content from search engines is also vital for user engagement. This can be solved by bridging the Google indexing mechanism with the Sitecore publish mechanism using the Google Indexing API, which empowers businesses to gain maximum value by reaching the right users at the right time.

IMPORTANT: The Google Indexing API currently allows automating Google indexing only for short-lived pages such as job postings or live events.


Create a Google API project using this Setup Tool
Navigate to API Dashboard of the newly created project, and select ‘ENABLE APIS AND SERVICES’

Search for ‘Indexing API’ and Enable the same for the project

Navigate to Credentials Tab, and create credentials for the project

Navigate to Credentials Tab, and select ‘Manage Service Accounts’

Select ‘CREATE SERVICE ACCOUNT’ button to create a new Service Account which will be used for sending indexing requests to Google,

Select ‘Actions’ -> ‘Manage Keys’ to create a new JSON API key.

Store the downloaded JSON file safely; it is required to send indexing requests to Google

Navigate to the Google Search Console and then to the respective property. Select ‘Settings’ -> ‘ADD USER’ and add the service account (created earlier).
Select the ‘Actions’ button of any existing owner account and select ‘Manage Property Owners’ to add the service account as an owner of the Google Search Console property (only verified owner accounts can initiate indexing requests to Google)


This integration requires the Google.Apis.Indexing.v3 NuGet package, which needs to be added to the project. Depending on the Sitecore version, you may also need to update the ‘oldVersion’ attribute of the ‘bindingRedirect’ configured for ‘Newtonsoft.Json’ in web.config, as the Google API library looks for a specific Newtonsoft.Json version.

Pages that are created/updated during publish or workflow approval operations can be captured by adding a custom processor within the publish pipeline and sent to Google as below,

An event handler for the item:deleting event can be added to capture deleted page links, which are then sent to Google as below,

The above processor and event handler depend on IndexingAPIHelper.cs and ItemExtensions.cs, which need to be added to the solution.

Copy the JSON file downloaded during the setup process to the website root folder and update the file name accordingly in the GetGoogleIndexingAPIClientService method of the IndexingAPIHelper class

The configuration is now complete! Indexing Requests for added/updated/deleted content will be sent to Google upon publishing. Ensure that the respective pages follow Google Structured Data Standards (JobPosting, BroadcastEvent). 

Indexing API requests can be monitored from Indexing API Metrics Tab

Please note that the default quota for indexing requests is 200 per day; you may want to request a higher quota by following the steps described here. Quota usage can be viewed from the Indexing API Quota tab.

The source code is available on GitHub. Please do share your feedback below.

Happy Indexing!

In-Sitecore Alerts & Push Notifications: Effectively communicating Maintenance Activities to Authors & Marketers

Building a strong DevContentOps within the organization helps to keep the productivity of all teams at maximum. This module eliminates friction and enables seamless collaboration between Authors, Marketers, Developers, and Operations.

The module can be downloaded from the below links,
Sitecore Maintenance Notification v1.0 – Sitecore Package
sitecore-maintenance-notification – Docker Asset Image

Integrating the above image into the CM/MSSQL containers requires a few changes to .env, docker-compose/override, and the Dockerfiles of the CM and MSSQL containers, as indicated in this screenshot (a build is required for the changes to take effect). Alternatively, Sitecore packages can also be installed in Docker as described here.

Key Benefits:
• Optimized delivery of maintenance alerts through in-Sitecore alerts and push notifications (works even if the browser is not open) with appropriate auto-expiration
• Introduces Opt-in/Opt-out flexibility for Maintenance Notifications from the Sitecore Control Panel.
• Displays a Maintenance Page during maintenance to avoid confusion.
• Displays Maintenance Alerts in the user’s local timezone.
• Enables smooth handling of any Schedule Changes and Cancellations.
• Sends reminders before the outage and a Completion Message immediately after warm-up to keep the content freeze to a bare minimum.
• Permits modifying Scheduled Alert messages, Reminder & Completion messages, Reminder Duration, etc. quickly from Sitecore

This module is built on top of Sitecore PowerShell Extensions (SPE); please ensure SPE (v5.0+) is installed.

After installing the module, navigate to ‘PowerShell ISE’ from Sitecore LaunchPad. Select the ‘Settings’ tab and choose ‘Sync Library with Control Panel’ from ‘Rebuild All’,

The below changes need to be introduced in the web.config of the CM instance. As a best practice, this transform file can be added to the solution. If there is an existing CM web.config transform, the below config can be appended to it.

  <location path="sitecore modules/Maintenance Notification/js/serviceworker.js">
    <system.webServer>
      <httpProtocol>
        <customHeaders>
          <add name="Service-Worker-Allowed" value="/" />
        </customHeaders>
      </httpProtocol>
    </system.webServer>
  </location>

The module comes with default VAPID keys, but it is highly recommended to update the defaults for security reasons. New VAPID keys can be created using the web-push npm package. The below commands can be executed to generate a new public/private VAPID key pair,

npm install web-push -g
web-push generate-vapid-keys

The public key needs to be updated in the /sitecore/system/Modules/PowerShell/Script Library/Maintenance Notification/Notification Settings item. The private key needs to be updated in the DevOps/Windows PowerShell scripts (described in the next section).

The module includes a default Sitecore API key (works for 9.1+), but it is necessary to clone/create a new ‘OData Item API Key’ item (with the same values) under /sitecore/system/Settings/Services/API Keys of the master database for v9.1+ (make sure to publish it to web). For v9.0, the item needs to be created under /sitecore/system/Settings/Services/API Keys of the core database.

For the impersonation user, it is recommended to create a new user (e.g., sitecore\MaintenanceNotification) with read-only access to the items under /sitecore/system/Modules/PowerShell/Script Library/Maintenance Notification. The item ID of the newly created ‘OData Item API Key’ item will need to be updated in the DevOps/Windows PowerShell scripts (described in the next section).

Content authors and marketers simply have to select ‘Subscribe to Scheduled Maintenance Notifications’ in the ‘Preferences/My Settings’ section of the Control Panel, from the normal mode of their browsers (not incognito/InPrivate), to receive maintenance notifications

The DevOps team has two options for sending notifications during deployments.
DevOps Integration (Automated):
The module comes with two PowerShell scripts,

The scripts require four variables. Since $sitecoreHostUrl, $vapidPrivateKey, and $sitecoreAPIKey will be consistent across releases, they can be created as pipeline variables. The VAPID private key and Sitecore API key item ID obtained in the above section need to be assigned to $vapidPrivateKey and $sitecoreAPIKey. $sitecoreHostUrl will be the CM domain URL. $maintenanceStartDateTime will need to be supplied as input for every release (format: yyyy/MM/dd HH:mm EST, e.g., 2021/01/18 10:00 EST). The script can be tweaked to accept the deployment start date/time in another timezone if needed.

This script works with all DevOps tools that support PowerShell tasks. The initial demo uses an Azure DevOps self-hosted agent to deploy to a Windows machine.

Windows PowerShell (Manual):
The module includes a Windows PowerShell script (SendMaintenanceNotification(Optimized for Manual Notification).ps1), which can be used for sending/clearing notifications anytime without using a DevOps tool.
The script requires the deployment start date/time as input for every release, to send notifications accordingly. The Sitecore CM domain URL, VAPID private key, and Sitecore API key item ID (created in the above section) need to be updated directly in this script for the $sitecoreHostUrl, $vapidPrivateKey, and $sitecoreAPIKey variables.
This script will automatically send scheduled alerts, reminders, and completion messages based on the input date/time to all subscribed users. The script sends notifications based on the availability of the site, so do not close the PowerShell window until the end of the maintenance; otherwise, reminders and completion messages might be skipped.

The module configuration is now complete!

This module leverages the following popular libraries/modules,

  • Service Worker – installed in the browser when the Subscribe button is clicked, for receiving notifications. Service workers are supported in recent versions of all major browsers
  • web-push npm package – used for sending optimized push notifications.
  • idb-keyval library served via cdnjs – a tiny JS library imported by the service worker into users’ browsers for easily storing/retrieving values in/from the browser’s IndexedDB (helps avoid loss of data upon browser/system restart)

This module also relies on the below Sitecore services to keep the details centralized and editable from Sitecore without impacting security,

  • Sitecore ItemService – for fetching Notification Settings and storing User Subscription information received from Service Worker
  • Sitecore OData Item Service – for reading the Subscriptions from Sitecore during DevOps/Windows PowerShell script execution


  • In the event of a maintenance cancellation, running the manual PowerShell script without specifying any input will automatically clear notifications for all subscribed users
  • If the maintenance time or details change, sending the updated time or details through the Windows/DevOps PowerShell script will overwrite the previously sent details for all users
  • A user might have removed the service worker while clearing browser data. In such cases, the PowerShell script will throw a subscription-expired error with the failed user’s subscription endpoint URL. This won’t affect notifications for other users, but it is recommended to manually remove the respective subscription JSON (along with the date) from the /sitecore/system/Modules/PowerShell/Script Library/Maintenance Notification/Push Subscriptions item.
  • If authors/marketers need more lead time, adding agentless jobs with Manual Intervention or Delay tasks helps communicate well in advance and also frees up agents for other tasks during the wait time. Alternatively, the manual approach can be used just for the initial communication.
  • Service workers require localhost or HTTPS. If the implementation doesn’t meet this requirement, the Invoke-AddWebFeatureSSLTask PowerShell command might help to get a quick self-signed certificate.
  • Notification Settings can be updated if/as needed in /sitecore/system/Modules/PowerShell/Script Library/Maintenance Notification/Notification Settings,

The source code for this module is available on GitHub.


Design Considerations and Approaches for Scheduling Recurring Tasks/Workflows

Being one of the key initial steps in the automation journey of an implementation, scheduling tasks is not necessarily easy or straightforward. Web applications have plenty of options to achieve this (e.g., Sitecore Scheduler, Windows Task Scheduler, Azure Logic Apps, container CronJobs, Coveo Push API, Hangfire, etc.), but each option comes with pros and cons based on the requirements. There are several basic and advanced considerations that need to be thought through before designing recurring scheduled tasks/workflows,

Changing demands/Extensibility – Certain jobs might require dynamic demands on flow or frequency. The approach needs to be flexible to accommodate the changing demands. Azure Logic Apps will be extremely useful in such cases, considering the extensive configuration options.
Review/Approval – Certain jobs might need the job administrator’s intervention to complete, and the selected approach needs to accommodate this review/approval.
Logging for tracing issues – It is critical to capture the web job’s actions and errors, as this will be helpful in troubleshooting issues. Log retention needs to be defined up front, and the log storage size needs to be validated during the design phase.
Alerts on failed jobs – Early detection of issues is a key consideration for any kind of job; appropriate alerts on failure over a preferred communication channel such as email, Slack, or Teams are highly essential.
Reporting – Based on the criticality of the job, the job administrator needs to be informed of the job’s success every time or on a daily/weekly basis.
Caching – In scenarios where a recurring job interacts with certain processed data on a regular basis, the data can be cached so that the scheduled task need not fetch/process it every time. Azure APIM and Logic Apps come in handy with extensive caching options. Sitecore CustomCache can also be leveraged.
Triggers & Execution Flexibility – To understand the need for any on-demand execution/scheduling, the triggers for the job need to be analyzed. This will help determine the best approach to enable the intended job administrator (e.g., content author, infrastructure admin) to configure/schedule the job on demand.
Retry on Failures – Based on the nature/frequency of the job, automating 2–3 retry attempts could be beneficial before alerting a failure.
Manual/Automated cancellation – It is common that certain jobs need to be paused/cancelled for a particular period based on internal/external factors. Job administrator needs to be provided with expected permissions for handling such scenarios.
Infrastructure – Job Hosting Platform must be capable of handling the unforeseen load or else it could even result in a site outage.
Concurrency Constraints – Concurrency constraints like number of simultaneous jobs or daily allowed jobs for a user or type of job, need to be pre-defined
Documentation – Documenting the job procedure using Flowcharts/UMLs/Storyboard will not only assist job administrators but will also be supportive during maintenance/troubleshooting
Storage in case of import jobs – Appropriate storage needs to be defined for file-based web jobs for storing the artifacts involved (Eg: Azure Blob Storage, AWS S3 Bucket, etc. )
Need for running jobs on Holidays – Certain jobs need not run outside of business hours or during holidays due to unavailability of data. It is essential to capture these scenarios and schedule accordingly as it might save some bandwidth for the infrastructure.
Permissions – Besides providing permissions to job administrators for handling scheduled task, it is essential to restrict the access for unauthorized people to control/update the job.
Frequency of the Job – Almost all of the available options including Windows Task Scheduler allows scheduling in seconds. Frequency needs to carefully matched with the data availability, to avoid unnecessary load or delays.
Tools and Dependencies – Identification of Tools/Dependencies plays key role in defining the hosting of jobs. Desired dependencies needs to be accessible in a secure way by the scheduled job.
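To illustrate the "Retry on Failures" consideration above, here is a minimal sketch of a retry wrapper with a capped number of attempts; the function names are hypothetical placeholders, not part of any Sitecore or Azure API:

```javascript
// Minimal retry sketch: run a job function, retrying up to `maxAttempts`
// times with a fixed delay between attempts before surfacing the failure.
async function runWithRetry(runJob, maxAttempts = 3, delayMs = 1000) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await runJob();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Wait before the next attempt (exponential backoff is a common variant)
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  }
  // All attempts failed: alert the job administrator here (email/Teams/etc.)
  throw lastError;
}
```

In practice, the final failure path is where the reporting channel (email, Teams webhook, monitoring alert) would be invoked.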

Here are some key approaches that work well for Sitecore implementations:

The Sitecore Way
Scheduling jobs in Sitecore comes with multiple options, including Scheduled Tasks in the Sitecore interface, scheduling agents, SiteCron, etc. Sitecore PowerShell Extensions with Remoting is a great option when jobs need to be triggered by external actions, or when flexibility is needed to change the flow/behavior without development. Using Sitecore scheduling options when there is no Sitecore interaction/involvement might adversely affect CM performance, and hence needs to be evaluated thoroughly.

The Cloud Way
Azure WebJobs can be utilized for scheduling if the task needs to run in the context of App Services; otherwise, Logic Apps is recommended for automating tasks/workflows. Azure Logic Apps comes with a unique URL, allowing it to be invoked as and when needed, and it also provides options to configure the appearance of reports/emails. Azure Logic Apps is usually combined with API Management (and sometimes Azure Functions) for additional security (whitelisting/blacklisting, AD integration, etc.), load balancing and failover, response caching for certain kinds of requests, and extensive monitoring and telemetry capabilities.
If the implementation runs on AWS, Batch jobs, Lambda functions, API Gateway services, etc. can be utilized.
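As a sketch of the on-demand invocation mentioned above, a Logic App with an HTTP trigger is started by POSTing to its callback URL. The URL below is a hypothetical placeholder (real callback URLs carry api-version/sp/sv/sig query parameters copied from the Azure portal):

```javascript
// Hypothetical Logic App HTTP-trigger callback URL (copy the real one,
// including its signed query parameters, from the Azure portal).
const logicAppUrl =
  "https://prod-00.eastus.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke";

// Build the POST request that triggers a run with a JSON payload.
function buildTriggerRequest(payload) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  };
}

// Usage (not executed here): a 202 Accepted status means the run was queued.
// const res = await fetch(logicAppUrl, buildTriggerRequest({ job: "report" }));
```

Putting API Management in front of this URL is what enables the IP restrictions, caching, and telemetry described above without changing the caller.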

Windows Server
Windows Task Scheduler is the most common job scheduling mechanism for IaaS/on-prem instances. It is usually preferred for jobs that don't involve Sitecore interactions and is commonly paired with PowerShell scripts. Windows Task Scheduler also allows scheduling multiple actions for a specific trigger.

The Search Platform Way
There might be scenarios where external repository data needs to be pulled into the search platform for presenting on the site as-is (without storing/versioning in Sitecore). In such cases, the data can be imported directly into the search platform instead of going through the Sitecore scheduler, reducing load on the CM, especially when the job is expected to process large volumes of data. Solr allows importing data directly with the update/dataimport handlers, which can be scheduled to run automatically with PowerShell/curl (and automated with the scheduling options available on the hosting platform). For Coveo-based implementations, the same can be achieved with the out-of-the-box options or the Coveo Push API and scheduling.
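Since a Solr dataimport is just an HTTP call, any scheduler can kick it off. A small sketch of building the handler URL (the host and core name are hypothetical):

```javascript
// Sketch: build the Solr dataimport handler URL for a scheduled import.
// Host and core name are hypothetical; adjust command/clean/commit as needed.
function buildDataImportUrl(solrHost, core, command = "delta-import") {
  const params = new URLSearchParams({
    command,          // "full-import" or "delta-import"
    clean: "false",   // keep existing documents for delta runs
    commit: "true",   // commit after the import completes
  });
  return `${solrHost}/solr/${core}/dataimport?${params.toString()}`;
}

// A scheduler (Task Scheduler, cron, Logic App, etc.) can then call:
// await fetch(buildDataImportUrl("http://localhost:8983", "products"));
```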

The Container Way
It is certainly possible to run scheduled jobs in Docker/Kubernetes using the CronJob API and other open-source add-ons like Tasker, Ofelia, etc. The image for the cron job needs to be built on top of the relevant base image(s). This option works well when the scheduled job depends only on components within the container.
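As a sketch, a Kubernetes CronJob manifest for such a containerized job might look like the following (the job name, image, and schedule are hypothetical):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: content-import-job      # hypothetical job name
spec:
  schedule: "0 2 * * *"         # every day at 02:00
  concurrencyPolicy: Forbid     # one of the concurrency constraints discussed above
  jobTemplate:
    spec:
      template:
        spec:
          containers:
            - name: content-import
              image: myregistry/content-import:latest  # built on top of the base image(s)
          restartPolicy: OnFailure                     # retry-on-failure behavior
```

Note how `concurrencyPolicy` and `restartPolicy` map directly to the concurrency and retry considerations listed earlier.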

With the wide range of scheduling options available, spending time evaluating the options and designing the job scheduling is crucial to achieving an effective, long-lasting solution.

Happy Scheduling!

Achieving End-User Visibility and Traceability for Sitecore Implementation with Azure AppInsights

It is quite common that an application or its integrations don't behave uniformly under all circumstances for all users, browsers, devices, regions, etc. Most performance issues and end-user exceptions are detected only after they are reported by an end user, by which time they may already have affected user engagement and retention.

DevOps needs automated feedback from end-users to facilitate a stunning experience for them

Setting up real-time instrumentation is a key solution that provides end-user visibility and traceability, giving performance and experience feedback for the site in real time across all users, browsers, devices, and segments. Depending on the current platform and the level of visibility/feedback needed, there are several real-time instrumentation solutions available, ranging from Pingdom RUM and Site24x7 RUM to New Relic, AppDynamics, etc.

AppInsights JS SDK:
The AppInsights JS SDK allows capturing the client-side experience in Azure AppInsights. This open-source SDK helps record several critical insights:

  • Real-user page load times
  • Real-user load times of AJAX calls, js/css, media, external dependencies etc.
  • External dependency load failures and status code
  • Client-side exceptions and stack traces
  • Request/response headers for AJAX requests, js/css, external dependencies etc.
  • User time spent on pages
  • User Device, OS etc.
  • User IP, Location etc. (if enabled)
  • Correlation between client-side and server-side requests for an operation
  • Browser Link Requests tracking

The AppInsights JS SDK also allows recording custom attributes (e.g., user profile attributes, segments, etc.; make sure to review your compliance strategy for GDPR, CCPA, etc.) in AppInsights, which is a key differentiator. Telemetry capture can also be disabled for certain users based on custom conditions.

Setup & Configuration:
AppInsights JS SDK supports both Snippet-based Integration and NPM-based Integration.

If the implementation leverages NPM (e.g., JSS), NPM-based integration can be preferred. The below command can be used to install the module:

npm i --save @microsoft/applicationinsights-web

The below command can be used if looking for the lightweight version:

npm i --save @microsoft/applicationinsights-web-basic

The JavaScript code that needs to be integrated within the layout file, so that it renders on all pages, can be found here.

Snippet-based integration should be opted for when the implementation doesn't use NPM. The JavaScript snippet that needs to be added to the layout can be found here. It is recommended to keep it as high as possible in the head tag so that it captures errors occurring in all dependencies. The HTML Snippet component can be used for Sitecore SXA-based implementations.
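For the NPM-based route, the SDK configuration typically looks like the sketch below. The instrumentation key is a placeholder, and only a couple of the many available options are shown; refer to the official SDK documentation for the full list:

```javascript
// Sketch of the AppInsights JS SDK configuration (NPM-based integration).
// The key is a placeholder; pull the real value from config/environment.
const appInsightsConfig = {
  instrumentationKey: "<your-instrumentation-key>",
  enableAjaxPerfTracking: true,  // capture real-user AJAX timings
  enableCorsCorrelation: true,   // correlate client calls with server operations
};

// With the npm package installed, the config is applied roughly like this:
// import { ApplicationInsights } from "@microsoft/applicationinsights-web";
// const appInsights = new ApplicationInsights({ config: appInsightsConfig });
// appInsights.loadAppInsights();
// appInsights.trackPageView();  // record the initial page view
```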

The instrumentation key for AppInsights can be found in the respective AppInsights resource.

Depending on the number of environments and the DevOps process for the implementation, the instrumentation key could be stored in config files, environment variables, Azure Application Settings, CI/CD pipeline variables, etc. as needed.

Once the setup is complete, the AppInsights JS SDK starts sending data from the user's browser to Azure AppInsights, which can be verified from the Network tab.

Data is usually recorded in Azure AppInsights within 2-3 minutes.

The Page Views table in Azure AppInsights, which would previously be empty, will now show real-time records along with any custom data posted.

The Dependencies table will start recording AJAX calls, external dependencies, etc.

The Exceptions table will now start recording client-side exceptions and failures as well.

Requests, Dependencies and Exceptions tables will also continue to capture server-side application monitoring data as usual.

With the AppInsights JS SDK's unique operation ID and custom IDs (e.g., session ID), client-side page views, dependencies, and exceptions can easily be correlated with server-side requests/dependencies, facilitating early identification, quicker tracing, and faster resolution of critical production end-user issues, thereby providing a stunning experience for all customers.
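As an illustration of that correlation, a Kusto query in the AppInsights Logs blade can join client-side page views with server-side requests on the shared operation ID (the projection below is illustrative, not prescriptive):

```kusto
// Correlate client-side page views with server-side requests
// belonging to the same end-to-end operation.
pageViews
| join kind=inner (requests) on operation_Id
| project timestamp, name, url, operation_Id, duration, resultCode
| order by timestamp desc
```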

Happy Monitoring!