How To Run A D365 Job In Production


There are scenarios where something cannot be fixed in Microsoft Dynamics 365 for Finance and Operations without running some code. In those cases you need to write and run a D365 Job. Until recently, in order to get this code into a Production environment, you needed to apply a package in Lifecycle Services (LCS). This would make the environment unusable for up to 5 hours while the package was applied. In this article, I will explain how to run a D365 job in Production with no downtime.

Overview

A D365 job, or custom script as they are sometimes called, can be applied to a Test, Stage, or Production environment in two ways.

The first is to apply the job as part of an all-in-one code package using Lifecycle Services. This option takes fewer steps; however, it requires bringing the environment down for potentially several hours. If you are already planning to promote other code to the environment, a D365 runnable job can be included.

The second option is to create a new model, write the job as part of that model, and create a deployable package of just that model. Then, upload that deployable package to the destination environment, approve it, test it, and finally run it. This option requires more steps. However, it involves zero downtime and can be done without delay.

Write A D365 Runnable Class (Job)

Both options require you to write a runnable class, also called a ‘job’.

Previously, I wrote an article on how to write a D365 runnable class (job). That article will explain the detailed steps on how to create a project in Visual Studio and write the code needed for this type of job.

To start, follow the steps in that article to write the job. Afterwards, I will explain how you can now move that code into Production without applying a package and bringing the environment down.

Apply a Deployable Package Through LCS

Before explaining the new steps, I want to explain the old steps so that you understand the differences, and how great this new feature is. These are the steps to run a job when deploying the code as a package through Lifecycle Services (LCS).

Due to the number of steps, I will save these specific steps for another article. However, I wanted to show that the previous steps to move the code require bringing down Stage and then Production for a period of time. The new steps do not.

First, write the X++ runnable class (job), and then check the code into source control.

Second, run a build pipeline in DevOps to take all of the custom source code and build a deployable package.

Third, download the package from DevOps and upload it to the asset library in Lifecycle Service (LCS). This is located at lcs.dynamics.com.

Fourth, in LCS, navigate to the environment detail page. Then, select Maintain>Apply updates.

Fifth, select your package from the list of software deployable packages. Then click the ‘Apply’ button.

Applying A Package To Production

Note, packages are usually applied to Tier 2 environments, and not cloud-hosted development machines. Development machines are connected to source control, so the latest code can be retrieved by performing a ‘get latest’. However, there are still exceptions where a package may need to be installed on a development environment using the command line, such as code that is not written or owned by you (an ISV solution, for example).
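As a rough sketch of that command-line install, Microsoft's AXUpdateInstaller tool is run from the extracted package folder on the development machine. The runbook name below is arbitrary, and the exact steps may vary by platform version, so treat this as an outline rather than a definitive script:

```
:: Run from the folder where the deployable package zip was extracted.
AXUpdateInstaller.exe generate -runbookid="MyRunbook" -topologyfile="DefaultTopologyData.xml" -servicemodelfile="DefaultServiceModelData.xml" -runbookfile="MyRunbook.xml"
AXUpdateInstaller.exe import -runbookfile="MyRunbook.xml"
AXUpdateInstaller.exe execute -runbookid="MyRunbook"
```

See Microsoft's documentation on installing deployable packages from the command line for the full, authoritative steps.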

Additionally, to move code to a Production environment, the code first needs to be applied to a STAGE environment. From there, the package is marked as a ‘release candidate’ and then scheduled to be applied to Production.

Previously, after all these steps, the user would go to a specific URL to finally run the job.

Create A New Model and Job

Now, there is a process that, while it involves a few more steps, results in zero downtime in Production.

In order to run a D365 job in Production without any downtime, there are still a number of steps that need to be performed. To start, below are the steps for creating a new model, writing the X++ job, and creating a deployable package.

First, even before writing the job, create a new model for storing the code. This model is separate from the model that stores the rest of your custom code; its only purpose is to hold the job you are about to write. See my article: How to Create A Model In D365. I named my model “dynamics365musingsRunnableJobOne”.

Create A New Project

Second, create a new project in Visual Studio, and set the model on that project to the one you created in step one. See my article: How to Create A D365 Project In Visual Studio. To do this, right click on the project and select Properties. Then set the ‘Model‘ property to the name of the model you just created.

Create A New Job

Third, write the X++ runnable class (job) using Visual Studio. See the article How to Write a D365 Runnable Class (Job).

For this example, name the job “Dynamics365musingsRunnableJobOne”. The job name needs to be unique each time.

Next, it is important to say again that this process should typically only be used to modify data that cannot be modified using the user interface. Jobs can sometimes bypass validation which is not desirable. Therefore, you should always use the user interface to make data changes when possible. However, there are some cases where you need to run a D365 job.

In this example, I wrote a job to change the customer group on a particular customer in the ‘USRT’ company. While this can definitely be done using the user interface, I wanted to provide an example that would work on demo data and be easy for everyone to try themselves.

class Dynamics365musingsRunnableJobOne
{
    /// <summary>
    /// Change the customer group to 80 or 30.
    /// </summary>
    /// <param name = "_args">The specified arguments.</param>
    public static void main(Args _args)
    {
        CustTable custTable;

        ttsbegin;
        // Select the customer record for update (the second parameter is forUpdate).
        custTable = CustTable::find("004003", true);

        if (custTable.CustGroup == "30")
        {
            custTable.CustGroup = "80";
        }
        else
        {
            custTable.CustGroup = "30";
        }
        custTable.update();

        ttscommit;

        info(strFmt("Updated customer group on customer %1 to %2", custTable.AccountNum, custTable.CustGroup));
    }

}

Notice, the job itself will change the customer group to ’80’ if it is currently ’30’. Similarly, the job will change the customer group to ’30’ if it currently is ’80’. This way, the job can be run multiple times and a change can be seen.
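Since jobs can bypass the validation the user interface normally performs, a more cautious variant can call the table's own validation method before committing. Below is a sketch only: the class name is made up for illustration, and whether `validateWrite()` covers the checks you care about depends on the table in question.

```
class Dynamics365musingsValidatedJobSketch
{
    public static void main(Args _args)
    {
        CustTable custTable;

        ttsbegin;
        custTable = CustTable::find("004003", true);
        custTable.CustGroup = "80";

        // Run the table-level validation that the UI would normally trigger.
        if (custTable.validateWrite())
        {
            custTable.update();
            ttscommit;
            info(strFmt("Updated customer group on customer %1", custTable.AccountNum));
        }
        else
        {
            // validateWrite() has already written the reason to the infolog.
            ttsabort;
        }
    }
}
```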

Create Deployable Package

After creating the x++ job in a new Model, we need to create a deployable package of just this code.

For this, use Visual Studio to create a Deployable Package of this Model. Typically, deployable packages are created using a build pipeline in DevOps. However, in this case, this model will only be used to store this custom job. Therefore, we can just use Visual Studio directly.

In Visual Studio, go to Dynamics 365 > Deploy > Create Deployment Package. Next, check the checkbox next to the new Model you created for storing this job.


Then, specify a location to save the deployable package file. Finally, click the ‘Create‘ button.

Notice, a file will be created in the Package file location specified.

Feel free to review Microsoft’s documentation on these steps.

Assign Duties To Users

At this time, we have the deployable package. Before we can upload and run a D365 job in Production, we need to provide the required security roles to allow this.

As part of this process, we need two users.

  1. First, one user needs the ‘Maintain custom scripts‘ duty. This duty allows this user to upload the package, test, and run the custom x++ script, or job.
  2. Secondly, a different user needs the ‘Approve custom scripts‘ duty. This duty allows the user to approve a custom x++ script (job) that has been uploaded by the first user.

Note, these are security ‘duties‘ and not roles. Therefore, you will not be able to add them directly on the Users form in D365 by clicking the ‘assign roles‘ button. However, you can use the System administration>Security configuration form to add a duty to an existing role. Then, assign that role to a user. See my article Customize Security in D365 for more detailed instructions on how to do this.

After adding the duties to an existing role, go to System Administration>User>Users. Select the user to view the details of that user. Then, click the ‘Assign roles‘ button to add these roles to the user.

See my article Security in D365 for more detailed steps.

Ultimately, the reason for having two separate users is so that no single user can upload and run the job in Production. A different user always has to approve a job before it can be run. This is to minimize the risk of malicious action or accidents. Even if a user is assigned both duties, that user still won’t be able to approve their own scripts.

Upload The Deployable Package

Next, in order to run a D365 job in Production, we need to upload and run the package in a new form in D365.

Note, the following steps are actually the same for any Tier 2 environment, such as TEST, STAGE, and Production. Therefore, it is highly recommended that you perform these steps first in a TEST or STAGE environment before performing them in Production. This helps to ensure the job works as expected.

First, go to System administration>Periodic tasks>Database>Custom scripts. Note, if you do not see this menu item, make sure that you are in a Tier 2 environment with the proper security roles.

Second, click the ‘Upload’ button.

Third, click the ‘Browse‘ button and specify the deployable package that you created earlier. Additionally, you must enter text into the ‘Purpose’ box. Finally, click the Upload button on the dialog box.

Approve The D365 Job

After uploading the deployable package, before you can run a D365 job, a user needs to approve it. However, this must be a different user than the one who uploaded the deployable package.

Since jobs can be dangerous to run, the purpose of this approval is to ensure more than one person has reviewed the job before it is run.

First, go to System administration>Periodic tasks>Database>Custom scripts.

Second, select the script to be approved.

Third, click the ‘Details‘ button.

Fourth, click the ‘Process workflow‘ tab. Then, click the ‘Approve‘ button.

Finally, click ‘Yes‘ on the dialog that pops up.

Importantly, if the same user that uploaded the package tries to approve it, an error message will be shown.

After the script has been approved, the script must be tested.

Test The D365 Job

Next, before you can run a D365 job, it must go through a test run. Either the uploader or the approver can run this test.

First, go to System administration>Periodic tasks>Database>Custom scripts.

Second, select the script to be tested.

Third, click the ‘Details‘ button.

Fourth, click the ‘Process workflow‘ tab. Then, click the ‘Run test‘ button.

The system will run the job; however, it will not commit any changes the job makes. Review the logs and ensure the results are as expected. In the ‘Process workflow‘ tab, click ‘Accept test log‘ if the job ran as expected. Otherwise, click ‘Abandon‘.
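This test-run behavior is conceptually similar to wrapping the job's logic in a transaction that is deliberately rolled back. To be clear, this is not how the Custom scripts feature is implemented internally; it is just an analogous X++ pattern you could use to dry-run your own logic in a lower environment (the class name here is made up):

```
class Dynamics365musingsDryRunSketch
{
    public static void main(Args _args)
    {
        CustTable custTable;

        ttsbegin;
        custTable = CustTable::find("004003", true);
        custTable.CustGroup = "80";
        custTable.update();

        info(strFmt("Would set customer %1 to group %2", custTable.AccountNum, custTable.CustGroup));

        // ttsabort rolls the transaction back, so nothing is committed.
        ttsabort;
    }
}
```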

Run The D365 Job

Finally, you can run a D365 job in Production. After the job has been uploaded, approved, and run once as a test, it can be run for real.

Under the ‘Process workflow‘ tab, click the ‘Run‘ button.

This time the job will be run and any changes will be committed to the database.

Lastly, under the ‘Process workflow‘ tab, click either ‘Purpose resolved‘ or ‘Purpose unresolved‘.

Both options lock the job so that it cannot be run again. The difference is that ‘Purpose resolved‘ updates the event log to indicate that the script ran successfully, whereas ‘Purpose unresolved‘ updates the event log to indicate that the script ran unsuccessfully.

Microsoft Documentation

Microsoft has also created some good documentation on this process. Please feel free to review it as well.

Conclusion

The ability to run a D365 job in a Production environment is extremely helpful. It allows users to fix data that cannot be fixed using the user interface. Until recently, the only way to move these jobs into Production was to promote them with other custom code. This required promoting the change first to a Stage environment and then scheduling a promotion to Production, which included bringing the Production environment down for several hours. This was not ideal when the fix was urgently needed, or when bringing the environment down could negatively impact the business.

Now, users can run a D365 job in Production with zero downtime. There are some additional steps required for this option. However, this option is extremely helpful in the right conditions.

Peter Ramer
Peter Ramer is a part of the Managed Application Services team at RSM working on Microsoft Dynamics 365. He focuses on the Retail and Commerce industries. When he is not solving problems and finding ways to accelerate his clients' business, he enjoys time with his three kids and amazing wife.


4 thoughts on “How To Run A D365 Job In Production”


  1. Hi Peter, thanks for a lot of useful articles. Please, is there a way that I can reuse code with a little adjustment, but without making a new model? When I try to use a new deployable package from the same model: A deployable package with the assembly signature ‘Dynamics.AX.XYZ_FIX, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null’ already exists. The system will reuse the implementation from the existing package. Do you want to continue?

  2. Peter,
    Until I read this, I had no idea this was even possible:

    I have a couple of questions related to this.

    1. Does every new runnable class like this require a separate model or can they be placed under a single model?
    2. These runnable classes can be checked into TFS just like any other code, correct? Is there a reason not to check it in?

    Thanks in advance!

    1. Hi Russell. I am glad to have shared something new with you.
      Originally I thought that you did need a new model every time you wanted to run the same job again, but I do believe you can just upload the same model a second time if you want to run the exact same job again.
      If you do write a new job that contains different code, you do need to create a new model each time.

      You can check this model into source control. But unlike your core code, there really isn’t a need to. The idea is that your core code needs to be promoted to all environments: dev, test, and prod. Whereas these runnable jobs are just meant to be used for one-time fixes to very specific problems. So there isn’t a need to have them checked into source control and promoted between code branches and all environments. But it doesn’t hurt anything if you do check the code in.
