How To Read A File In D365


Instead of manually typing in data, users can upload a file, and then your code can read that file in D365. I will show you how to use helper classes to do both of these things. This is especially helpful when you have many records to bring in. There are several other ways that data can be brought into D365 in bulk. The most common, and preferred, way is to use the data management framework and data entities. These can consume data in formats such as XML and JSON, as well as flat files. However, there are some scenarios where it may make sense to use the file upload capability within Microsoft Dynamics 365 for Finance and Operations. In my view, the file upload code can sometimes be faster to develop, and easier for an end user to use, than the data management framework. That may or may not be a good reason to use this approach. Either way, I wanted to explain how to write X++ code to upload a file in D365.

AX 2012 Vs D365

Before getting into the details, there is an important piece you need to understand. The way you need to read a file in D365 is different than the way you did it in AX 2012 and prior versions.

In prior versions, Dynamics ran as an installed application on your computer. This meant that you could write code that read a file directly from your local hard drive. You could even specify the folder path you wanted to load the file from.

In contrast, D365 F&O runs as a website in a browser. This means that it does not have direct access to your hard drive. Instead, you always need the user to click an upload button and specify which file to read.

The file is then uploaded to Azure temporary blob storage. Your x++ code can then read the file from there.

Upload A File

Let us review the code needed to upload and read a file in D365. Afterwards, we will walk through how to read and parse each line of the file.

First, in Visual Studio, create a new D365 project. Set the model of the project to be your custom model.

Second, create a new class. For this example, I named the class “TutorialUploadFile“. Then, copy the following code.

class TutorialUploadFile
{
    public void readAndProcessFile()
    {
        FileUploadTemporaryStorageResult uploadFileResult = File::GetFileFromUser();
        if(uploadFileResult && uploadFileResult.getUploadStatus())
        {
            //put process file code here.
            this.processFile(uploadFileResult); //We will write this method further down in the article.
        }
        else
        {
            throw error("Unable to load file");
        }
    }
}

While it is possible to create your own custom dialog form, with a ‘file upload’ control, it is much easier to let Microsoft’s code do this for you.

Specifically, the code “File::GetFileFromUser()” will do all the work of opening a dialog and adding a ‘file upload‘ control. The method will return a variable of type “FileUploadTemporaryStorageResult“, which can be used to read the file from Azure temporary blob storage.

To be safe, first check to see that the result is not null, and call “uploadFileResult.getUploadStatus()” to ensure the file uploaded properly.
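As a side note, the upload result exposes a few other helpful methods. Here is a minimal sketch of inspecting the result; it assumes the standard getFileName() and getDownloadUrl() methods on FileUploadTemporaryStorageResult:

    FileUploadTemporaryStorageResult uploadFileResult = File::GetFileFromUser();

    if (uploadFileResult && uploadFileResult.getUploadStatus())
    {
        // getFileName() returns the original name of the uploaded file,
        // and getDownloadUrl() returns its temporary Azure blob storage URL.
        info(strFmt("Uploaded '%1' to %2",
            uploadFileResult.getFileName(),
            uploadFileResult.getDownloadUrl()));
    }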

Next, let us add code that reads the contents of the file.

CommaIO, CommaTextIO, TextIO, AsciiIO are deprecated

When writing code to read a file in D365, there are quite a few classes in X++ that can help you.

In the past, and in other examples online, you may have seen CommaIO, CommaTextIO, TextIO, or AsciiIO. These classes are all now deprecated. You will get a compile warning if you try to use them, and they likely will not work.

Instead, these classes have been replaced by CommaStreamIO, CommaTextStreamIO, TextStreamIO, and AsciiStreamIO respectively.
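To illustrate the change, here is a rough before-and-after sketch. The AX 2012 snippet assumes a hypothetical local path, which is exactly what is no longer possible in D365:

    // AX 2012 and earlier: read directly from a local file path (hypothetical path).
    CommaIo commaIo = new CommaIo(@"C:\temp\names.csv", 'r');
    container oldRecord = commaIo.read();

    // D365: the user uploads the file, and your code reads it from a stream.
    FileUploadTemporaryStorageResult result = File::GetFileFromUser();
    CommaStreamIo commaStreamIo = CommaStreamIo::constructForRead(File::UseFileFromURL(result.getDownloadUrl()));
    container newRecord = commaStreamIo.read();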

Which Helper Class To Use?

Interestingly, all of the helper classes are very similar. For the most part, they differ only in how they instantiate the StreamReader class and which file encoding they use. Additionally, the “Comma*” classes set the default field delimiter to a comma, and all of them set the ‘end of line’ character to a carriage return.

  • CommaStreamIO: Creates a StreamReader object, using the default UTF-8 encoding. Therefore, the file you read must be UTF-8 encoding compatible. Additionally, this class sets the field delimiter to a comma, ‘,‘ by default.
  • CommaTextStreamIO: Creates a StreamReader object, using the byte order marks to automatically identify and set the encoding. Additionally, this class sets the field delimiter to a comma, ‘,‘ by default.
  • TextStreamIO: Creates a StreamReader object, using the byte order marks to automatically identify and set the encoding. The field delimiter is not set by default. Therefore, users should use the inFieldDelimiter(“,”) method to set the field delimiter.
  • AsciiStreamIO: Creates a StreamReader object, setting the encoding to ASCII. The field delimiter is not set by default. Therefore, users should use the inFieldDelimiter(“,”) method to set the field delimiter.

See the Microsoft documentation for System.IO.StreamReader for more information. Also, see the System.Text.Encoding class documentation to learn more about different file encodings.
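Because, as noted above, these helpers mostly differ in how they construct the StreamReader, you can also fall back to .NET interop directly if none of them fits your file. A minimal sketch, assuming you already have the uploaded file as a System.IO.Stream variable named stream:

    // Read the uploaded stream line by line via .NET interop.
    System.IO.StreamReader reader = new System.IO.StreamReader(stream);

    // Peek() returns -1 once the end of the stream has been reached.
    while (reader.Peek() >= 0)
    {
        str line = reader.ReadLine();
        info(line);
    }

    reader.Dispose();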

If you want the most generic approach, I recommend using the TextStreamIO class. Just remember to set the field delimiter and record delimiter for your use case.

Below, I will show you how to use each helper class.

Read The File

After you upload a file in D365, you need to read the data in the file. Copy and paste the following code into your class. In my example, I am using the TextStreamIO class. However, I left a commented-out line of code showing how to use each of the other classes.

    public void processFile(FileUploadTemporaryStorageResult _uploadFileResult)
    {
        str firstName;
        str lastName;
        container recordContainer;
        tutorialNames tutorialNames;

        str fileURL = _uploadFileResult.getDownloadUrl();
        System.IO.Stream stream = File::UseFileFromURL(fileURL);
        TextStreamIo inFile = TextStreamIo::constructForRead(stream);
        //AsciiStreamIo inFile = AsciiStreamIo::constructForRead(stream);
        //CommaStreamIo inFile = CommaStreamIo::constructForRead(stream);
        //CommaTextStreamIo inFile = CommaTextStreamIo::constructForRead(stream);

        inFile.inFieldDelimiter(','); //Replace the comma here with your separator.
        inFile.inRecordDelimiter('\r\n'); // \r\n is a carriage return or 'end of line'.

        //read() reads the line and returns a container,
        //based on the inFieldDelimiter and InRecordDelimiter specified.
        recordContainer = inFile.read();

        while (inFile.status() == IO_Status::OK)
        {
            firstName = conPeek(recordContainer,1);
            lastName = conPeek(recordContainer,2);
            info(strFmt("FirstName: %1 LastName: %2", firstName, lastName));
            
            //Insert the data into the tutorialNames table if it does not exist already.
            select firstonly tutorialNames
                where tutorialNames.FirstName == firstName
                && tutorialNames.LastName == lastName;

            if (!tutorialNames)
            {
                tutorialNames.FirstName = firstName;
                tutorialNames.LastName = lastName;
                tutorialNames.insert();
            }

            //Read the next line
            recordContainer = inFile.read();
        }
    }

Explaining The Code

To explain, let us walk through how the code works.

Read The File Data

First, notice the FileUploadTemporaryStorageResult object passed into the method. This object contains all the information about the file we just uploaded.

Second, see these three lines of code:

        str fileURL = _uploadFileResult.getDownloadUrl();
        System.IO.Stream stream = File::UseFileFromURL(fileURL);
        TextStreamIo inFile = TextStreamIo::constructForRead(stream);

The method “getDownloadUrl” gets the Azure temporary blob storage path to the file that was uploaded.

Next, the “File::UseFileFromURL(fileURL)” method creates a web client memory stream of the file’s contents.

Finally, TextStreamIO, in this case, instantiates a class that makes it easier for us to read each row and field from the file. You can replace this helper class with one of the others as the scenario allows.

Set The Field Delimiter And Record Delimiter

Whenever you read a file in D365, it is important to specify the character that separates each field. Likewise, you should also specify the record delimiter. See the following code.

inFile.inFieldDelimiter(','); //Replace the comma here with your separator.
inFile.inRecordDelimiter('\r\n'); // \r\n is a carriage return or 'end of line'.

In this example, we pass the ‘comma’ character into inFieldDelimiter since we will load a .csv file. However, you can use this method to set whatever delimiter your file uses, such as a pipe character, ‘|’, or a tab.

Similarly, the inRecordDelimiter method sets which character separates each row. Most often, a row is separated by a carriage return, also called an ‘end of line’. This is represented in code by ‘\r\n’.
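For instance, if your file were pipe-delimited or tab-delimited instead of comma-separated, the calls might look like this:

    // Pipe-delimited fields, one record per line.
    inFile.inFieldDelimiter('|');
    inFile.inRecordDelimiter('\r\n');

    // Or, for a tab-delimited file, use the tab escape character.
    inFile.inFieldDelimiter('\t');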

Read Each Row Of The File

Finally, after specifying all the information the system needs to properly read a file in D365, we can actually loop through each record. See the following code:

        //read() reads the line and returns a container,
        //based on the inFieldDelimiter and InRecordDelimiter specified.
        recordContainer = inFile.read();

        while (inFile.status() == IO_Status::OK)
        {
            firstName = conPeek(recordContainer,1);
            lastName = conPeek(recordContainer,2);
            info(strFmt("FirstName: %1 LastName: %2", firstName, lastName));
            
            //Insert the data into the tutorialNames table if it does not exist already.
            select firstonly tutorialNames
                where tutorialNames.FirstName == firstName
                && tutorialNames.LastName == lastName;

            if (!tutorialNames)
            {
                tutorialNames.FirstName = firstName;
                tutorialNames.LastName = lastName;
                tutorialNames.insert();
            }

            //Read the next line
            recordContainer = inFile.read();
        }

First, the code reads the first line of the file and stores it in a container. If your file has a header row with column names, call read() one extra time before the loop so that the loop starts at the first line of actual data.
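For example, here is a quick sketch of skipping a header row:

    // The first read() consumes the header row (the column names).
    recordContainer = inFile.read();

    // The second read() returns the first row of actual data,
    // which the while loop shown above then processes.
    recordContainer = inFile.read();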

Second, the code has a ‘while loop’ that keeps looping until there are no more records to read and the status no longer returns “IO_Status::Ok“.

Third, the code inside the ‘while loop’ reads each field from the container. In this example, my file only has two columns, so I call conPeek only twice, changing the second parameter for each call.

            firstName = conPeek(recordContainer,1);
            lastName = conPeek(recordContainer,2);
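If the file might contain blank or malformed lines, it can be worth guarding these calls with conLen, for example:

    // Only read the fields if the record actually contains two values.
    if (conLen(recordContainer) >= 2)
    {
        firstName = conPeek(recordContainer, 1);
        lastName  = conPeek(recordContainer, 2);
    }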

Finally, do something with the data you have read. In this example, I wrote code to check whether a record already exists in the table ‘tutorialNames’. (tutorialNames is a custom table I created for this example.) If the record does not already exist, populate a table buffer variable, then insert the record.
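If you expect very large files, one option is to batch the inserts with the RecordInsertList class instead of inserting row by row. Below is a rough sketch, assuming the same custom tutorialNames table; note that it skips the duplicate check for brevity:

    RecordInsertList insertList = new RecordInsertList(tableNum(tutorialNames));
    tutorialNames namesBuffer;

    recordContainer = inFile.read();
    while (inFile.status() == IO_Status::OK)
    {
        namesBuffer.clear();
        namesBuffer.FirstName = conPeek(recordContainer, 1);
        namesBuffer.LastName  = conPeek(recordContainer, 2);
        insertList.add(namesBuffer);

        recordContainer = inFile.read();
    }

    // Sends all buffered records to the database in one set-based operation.
    insertList.insertDatabase();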

Test Upload A File

Now that the code is written to upload a file in D365, and read a file in D365, we can test the process.

To be able to run our code, we need to do three things first.

Create Runnable Class

First, create a class that can call the class we wrote, and can be called from an action menu item. In Visual Studio, right click the project, and select Add>New Item. Next, in the dialog that pops up, select Dynamics 365 Items>Runnable Class (Job) and type ‘CallTutorialUploadFile‘ as the name. Feel free to replace this with a name you choose. Then, click the ‘Add‘ button.

Copy the following code into the class. This code creates a ‘main’ method, allowing the class to be called from a menu item. Inside that method, the code calls the ‘upload file’ class we created earlier. If you named your class something different, use that name here.

class CallTutorialUploadFile
{
    public static void main(Args _args)
    {
        TutorialUploadFile tutorialUploadFile = new TutorialUploadFile();
        tutorialUploadFile.readAndProcessFile();
    }
}

Create Action Menu Item

Second, create an action menu item to call the class you created in the previous step. In Visual Studio, right click the project, and select Add>New Item. Next, in the dialog that pops up, select Dynamics 365 Items>Action Menu Item and type ‘CallTutorialUploadFile’ as the name. Note, it is a best practice to name the menu item the same as the object it calls. Feel free to replace this with a name you choose. Then, click the ‘Add’ button.

Next, open the menu item. Then, right click the menu item, and select ‘Properties‘.

Set the Label property. In my example, I set it to “Call tutorial upload file“. This is what will show on the D365 menu.

Then, set the Object Type property to ‘Class‘. Lastly, set the Object property to ‘CallTutorialUploadFile‘, or whatever you named your class.

Note, whenever a class is called by an action menu item, the system will call the ‘static main‘ method on that class.
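For example, if you ever need to pass a value from the menu item to the class, the menu item’s ‘Parameters’ property arrives through the Args object. A small sketch (the parameter usage here is hypothetical):

    public static void main(Args _args)
    {
        // The action menu item's 'Parameters' property, if set, is available via parm().
        str menuItemParameter = _args ? _args.parm() : '';

        if (menuItemParameter)
        {
            info(strFmt("Menu item parameter: %1", menuItemParameter));
        }

        TutorialUploadFile tutorialUploadFile = new TutorialUploadFile();
        tutorialUploadFile.readAndProcessFile();
    }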

Extend A Menu

Third, extend a menu, and add the action menu item to it. In this example, we will add the menu item to the ‘Accounts Receivable’ menu.

Depending on whether you have already extended this menu in the past, you will take a different action.

If you have never extended the Menu before, go in the Application Explorer to AOT>User Interface>Menus>AccountsReceivable. Then, right click and select ‘Create extension‘.

However, if you have extended this menu before, go to Application Explorer to AOT>User Interface>Menu Extensions>AccountsReceivable. Then, right click and select ‘Add to project‘.

Finally, open the menu extension, and drag the ‘Action menu item’ you created in the previous step to the location in the menu where you want it to be shown. In my example, I dragged the ‘CallTutorialUploadFile’ action menu item to a new submenu named ‘UploadFiles’.

Test It Out

Finally, we are ready to read a file in D365.

Compile your solution, then navigate to your D365 instance in a browser.

Next, select the ‘call tutorial upload file’ menu item.

When the dialog pops up, click the ‘Browse‘ button.

Select the appropriate file from your computer. In this example, I selected a .csv file with two columns: a FirstName column and a LastName column. Then, click ‘Open’.

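For reference, a made-up two-column sample with no header row (the names are purely illustrative) might look like this:

    John,Smith
    Jane,Doe
    Sam,Jones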

Wait a moment for the file to upload. Once the file is uploaded, your code will run.

I can see that the records were successfully inserted into my custom table, and displayed on my custom form, named TutorialNames.

Conclusion

In this lesson you learned how to upload and read a file in D365. Compared to previous versions of Dynamics, the process and the code needed have changed. You now need to upload the file to Azure temporary blob storage, then read the data within the file. However, this is made easy by using the File and TextStreamIO classes. While there are other ways of reading data into D365, this approach can be helpful in certain scenarios.

Peter Ramer
Peter Ramer is a part of the Managed Application Services team at RSM working on Microsoft Dynamics 365. He focuses on the Retail and Commerce industries. When he is not solving problems and finding ways to accelerate his clients' business, he enjoys time with his three kids and amazing wife.


11 thoughts on “How To Read A File In D365”


  1. Super helpful for me, thank you! One question I have is whether there are any published limitations on the temporary file storage. My main concern is how long will the file be available in the temporary storage, is there a process that cleans these files up? Do we need to be responsible to clean them up or Microsoft will take care of that? For example, can we have a batch dialog prompt a user for the file, the file goes into temporary storage, and the batch data contract gets the download URL to the file so that the batch job can process it. Is it safe to assume that the file will still be there in the temporary storage by the time the batch job starts or is it possible it can go out of scope or get cleaned up?

    Thank you again for the great article!

    1. This is a great question. I do not know for sure. But from my experience, I would imagine that Microsoft keeps the file in temporary storage until the file is processed. I have not run into a scenario where the file no longer existed when the job ran.

      Let me know if you see anything different. 🙂
      You don’t have to clean up the file yourself. Microsoft will handle it.

    1. You could hide all the other columns in the grid except the one field you want to export and then export to excel. Or, after exporting just delete or hide any columns you don’t want to see.

  2. This is a very helpful post showing the simplest way to read and process the file using the Browse button. I have one question, is there a way to restrict the file types for uploading? For example, I only want to select one of the CSV files.

    1. I know there used to be a way to filter by the file type. I think that now in D365 the actual file explorer that opens might be out of the control of the browser. But I could be misremembering on that. I will have to do some checking and get back to you.
