
Database Replication – Part V

This post is a continuation of my earlier posts. Please check them if you missed any.

Database Replication – Part I

Database Replication – Part II

Database Replication – Part III

Database Replication – Part IV

As promised, in this post we will continue with the practical approach, picking up the next step from the last post.

We will create a subscription using SQL Server Management Studio

To create the subscription

  • Connect to the Publisher in SQL Server Management Studio, expand the server node, and then expand the Replication folder.
  • In the Local Publications folder, right-click the Nav2018ItemTrans publication, and then click New Subscriptions.

DR-39

The New Subscription Wizard launches.

DR-40

  • On the Publication page, select Nav2018ItemTrans, and then click Next.

DR-41

  • On the Distribution Agent Location page, select Run all agents at the Distributor, and then click Next.

DR-42

  • On the Subscribers page, if the name of the Subscriber instance is not displayed, click Add Subscriber, click Add SQL Server Subscriber, enter the Subscriber instance name in the Connect to Server dialog box, and then click Connect.
  • On the Subscribers page, select the instance name of the Subscriber server, and then select a database under Subscription Database.
  • In the New Database dialog box (or pick an existing database from the drop-down list), enter Nav2018ReplDatabase in the Database name box, click OK, and then click Next.

DR-43

  • In the Distribution Agent Security dialog box, click the ellipsis (…) button, enter <Machine_Name>\repl_distribution in the Process account box, enter the password for this account, and then click OK.

DR-44

  • Click Next.

 

DR-45 DR-46 DR-47 DR-48 DR-49

  • Click Finish to accept the default values on the remaining pages and complete the wizard.
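If you prefer scripting, here is a minimal T-SQL sketch of the same subscription setup, run at the Publisher in the published database. The database name Nav2018Database, the Subscriber instance name, and the password are placeholders; the publication, subscription database, and account names are the ones used in this walkthrough.

USE [Nav2018Database];  -- the published database (placeholder name)
GO
EXEC sp_addsubscription
    @publication = N'Nav2018ItemTrans',
    @subscriber = N'<Subscriber_Instance>',
    @destination_db = N'Nav2018ReplDatabase',
    @subscription_type = N'push';  -- push: agents run at the Distributor
GO
EXEC sp_addpushsubscription_agent
    @publication = N'Nav2018ItemTrans',
    @subscriber = N'<Subscriber_Instance>',
    @subscriber_db = N'Nav2018ReplDatabase',
    @job_login = N'<Machine_Name>\repl_distribution',
    @job_password = N'<password>';
GO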

Setting database permissions at the Subscriber

  • Connect to the Subscriber in SQL Server Management Studio, expand Databases, Nav2018ReplDatabase, and Security, right-click Users, and then select New User.
  • On the General page, in the User type list, select Windows user.
  • Select the User name box and click the ellipsis (…) button, in the Enter the object name to select box type <Machine_Name>\repl_distribution, click Check Names, and then click OK.

DR-50

  • On the Membership page, in Database role membership area, select db_owner, and then click OK to create the user.

DR-51
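The same user and role membership can also be created with T-SQL at the Subscriber. A minimal sketch, assuming the <Machine_Name>\repl_distribution Windows login already exists on the Subscriber instance:

USE Nav2018ReplDatabase;
GO
CREATE USER [<Machine_Name>\repl_distribution]
    FOR LOGIN [<Machine_Name>\repl_distribution];
GO
-- Grant the same db_owner membership the wizard steps above set up.
ALTER ROLE db_owner ADD MEMBER [<Machine_Name>\repl_distribution];
GO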

When setting up SQL Server replication you might see an error message from the Transactional Replication Log Reader Agent which reads like the following.

Error messages:

  • The process could not execute ‘sp_replcmds’ on ”. (Source: MSSQL_REPL, Error number: MSSQL_REPL20011) Get help: http://help/MSSQL_REPL20011
  • Cannot execute as the database principal because the principal “dbo” does not exist, this type of principal cannot be impersonated, or you do not have permission. (Source: MSSQLServer, Error number: 15517) Get help: http://help/15517
  • The process could not execute ‘sp_replcmds’ on ”. (Source: MSSQL_REPL, Error number: MSSQL_REPL22037) Get help: http://help/MSSQL_REPL22037

This error usually occurs because the published database does not have a valid owner, or because SQL Server cannot correctly identify the owner of the database.

The easiest fix is often to change the database owner with the sp_changedbowner system stored procedure, as shown below. The sa account is a reliable choice for the new owner.

USE PublishedDatabase
GO
EXEC sp_changedbowner 'sa'
GO
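On newer SQL Server versions, where sp_changedbowner is deprecated, the equivalent is ALTER AUTHORIZATION:

-- Equivalent to sp_changedbowner on current SQL Server versions.
ALTER AUTHORIZATION ON DATABASE::PublishedDatabase TO sa;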

Once the database ownership has been changed, the log reader will probably start working right away. If it doesn't, restarting the log reader should resolve the problem.

While this does require a change to the production database, no outage is needed to make it.

To view the synchronization status of the subscription

  • Connect to the Publisher in SQL Server Management Studio, expand the server node, and then expand the Replication folder.
  • In the Local Publications folder, expand the Nav2018ItemTrans publication, right-click the subscription in the Nav2018ReplDatabase database, and then click View Synchronization Status.

DR-52

The current synchronization status of the subscription is displayed.

DR-53

  • If the subscription is not visible under Nav2018ItemTrans, press F5 to refresh the list.

Validating the Subscription and Measuring Latency

We will use tracer tokens to verify that changes are being replicated to the Subscriber and to determine latency: the time it takes for a change made at the Publisher to appear at the Subscriber.

To insert a tracer token and view information on the token

  • Connect to the Publisher in SQL Server Management Studio, expand the server node, right-click the Replication folder, and then click Launch Replication Monitor.
  • Replication Monitor launches.
  • Expand a Publisher group in the left pane, expand the Publisher instance, and then click the Nav2018ItemTrans publication.
  • Click the Tracer Tokens tab.
  • Click Insert Tracer.
  • View elapsed time for the tracer token in the following columns: Publisher to Distributor, Distributor to Subscriber, Total Latency. A value of Pending indicates that the token has not reached a given point.

DR-54
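Tracer tokens can also be posted and inspected from T-SQL. A sketch, run at the Publisher in the published database (Nav2018Database is a placeholder name):

USE [Nav2018Database];
GO
DECLARE @token_id int;
-- Insert a tracer token into the publication's log stream.
EXEC sp_posttracertoken
    @publication = N'Nav2018ItemTrans',
    @tracer_token_id = @token_id OUTPUT;
-- List all tokens for the publication, then view latency for the new one.
EXEC sp_helptracertokens @publication = N'Nav2018ItemTrans';
EXEC sp_helptracertokenhistory
    @publication = N'Nav2018ItemTrans',
    @tracer_id = @token_id;
GO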

With this, the database replication setup is complete.

I will come up with more topics soon.

 


Database Replication – Part IV

This post is a continuation of my earlier posts. Please check them if you missed any.

Database Replication – Part I

Database Replication – Part II

Database Replication – Part III

As promised, in this post we will continue with the practical approach, picking up the next step from the last post.

Publishing Data Using Transactional Replication

We will create a transactional publication using SQL Server Management Studio to publish a filtered subset of the Item table in the Nav 2018 sample database. We will also add the SQL Server login used by the Distribution Agent to the publication access list (PAL).

To create a publication and define articles

Connect to the Publisher in SQL Server Management Studio, and then expand the server node.

Expand the Replication folder, right-click the Local Publications folder, and click New Publication.

DR-24

The Publication Configuration Wizard launches.

DR-25

On the Publication Database page, select the Nav 2018 database, and then click Next.

DR-26

On the Publication Type page, select Transactional publication, and then click Next.

DR-27

On the Articles page, expand the Tables node, select the check box for the CRONOUS International Ltd_$Item (dbo) table, and then click Next.

DR-28

On the Filter Table Rows page, click Add.

In the Add Filter dialog box, click the Replenishment System column, click the right arrow to add the column to the Filter statement WHERE clause of the filter query, and modify the WHERE clause as follows:

WHERE [Replenishment System] = 1

Click OK.

DR-29

Click Next.

DR-30

Select the Create a snapshot immediately and keep the snapshot available to initialize subscriptions check box, and click Next.

DR-31

On the Agent Security page, clear the Use the security settings from the Snapshot Agent check box.

Click Security Settings for the Snapshot Agent, enter <Machine_Name>\repl_snapshot in the Process account box, supply the password for this account, and then click OK.

DR-32

Repeat the previous step to set repl_logreader as the process account for the Log Reader Agent.

DR-33

Click Finish.

On the Complete the Wizard page, type Nav2018ItemTrans in the Publication name box, and click Finish.

DR-34

After the publication is created, click Close to complete the wizard.

DR-35
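For reference, the wizard steps above correspond roughly to this T-SQL, run at the Publisher in the published database. This is a sketch: the database name is a placeholder, distribution is assumed to be configured already (see the earlier parts of this series), and the article options are left at their defaults.

USE [Nav2018Database];
GO
-- Enable the database for transactional publishing.
EXEC sp_replicationdboption
    @dbname = N'Nav2018Database',
    @optname = N'publish',
    @value = N'true';
-- Create the publication and its Snapshot Agent job.
EXEC sp_addpublication
    @publication = N'Nav2018ItemTrans',
    @status = N'active';
EXEC sp_addpublication_snapshot
    @publication = N'Nav2018ItemTrans',
    @job_login = N'<Machine_Name>\repl_snapshot',
    @job_password = N'<password>';
-- Add the filtered Item table as an article, with the row filter from the wizard.
EXEC sp_addarticle
    @publication = N'Nav2018ItemTrans',
    @article = N'Item',
    @source_owner = N'dbo',
    @source_object = N'CRONOUS International Ltd_$Item',
    @filter_clause = N'[Replenishment System] = 1';
GO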

To view the status of snapshot generation

  • Connect to the Publisher in SQL Server Management Studio, expand the server node, and then expand the Replication folder.

In the Local Publications folder, right-click Nav2018ItemTrans, and then click View Snapshot Agent Status.

DR-36

The current status of the Snapshot Agent job for the publication is displayed. Verify that the snapshot job has succeeded.

DR-37

To add the Distribution Agent login to the PAL

  • Connect to the Publisher in SQL Server Management Studio, expand the server node, and then expand the Replication folder.
  • In the Local Publications folder, right-click Nav2018ItemTrans, and then click Properties.
  • The Publication Properties dialog box is displayed.
  • Select the Publication Access List page, and click Add.

In the Add Publication Access dialog box, select <Machine_Name>\repl_distribution, click OK, and then click OK again to close the properties.

DR-38
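The equivalent T-SQL for the PAL change, run at the Publisher in the published database, is a one-liner:

-- Add the Distribution Agent login to the publication access list.
EXEC sp_grant_publication_access
    @publication = N'Nav2018ItemTrans',
    @login = N'<Machine_Name>\repl_distribution';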

 

We will discuss the next step in our upcoming post.

 

New Changes or features for Application Users in Microsoft Dynamics NAV 2017

Although it is too early to discuss the features and capabilities we are going to get in the new release, and we should wait for exact information until after it ships, here are a few extracts from the documents available pre-release.

You will find most of the features similar to those we discussed in our earlier posts on the Madeira Preview Project.

 

Incoming documents

The list of incoming documents has been modified. It now has a filtered view: a new Processed field has been added, and by default a filter is applied so that only unprocessed incoming documents are shown in the list.

However, you can still choose to view all incoming documents in the list using the Show All action.

When documents are posted, the processed flag is set to Yes, such that incoming documents that have been processed into posted documents are filtered out.

The ability to manually toggle the Processed flag has also been added.

OCR now supports extracting line details when we send PDF or image files for incoming documents to Lexmark ICS, the third-party provider of OCR services.

You can now create purchase invoices with multi-line details.

You can now easily correct OCR errors before receiving the document, and at the same time configure the service to avoid known errors in the future.

You can use the new Lexmark ICS online experience to visually train and validate the OCR processing of PDF and image files for incoming documents, such as vendor invoices.

From each incoming document record, you can now link directly to the online extracted document to validate the OCR result and train the service with your corrections.

A new Role Center cue gives users the ability to monitor incoming documents that await OCR validation.

 

 

Inventory Items

Item attributes are similar to the ones we saw in Madeira.

The ability to add attributes to your inventory items makes it easy to find the right product for your customer by filtering on attributes.

You can define your own attribute types, such as Base Material, Colour, Size, or other product dimensions, and attach them to your items as a supplement to the built-in item attribute types and values.

When you access your items on documents or lists, you can view and filter on the attribute values to limit the list of items.

By categorizing items, you can create a hierarchy of item categories and assign item attributes to each item category.

When you add items to a category, they inherit the category's default attributes.

This ensures a common set of attributes on related items, speeds up the assignment of attributes to similar items, and allows filtering and navigating related items easily based on the category.

 

 

Smarter sales and purchase documents

You can now cancel posted sales and purchase credit memos in the same way as posted sales and purchase invoices.

This action unapplies the credit memo from the invoice and creates a new invoice that is applied to the credit memo.

It brings you back to the position you were in before you created the credit memo (like an undo).

Package Tracking No. and Shipping Agent Code on sales invoices: posted sales invoices now show the package tracking number and the shipping agent code by default.

 

Account Categories in the Chart of Accounts

A new capability, account categories and subcategories, has been added.

You can now group G/L accounts to suit your financial reporting needs.

For each G/L account, you can now specify the account category, so that ledger entries posted to these accounts are categorized accordingly.

For each account category, you can choose from different subcategories, and you can also create new subcategories as needed.

For example, for the G/L account category Expense, you can define as many subcategories as you need to differentiate between expenses from different sources.

 

 

Payment reconciliation

The Payment Reconciliation journal now shows total outstanding transactions and outstanding payments.

From there you can look up the list of documents that have not been applied or used in the reconciliation process.

You can also choose to include outstanding payment and outstanding transaction information in the Payment Reconciliation report.

 

Jobs

Project Manager Role Center

A new Project Manager field has been added to the Job Card to give users a clear sense of job ownership.

A new My Jobs list has been added to the Project Manager Role Center; by default it displays jobs with a status of Order, based on the new Project Manager field discussed above.

If a person is assigned as the project manager on 5 jobs and those jobs have the On Order status, those 5 jobs appear by default in the My Jobs list on the Project Manager Role Center.

A new cue, Create Job Sales Invoice, has been added to the Project Manager Role Center, giving users quick access to this functionality.

On the Project Manager Role Center, a new drop-down list for Job Reports has been added to the Actions ribbon.

Job cost

A new FactBox is available on the right side of the Job Card to display the job's cost information.

The term Contract has been renamed Billable throughout the Jobs module.

On the Job Card, the Allow Schedule/Billable Lines and Apply Usage Link options are enabled by default for new jobs.

A new report called Job Quote is available so users can send a quote to a customer. It is available as a Word template, so it can be modified in Word and emailed to the contact on the job.

 

Fixed Assets

Setting up fixed assets is now simpler. A standard setup is provided, but you can modify it at any time.

You can register fixed assets as cards with complete information, accounting details, and reporting classification.

You can also register purchases of fixed assets through special fixed asset G/L journals, where you can also dispose of fixed assets. The right accounting entries are created automatically.

 

Simplified Setup for using Dynamics CRM from inside Dynamics NAV

As you may know, Dynamics NAV includes opportunity management and basic customer relationship management.

You can easily set up a connection with Dynamics CRM to benefit from the strong capabilities that Dynamics CRM offers, such as marketing and customer service.

A new assisted setup guide on the Business Manager home page will walk you through the setup process. Once you are through with this setup, you'll have a seamless coupling of Dynamics CRM records with Dynamics NAV records.

 

 

Simplified opportunity management and CRM functionality

You will find the Dynamics NAV CRM module has been simplified and improved in a number of ways:

  1. In place of CRM wizard pages you now have card pages, so they can also run in the Web client with an improved user experience.
  2. Worksheet pages have been replaced by list pages, so you can use these pages in browsers as well.
  3. The Contact card has been simplified by marking a number of controls as Additional, so that the page looks simpler by default.
  4. Mail merge has been replaced with Word reporting. New Word reporting capabilities have also been added to generate email body content based on the related report and the selected report layout.
  5. A new Sales & Relationship Manager Role Center has been added.

A new wizard has been created that helps users set up email logging (default public folders, rules, and job queue setup).

 

Smart notifications give you advice and recommendations

These are similar to the ones we discussed in our earlier post on Madeira.

 

Extensions

Microsoft provides the following extensions:

  1. Envestnet Yodlee Bank Feeds
  2. PayPal Payments Standard
  3. QuickBooks Data Migration
  4. Sales and Inventory Forecast

As with Madeira, we have discussed a few of these in earlier posts; the rest we will cover in upcoming posts.

 

Office 365

If your organization uses Office 365, Dynamics NAV 2017 includes an add-in so you can invoice your customers based on entries in your Outlook calendar.

From the Contacts List in Dynamics NAV, you can manually synchronize your Dynamics NAV contacts to Office 365 People.

The contacts from Office 365 will sync back to Dynamics NAV as well.

There is a filter that can be applied to the synchronization process so users will only need to sync the contacts they use most often. This same filter is used during the automatic background sync as well. This synchronization process also works with Microsoft Outlook on the desktop.

We will discuss this in more detail in upcoming posts. This feature is also available in Madeira.

 

 

 

US financial reports

In the US version of Dynamics NAV 2017, four new financial reports have been added to the Business Manager and Accountant Role Centers:

  1. Balance Sheet
  2. Income Statement
  3. Cash Flow Statement, and
  4. Retained Earnings Statement

These reports use the G/L account categories and subcategories discussed above to group financial data.

 

Source:- https://mbs.microsoft.com/Files/partner/NAV/Readiness_Training/ReadinessTrainingNews/WhatsNewDynamicsNAV2017LimitedBeta.pdf

I will come up with more details in my upcoming posts.

 

 

 

How to attach additional 6 Shortcut Dimensions to Dimension Set when copy data to Journal from Excel

 

In my previous post I explained how we can copy data to and fro between Excel and Navision. We also saw the limitation that the 6 additional shortcut dimensions can neither be exported nor imported back.

Today we will see a solution to this limitation, as several customers require it and we keep getting requests for it. The same reader for whom I wrote my previous post has asked for it.

If you missed my previous post you can find it here: Copying data to-and-fro between Excel & Navision

Let us see how we can get this working. Yes, it will require a small customization. Below I explain the steps required to achieve this.

My example refers to Table 81 & Page 39. You can do the same with other journals too.

We will add custom fields for these additional 6 shortcut dimensions.

 

ExcelToNavJnl-8

Add the fields, write the above code in the OnValidate trigger, and set the property accordingly for all 6 shortcut dimension fields.

ExcelToNavJnl-9

Similarly, add a piece of code to the table's OnInsert trigger.

ExcelToNavJnl-10

Add our newly created fields to the page too.

Now let us check whether it works as expected.

ExcelToNavJnl-11

 

Perform the operation as in the previous post; this time add the newly created dimension fields as shown in the screen above.


Export and import back after adding values to these additional dimensions. Dimensions 1 and 2 are my global dimensions, 3-6 are my additional shortcut dimensions, and 7-8 I have not included because they are not set up in my database.

After importing back, check the dimensions: thanks to the above customization, all my shortcut dimensions are attached to the Dimension Set Entry.
ExcelToNavJnl-12

Yes we are done.

I will come up with more information in my upcoming posts. Till then, keep exploring and learning.

 

 

Copying data to-and-fro between Excel & Navision

ExcelToNavJnl

One of my readers has asked me to show how to export data from a NAV journal to Excel, make corrections, and import it back into Navision.

So let us see how we can do this and what the limitations are.

Open the Journal in Navision.

Arrange and show on the page all the fields that you want to export to Excel.

Fill in some sample data, say a single journal line, the way you usually do.

ExcelToNavJnl-2

Now send to Excel using the options shown in the screen below.

ExcelToNavJnl-3

The data gets exported to Excel.

Have you noticed something about the data that got exported to Excel?

Your 2 additional shortcut dimensions, Customer Group Code and Area Code, were not exported to Excel. Why?

Since these are not actual fields in the table but are calculated at page level, you can only export Dim-1 and Dim-2, your global dimensions, which are available as fields in the table.

Make sure you enter dimension values in capitals in the Excel columns.

ExcelToNavJnl-4

Now perform the required changes to the journal data.

Insert new lines, delete existing lines, or edit existing lines.

Make sure you don't keep more than 30,000 to 40,000 lines. Up to that point it works fine; beyond it, either performance is too slow or Navision hangs while you copy your data back. I have tested up to 40,000 lines several times without problems. Depending on your system's performance, you can decide how much data is acceptable.

ExcelToNavJnl-5

As we saw above, my 2 additional dimensions are missing from the exported data. The columns and their sequence must match what we have in Excel, so we will hide/remove the additional columns from the journal to match the sequence of the Excel columns before copying the data back from Excel to Navision.

ExcelToNavJnl-6

Select the rows in Excel containing your data, excluding the header row, and copy them.

Return to your journal and paste, as shown in the screen above.

ExcelToNavJnl-7

Your modified data is imported back into the journal in Navision.

Now process the journal the way you normally do.

 

App for Power BI REST APIs for Streaming Data

In this post we will see how to create an app that uses the Power BI REST APIs for streaming data.

Full documentation: https://powerbi.microsoft.com/documentation/powerbi-service-real-time-streaming/

To run this app follow the steps discussed in my previous post: [Real-Time Dashboard Tile & Streaming Dataset– in Power BI]

Summary as below:

  1. Go to app.powerbi.com
  2. Go to the streaming data management page via new dashboard > Add tile > Custom Streaming Data > manage data
  3. Click “Add streaming dataset”
  4. Select API, then Next, and give your streaming dataset a name
  5. Add a field with name “Customer ID”, type Number
  6. Add a field with name “Customer Name”, type Text
  7. Add a field with name “Sales Value”, type Number
  8. Click “Create”
  9. Copy the "push URL" and paste it as the value of realTimePushURL in the app below

We will start with new project in Visual Studio.

RealTimeSync-12

Create a new Visual C# Console Application.

Open the Program.cs file and write the code shown below.

This app uses the WebRequest sample code documented here: https://msdn.microsoft.com/en-us/library/debx8sh9(v=vs.110).aspx

RealTimeSync-13

For your convenience, here is the code of Program.cs:

 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Net;
using System.IO;

namespace RealTimeStreaming
{
    class Program
    {
        // Paste your own push URL below, as obtained while creating the streaming dataset (saved in step 9 above).
        private static string realTimePushURL = "https://api.powerbi.com/beta/4e7ca966-123e-4ce7-9833-3e858854b98f/datasets/7d154cd7-11e0-4d5e-9569-12fa3f82f224/rows?key=mD1nkJOf426PjPPaEQsW9xEg%2FN1EENQ2hRZvXIpHr%2BTXNk3XQpKsR2Jbe5CATiMoLmxjlzSp%2FIMlbe9HL8G4xQ%3D%3D";

        static void Main(string[] args)
        {
            while (true) // Infinite loop
            {
                try
                {
                    // Declare the values that we will be sending.
                    Random r = new Random();
                    int currentValue = r.Next(0, 100);
                    String Name = "Dummy Name";

                    // Send a POST request to the push URL.
                    WebRequest request = WebRequest.Create(realTimePushURL);
                    request.Method = "POST";

                    // Here you would retrieve the data from its source and format it for the request.
                    // In this example we send the random value generated above, for testing purposes.
                    string postData = String.Format("[{{ \"Customer ID\": {0}, \"Sales Value\":{1} }}]", currentValue, currentValue);
                    Console.WriteLine(String.Format("Making POST request with data: {0}", postData));

                    // Prepare the request for sending.
                    byte[] byteArray = Encoding.UTF8.GetBytes(postData);
                    request.ContentLength = byteArray.Length;
                    // Get the request stream.
                    Stream dataStream = request.GetRequestStream();
                    // Write the data to the request stream.
                    dataStream.Write(byteArray, 0, byteArray.Length);
                    // Close the Stream object.
                    dataStream.Close();

                    // Get the response.
                    WebResponse response = request.GetResponse();
                    // Display the status.
                    Console.WriteLine(String.Format("Service response: {0}", ((HttpWebResponse)response).StatusCode));
                    // Get the stream containing content returned by the server.
                    dataStream = response.GetResponseStream();
                    // Open the stream using a StreamReader for easy access.
                    StreamReader reader = new StreamReader(dataStream);
                    // Read the content.
                    string responseFromServer = reader.ReadToEnd();
                    // Display the content.
                    Console.WriteLine(responseFromServer);
                    // Clean up the streams.
                    reader.Close();
                    dataStream.Close();
                    response.Close();
                }
                catch (Exception ex)
                {
                    Console.WriteLine(ex);
                }
                // Wait 5 seconds before sending again.
                System.Threading.Thread.Sleep(5000);
            }
        }
    }
}

Compile and Run the Program.

RealTimeSync-14

Leave the program running and switch to the Power BI dashboard. You will see the tile you created in the previous post displaying the random value generated by this program, updating every 5 seconds.

That's all. With a little tweaking, this program can fetch your own data and send the updates to your real-time streaming dataset.

That's the end of this post.

I will come up with more details in my upcoming posts.

Till then keep Exploring and learning.

 

 

Real-Time Dashboard Tile & Streaming Dataset– in Power BI

Power BI has introduced real-time dashboard tiles: a lightweight, simple way to get real-time data onto your dashboard. Real-time tiles can be created in minutes by pushing data to the Power BI REST APIs or from streams you've created in Azure Stream Analytics or PubNub, a popular real-time streaming service. Let's see how we can do it in no time.

Login to your Power BI using your credentials.

Go to the dashboard where you wish to add the real-time streaming tile, and choose "Add a tile".

RealTimeSync-1

Select the “Custom streaming data” option

RealTimeSync-2

Click on Next.

RealTimeSync-3

On first use you may not have a streaming dataset yet; if you do, a list will be shown.

Let's create one for our example. Click the Manage Data link.

RealTimeSync-4

Click on Add Streaming Dataset.

RealTimeSync-5

From New Streaming Dataset, Select API and click on Next.

RealTimeSync-6

Add your Dataset Name, Fields and Datatypes.

Once you are done, Click on Create.

RealTimeSync-7

Copy your push URL; we will need it to push data to the data stream.

Click Done.

RealTimeSync-8

Returning to the previous step, we can now see the streaming dataset available.

Select your Dataset and Click on Next.

RealTimeSync-9

Select the type of visualization you want and the fields to display.

Click on Next.

RealTimeSync-10

Give your tile a title and subtitle, and click Apply.

RealTimeSync-11

Bravo, you are done; your tile is added to your dashboard.

But hold on: no values will appear until you add logic to push data to the tile.

Now we move to the next step, where we create a program that calls the Power BI REST API and pushes streaming data to our dashboard. Check out my next post for that: [App for Power BI REST APIs for Streaming Data]

Till then, keep exploring and learning. I will return soon with my next post.

 

Save General Journals as Standard Journal in Navision

Do you know you can save a general journal as a standard journal and retrieve, edit, and post it later?

Many times we post entries of the same nature, like rent, salary, bills, etc. Just as an example, you can decide which entries you post frequently and club them together as a standard journal to save time.

You can create it once, retrieve it every month or whenever required, and post it later. It can serve as a template for future use.

Let us see how we can use this feature.

StdGen-1

Open your General Journal.

Make the entries you post often. Don't fill in the Posting Date and Document No. Complete all your entries.

Next we will save this as Standard Journal for future use.

From Actions, choose Save as Standard Journal.

StdGen-2

Give a code and name for your journal template.

If the amounts are fixed every time, you can enter them too; while saving, choose Save Amount.

Now let us review the entry that we saved.

From Actions, choose Get Standard Journal.

This displays all saved standard general journals.

StdGen-3

Select the required journal code and click Edit.

StdGen-4

This is what is saved in your standard journal.

Let us return to the Standard Journal list.

Select the desired (earlier saved) standard journal and click OK.

StdGen-5

A confirmation that the standard journal was retrieved and journals were created in the General Journal page.

StdGen-6

Your general journal gets populated with the values you saved to the standard journal.

Have you noticed? The Posting Date is set to your work date, and the next Document No. from your assigned number series is populated automatically.

StdGen-7

Fill in the amount and any other required fields, and post the journal.

Next time you wish to post similar entries, just retrieve it and continue.

That's all for today's post. What are you waiting for? Give it a try.

I will come up with more details in my upcoming posts; till then, keep exploring and learning.

 

 

Creating Views – to Save Filtered Lists in NAV

Do you know you can save a page as a view, with filters?

Many times we open a page and apply the same set of filters to fetch our data. You can save these for quick retrieval, applying the filters automatically on future access.

You save the page with its filters as a view.

Let us see small example how to achieve it.

First we will open a list page; in my case I have a customized Customer List.

FilteredView-1

Suppose this is the page I use for balance retrieval each month and on a daily basis.

I enter a filter of Balance > 5000 and a current-month filter whenever I wish to check balances for reporting or follow-up purposes.

For the date filter I use (-CM..CM), so that when I open this view next month it takes the appropriate filter for that month.

FilteredView-2

Click on the page name after applying all the required filters.

Select Save View As.

Give the view a name.

Select the location to save the view.

On confirmation, respond Yes.

FilteredView-3

Your view will be saved. You can create as many views as desired using different filters.

FilteredView-4

A small and good feature.

What are you waiting for? Give it a try.

I will come up with more information in my upcoming posts; till then, keep exploring and learning.

 

 

Plug & Play – Power BI and Jet Enterprise in NAV 2016

This is where Power BI and Jet Enterprise really shine – they have the ability to understand NAV because they have little bundles of interpretation packs that can interpret NAV for you!

Check out more via the link shared below:

http://www.catapulterp.com/plug-and-play-power-bi-and-jet-enterprise-in-nav-2016/

Exposing & Consuming the Web service from & inside Navision – Part-2

Continuing from my previous post; here is the link in case you missed it.

Exposing & Consuming the Web service from & inside Navision – Part-1

 

Step 1: Create an XMLport to export data in XML format and save it at some specified location.

Here we will create, just for demo purposes, a simple XMLport for exporting data from the Navision database in XML format.

WebServiceUsage-2

We are using a very simple structure for our XMLport.

Make sure you set the properties accordingly, as we are exporting the data in XML format.

WebServiceUsage-3

Save the XMLport at an available ID with a suitable name.

 

Step 2: Create a codeunit to call the XMLport and retrieve the data as XML in a text variable.

Here we will create a codeunit with a few helpful functions that call the XMLport to export the data and save it in a temp file.

It then retrieves the data from the file in XML format.

Finally, it returns the data retrieved above to the calling program/function.

To start, let us create some global variables.

WebServiceUsage-4

Next we will add one Function as below.

WebServiceUsage-5

Next we will add one more Function as below.

WebServiceUsage-6

Save the codeunit at an available ID with a suitable name.

 

Step 3: Expose/Publish the Codeunit as web service.

Open the Web Services page and add the codeunit created above. Use the relevant ID at which you saved your codeunit.

WebServiceUsage-7

 

Tick Published to expose the codeunit as a web service.

The rest is in my next post.

To be continued……

Exposing & Consuming the Web service from & inside Navision – Part-1

Introduction

Today, in reply to one of my readers, I am giving the steps for exporting data using an XMLport and transferring it outside Navision using web services.

Let's discuss the technology before I jump to the solution.

Here are a few full forms that may help while reading this post.

W3C – World Wide Web Consortium.

SOAP – Simple Object Access Protocol.

REST – Representational State Transfer.

WSDL – Web Service Description Language.

UDDI – Universal Description, Discovery & Integration.

AWS – Arbitrary Web Services.

XML – Extensible Markup Language.

HTTP – Hypertext Transfer Protocol.

 

What is Web Service?

The W3C defines a web service as: a software system designed to support interoperable machine-to-machine interaction over a network.

A web service can be defined as a service offered by one electronic device to another, the two communicating with each other over the web.

The W3C defines a web service as having an interface described in a machine-processable format, WSDL.

Other systems interact with the web service in a manner prescribed by its description, using SOAP messages, typically conveyed over HTTP with an XML serialization in conjunction with other web-related standards.

Later W3C extended the definition:-

REST-compliant web services, in which the primary purpose of the service is to manipulate representations of web resources using a uniform set of stateless operations.

Arbitrary web services, in which the services may expose an arbitrary set of operations.

WebServiceUsage-1-2

UDDI defines which software system should be contacted for which type of data.

Once the system to be contacted is discovered, contact is established using SOAP.

The service provider system validates the request against the WSDL file, processes the request, and sends the data using the SOAP protocol.

 

A web service is a method of communication that allows two software systems to exchange data over the internet.

  • The two systems may use different software, which could be written in different programming languages.
  • Almost all types of software can interpret XML tags.
  • Web services can be used to exchange data between two systems in the form of XML files.

WebServiceUsage-1-3

XML is used to tag the data.

SOAP is used to transfer the data.

WSDL is used to describe the services available.

UDDI lists what services are available.

Protocols designed for human-to-machine communication are reused for machine-to-machine communication, transferring machine-readable files in formats such as XML and JSON.

Web services use SOAP over the HTTP protocol, letting you use your existing low-cost internet connection to implement them.

Rules for communication between two different systems:

All the rules are defined in the WSDL file.

  • How a request is to be sent from one system to another.
  • What parameters need to be sent when requesting data.
  • What structure is to be used for the generated data (normally data is exchanged in XML format, validated against an .xsd file).
  • What errors are to be generated if the communication rules are broken.

 

Web service standards that use markup languages:

  • JSON-RPC
  • JSON-WSP
  • Web template
  • Web Service Description Language (WSDL)
  • XML Interface for Network Services (XINS)
  • Web Services Conversation Language (WSCL)
  • Web Services Flow Language (WSFL)
  • WS-Meta data Exchange
  • Representational State Transfer (REST)
  • Remote Procedure Call (RPC)
  • XML Remote Procedure Call (XML-RPC)

 

No more theory; let's come to the point.

There are other ways to achieve the same thing, but here I am sticking to the specific requirement I am answering.

The requirement: data in an XML-format file is to be transmitted using web services. We will use a Navision-exposed web service to export data using an XMLport, and then send the content by consuming another web service exposed in .NET, which will retrieve the XML content and update some other system.

The assumption is that these two endpoints can be two separate databases in the same domain, or two different companies of the same organization operating at two different locations, sharing information via web services.

For demo purposes I will use Navision to publish the service for exporting the data, and retrieve the data using the web service exposed in .NET.

Since I don't have in-depth knowledge of .NET code, I asked my colleague to prepare the web service for me, and I am simply using it to show how we can do it.

I will get this done through the 5 steps below.

WebServiceUsage-1

The next post in this series will contain:

Step 1: Create an XMLport to export data in XML format and save it at some specified location.

Step 2: Create a codeunit to call the XMLport and retrieve the data as XML in a text variable.

Step 3: Expose/publish the codeunit as a web service.

 

The post after that will contain:

Step 4: Create a DLL to use as an add-in to consume the web service inside Navision itself.

 

And the Final Post in this Series will contain:

Step 5: Create a codeunit to test retrieving the data using the above DLL and send the data in XML format using another web service exposed in .NET.

 

Please follow my upcoming posts for a walkthrough of the 5 steps above.

To Be Continued……….

 

How to Detect Duplicates

Until Dynamics NAV 2013, the only possibility was to iterate through the table in a loop and then, in a sub-loop, filter another instance of the same table to check for duplicates.

For example, to check for duplicate names in the Contact table, the code would look like this:

Duplicate.-1

From Dynamics NAV 2013 onwards, we can use queries to implement the same logic more efficiently.

The solution involves creating a query that returns duplicates, and then invoking it from a method that checks whether the query returns any rows.

Step 1 – Creating the Query

  • The query must be created with the table we want to search in as the dataitem.
  • The field we want to search for must be created as a grouped field.
  • Create a totaling field on the count, and add a filter for Count > 1. This ensures that only records with more than one instance of the field that we selected in the previous step are included in the query result.
  • Continuing with our Contact Name example, here is how the query would look:

Duplicate.-2

Step 2 – Invoking the Query to Check for Duplicates

Now that the query is created, all we need to do is to invoke the query and check if any records are returned, which would mean that there are duplicates.

Here is an alternate implementation of the FindDuplicateContact method using the query that we created:

Duplicate.-3
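For comparison, the grouped query above boils down to a simple GROUP BY/HAVING in T-SQL. A sketch, assuming a Contact table with a Name column; adjust the names to your schema:

-- Names that occur more than once in the Contact table.
SELECT [Name], COUNT(*) AS DuplicateCount
FROM dbo.Contact
GROUP BY [Name]
HAVING COUNT(*) > 1;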

 

Cost Components

Cost components are different types of costs that make up the value of an inventory increase or decrease.

The table below shows the different cost components and the subordinate cost components they consist of.

Cost component | Subordinate cost component | Description
--- | --- | ---
Direct cost | Unit cost (direct purchase price) | Cost that can be traced to a cost object.
Direct cost | Freight cost (item charge) |
Direct cost | Insurance cost (item charge) |
Indirect cost | | Cost that cannot be traced to a cost object.
Variance | Purchase variance | The difference between actual and standard costs, which is only posted for items using the Standard costing method.
Variance | Material variance |
Variance | Capacity variance |
Variance | Subcontracted variance |
Variance | Capacity overhead variance |
Variance | Manufacturing overhead variance |
Revaluation | | A depreciation or appreciation of the current inventory value.
Rounding | | Residuals caused by the way in which valuation of inventory decreases are calculated.

Freight and insurance costs are item charges that can be added to an item's cost at any time. When you run the Adjust Cost – Item Entries batch job, the values of any related inventory decreases are updated accordingly.

Accounts in the General Ledger

To reconcile inventory and capacity ledger entries with the general ledger, the related value entries are posted to different accounts in the general ledger.

From the Inventory Ledger

The below table shows the relationship between different types of inventory value entries and the accounts and balancing accounts in the general ledger.

(The first four columns describe the value entry; the last two are the general ledger accounts.)

Item ledger entry type | Value entry type | Variance type | Expected cost | Account | Balancing account
--- | --- | --- | --- | --- | ---
Purchase | Direct Cost | | Yes | Inventory (Interim) | Invt. Accrual Acc. (Interim)
Purchase | Direct Cost | | No | Inventory | Direct Cost Applied
Purchase | Indirect Cost | | No | Inventory | Overhead Applied
Purchase | Variance | Purchase | No | Inventory | Purchase Variance
Purchase | Revaluation | | No | Inventory | Inventory Adjmt.
Purchase | Rounding | | No | Inventory | Inventory Adjmt.
Sale | Direct Cost | | Yes | Inventory (Interim) | COGS (Interim)
Sale | Direct Cost | | No | Inventory | COGS
Sale | Revaluation | | No | Inventory | Inventory Adjmt.
Sale | Rounding | | No | Inventory | Inventory Adjmt.
Positive Adjmt., Negative Adjmt., Transfer | Direct Cost | | No | Inventory | Inventory Adjmt.
Positive Adjmt., Negative Adjmt., Transfer | Revaluation | | No | Inventory | Inventory Adjmt.
Positive Adjmt., Negative Adjmt., Transfer | Rounding | | No | Inventory | Inventory Adjmt.
Consumption (Production) | Direct Cost | | No | Inventory | WIP
Consumption (Production) | Revaluation | | No | Inventory | Inventory Adjmt.
Consumption (Production) | Rounding | | No | Inventory | Inventory Adjmt.
Consumption (Assembly) | Direct Cost | | No | Inventory | Inventory Adjmt.
Consumption (Assembly) | Direct Cost | | No | Direct Cost Applied | Inventory Adjmt.
Consumption (Assembly) | Indirect Cost | | No | Overhead Applied | Inventory Adjmt.
Output (Production) | Direct Cost | | Yes | Inventory (Interim) | WIP
Output (Production) | Direct Cost | | No | Inventory | WIP
Output (Production) | Indirect Cost | | No | Inventory | Overhead Applied
Output (Production) | Variance | Material | No | Inventory | Material Variance
Output (Production) | Variance | Capacity | No | Inventory | Capacity Variance
Output (Production) | Variance | Subcontracted | No | Inventory | Subcontracted Variance
Output (Production) | Variance | Capacity Overhead | No | Inventory | Cap. Overhead Variance
Output (Production) | Variance | Manufacturing Overhead | No | Inventory | Mfg. Overhead Variance
Output (Production) | Revaluation | | No | Inventory | Inventory Adjmt.
Output (Production) | Rounding | | No | Inventory | Inventory Adjmt.
Output (Assembly) | Direct Cost | | No | Inventory | Inventory Adjmt.
Output (Assembly) | Revaluation | | No | Inventory | Inventory Adjmt.
Output (Assembly) | Indirect Cost | | No | Inventory | Overhead Applied
Output (Assembly) | Variance | Material | No | Inventory | Material Variance
Output (Assembly) | Variance | Capacity | No | Inventory | Capacity Variance
Output (Assembly) | Variance | Capacity Overhead | No | Inventory | Cap. Overhead Variance
Output (Assembly) | Variance | Manufacturing Overhead | No | Inventory | Mfg. Overhead Variance
Output (Assembly) | Rounding | | No | Inventory | Inventory Adjmt.

From the Capacity Ledger

The below table shows the relationship between different types of capacity value entries and the accounts and balancing accounts in the general ledger. Capacity ledger entries represent labor time consumed in assembly or production work.

(The first three columns describe the value entry; the last two are the general ledger accounts.)

Work type | Capacity ledger entry type | Value entry type | Account | Balancing account
--- | --- | --- | --- | ---
Assembly | Resource | Direct Cost | Direct Cost Applied | Inventory Adjmt.
Assembly | Resource | Indirect Cost | Overhead Applied | Inventory Adjmt.
Production | Machine Center/Work Center | Direct Cost | WIP Account | Direct Cost Applied
Production | Machine Center/Work Center | Indirect Cost | WIP Account | Overhead Applied

Assembly Costs are Always Actual

As shown in the table above, assembly postings are not represented in interim accounts. This is because the concept of work in process (WIP) does not apply in assembly output posting, unlike in production output posting. Assembly costs are only posted as actual cost, never as expected cost.

Calculating the Amount to Post to the General Ledger

The following fields in the Value Entry table are used to calculate the amount that is posted to the general ledger:

  • Cost Amount (Actual)
  • Cost Posted to G/L
  • Cost Amount (Expected)
  • Expected Cost Posted to G/L

The following table shows how the amounts to post to the general ledger are calculated for the two different cost types.

Cost type | Calculation
--- | ---
Actual Cost | Cost Amount (Actual) – Cost Posted to G/L
Expected Cost | Cost Amount (Expected) – Expected Cost Posted to G/L

For example, if a value entry has Cost Amount (Actual) of 150 and Cost Posted to G/L of 100, then 50 remains to be posted to the general ledger.
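As a sketch of the same calculation against the SQL table behind the Value Entry page (the column names here assume NAV's usual convention of replacing '/' with '_' in SQL field names, and the company prefix is the one used elsewhere in this blog; adjust both to your database):

-- Amount still to be posted to G/L, per value entry.
SELECT [Entry No_],
       [Cost Amount (Actual)]   - [Cost Posted to G_L]          AS ActualAmountToPost,
       [Cost Amount (Expected)] - [Expected Cost Posted to G_L] AS ExpectedAmountToPost
FROM dbo.[CRONOUS International Ltd_$Value Entry];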

 

Few Helpful PowerShell Commands which you can use for Upgrade Process in Navision 2016 – Part 2

You can use Windows PowerShell scripts to upgrade to the latest version of Microsoft Dynamics NAV. Microsoft Dynamics NAV 2016 provides sample scripts that you can adapt for your deployment architecture.

Automating the Upgrade Process

When you upgrade to Microsoft Dynamics NAV 2016, you must first upgrade the application code, and then you upgrade the data.

In an earlier post I explained this using PowerShell commands; you can find it here: Helpful PowerShell Commands which you can use for Upgrade Process in Navision 2016

By using Windows PowerShell, you can automate both parts of the upgrade process. Also, you can use the same scripts to test each step in your upgrade process before you upgrade production databases.

You can combine this automated upgrade with a migration to multitenancy, which makes maintenance easier for you.

The Sample Scripts for Code Upgrade

Microsoft Dynamics NAV includes sample scripts that illustrate how you can use Windows PowerShell cmdlets to upgrade your application to the latest version of Microsoft Dynamics NAV.

The sample scripts are located in the ApplicationMergeUtilities folder under the WindowsPowerShellScripts folder on the Microsoft Dynamics NAV product media.

However, you can follow the steps explained in the post linked above to get it done.

The Sample Scripts for Data Upgrade

Microsoft Dynamics NAV includes sample scripts that illustrate how you can automate the upgrade of data to the latest version of Microsoft Dynamics NAV.

The sample scripts are located in the Upgrade folder under the WindowsPowerShellScripts folder on the Microsoft Dynamics NAV product media. You can run the sample script using a partner license or a customer license.
PowerShell-3

To learn about and follow the Microsoft-suggested steps, see this link: Automating the Upgrade Process using Windows PowerShell Scripts in Microsoft Dynamics NAV 2016

To run the sample script for the data upgrade of a Microsoft Dynamics NAV database, you must have a Microsoft Dynamics NAV 2013, Microsoft Dynamics NAV 2013 R2, or Microsoft Dynamics NAV 2015 database that is available on a SQL Server instance and is ready to be upgraded.

Here I present my version, derived from the steps above:

To continue, we will do some setup. Copy the Upgrade folder from the path above and save it as DataUpgradePSKit.

PowerShell-4

Create an OriginalScript folder and move the PS1 files at the root into it (for example, Set-PartnerSettings and Set-PowerShellEnvironment). We don't need them for this exercise, but keep a safe copy for reference.

Create a Backup folder; the script will use it to store a backup of the database before starting the upgrade process.

Create an Upgrade folder and place these files in it:

  • License File
  • New Merged Objects fob
  • Upgrade Toolkit / or your own prepared Upgrade Codeunits

PowerShell-5

  • Create a ProcessLogs folder, which will be used to record the logs of the shell script.

Here is the script we will use to perform our data upgrade process.

You can find this script here: http://1drv.ms/1NyolVV, or you can download it from my blog's menu using the Shared Files link.

 

# Added below parameter values globally for ease of maintenance

# You just do correction on values here (as per your environment) and will be in effect for rest of below script

# No need to scan and change every occurrence for same value in different steps of the script.

# Select this section and Execute first so that these Variables value are available for rest of the script.

Import-Module ‘C:\Program Files\Microsoft Dynamics NAV\90\Service\NavAdminTool.ps1’

$NAVUpgrade_NAVServerInstance = “UpgradedDBfrom2013R2”

$NAVUpgrade_NAVServerServiceAccount = “NT AUTHORITY\NETWORK SERVICE”

$NAVUpgrade_FinSqlExeFile = “C:\Program Files (x86)\Microsoft Dynamics NAV\90\RoleTailored Client\finsql.exe”

$NAVUpgrade_IDEModulePath = “”

$NAVUpgrade_DatabaseServer = “INDEL-AXT5283VM”

$NAVUpgrade_DatabaseInstance = “”

$NAVUpgrade_DatabaseName = “Demo Database NAV (7-1)”

$NAVUpgrade_DatabaseToUpgradeBakFile = “C:\UserData\DataUpgradePSKit\Backup\DynamicsNAV70_BeforeUpgrade.bak”

$NAVUpgrade_NewVersionObjectsFobFilePath = “C:\UserData\DataUpgradePSKit\Upgrade\NewObjects.fob”

$NAVUpgrade_UpgradeToolkitObjectsFobFilePath = “C:\UserData\DataUpgradePSKit\Upgrade\Upgrade710900.FOB”

$NAVUpgrade_UpgradeObjectsFilter = “Version List=UPGTK9.00.00”

$NAVUpgrade_UpgradeLogsDirectory = “C:\UserData\DataUpgradePSKit\Upgrade\ProcessLogs”

#$NAVUpgrade_RapidStartPackageFile = ‘C:\UserData\DataUpgradePSKit\Upgrade\PackageSTCODES.rapidstart’

$NAVUpgrade_CurrentVersionLicenseFile = “C:\UserData\DataUpgradePSKit\Upgrade\DevLicense.flf”

$NAVUpgrade_PreviousVersionLicenseFilePath = “C:\UserData\DataUpgradePSKit\Upgrade\DevLicense.flf”

 

# Upgrade Steps:

Import-Module (Join-Path (Get-Location) ‘Cmdlets\NAVUpgradeCmdlets.psm1’) -DisableNameChecking

#1. Prepares the Windows PowerShell session by importing the required modules.

        # Import the NAV IDE Module.

Import-NAVIdeModule -IDEModuleSuggestedPath $NAVUpgrade_IDEModulePath -FinSqlExeFile $NAVUpgrade_FinSqlExeFile

Import-NAVManagementModule

Import-SqlPsModule

 

#2. Saves the current license from the Microsoft Dynamics NAV 2013, Microsoft Dynamics NAV 2013 R2, or Microsoft Dynamics NAV 2015 database.

# Backup current license from the application part of the database (table ‘$ndo$dbproperty’) , if it exists

        Export-NAVLicenseFromApplicationDatabase `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseServer `

-DatabaseInstance $NAVUpgrade_DatabaseInstance `

-LicenseFilePath $NAVUpgrade_PreviousVersionLicenseFilePath

 

#3. Creates a backup of the Microsoft Dynamics NAV 2013, Microsoft Dynamics NAV 2013 R2, or Microsoft Dynamics NAV 2015 database, and then converts the database to Microsoft Dynamics NAV 2016.

        Backup-NAVSqlDatabase `

-DatabaseServer $NAVUpgrade_DatabaseServer `

-DatabaseInstance $NAVUpgrade_DatabaseInstance `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseBackupFilePath $NAVUpgrade_DatabaseToUpgradeBakFile

$NAVUpgrade_DatabaseSQLServerInstance = Get-SqlServerInstance -DatabaseServer $NAVUpgrade_DatabaseServer -DatabaseInstance $NAVUpgrade_DatabaseInstance

$NavServerInfo = New-Object PSObject

Add-Member -InputObject $NavServerInfo -MemberType NoteProperty -Name NavServerName -Value “$NAVUpgrade_DatabaseServer”

Add-Member -InputObject $NavServerInfo -MemberType NoteProperty -Name NavServerInstance -Value (Get-NAVServerConfigurationValue  -ServerInstance $NAVUpgrade_NAVServerInstance -ConfigKeyName “ServerInstance”)

Add-Member -InputObject $NavServerInfo -MemberType NoteProperty -Name NavServerManagementPort -Value (Get-NAVServerConfigurationValue -ServerInstance $NAVUpgrade_NAVServerInstance -ConfigKeyName “ManagementServicesPort”)

 

# Perform technical upgrade of the NAV database

        Invoke-NAVDatabaseConversion `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `

-LogPath $NAVUpgrade_UpgradeLogsDirectory\”Database Conversion”

 

#4. Connects the Microsoft Dynamics NAV 2016 Server instance to the converted database, imports the Microsoft Dynamics NAV 2016 license file, and then synchronizes the table schema.

 

# Connect the NAV Server to the NAV database

        Connect-NAVServerToNAVDatabase  `

-NAVServerInstance $NAVUpgrade_NAVServerInstance `

-NAVServerServiceAccount $NAVUpgrade_NAVServerServiceAccount `

-DatabaseServer $NAVUpgrade_DatabaseServer `

-DatabaseInstance $NAVUpgrade_DatabaseInstance `

-DatabaseName $NAVUpgrade_DatabaseName

# Import the new version license into the application database, and restart the server in order for the license to be loaded

        Import-NAVServerLicense -ServerInstance $NAVUpgrade_NAVServerInstance -LicenseFile $NAVUpgrade_CurrentVersionLicenseFile -Database NavDatabase

Set-NAVServerInstance -ServerInstance $NAVUpgrade_NAVServerInstance -Restart

# Synchronize the NAV database

        Sync-NAVTenant -ServerInstance $NAVUpgrade_NAVServerInstance -Mode Sync -Force

 

#5. Imports the application objects and upgrade toolkit objects from the specified .fob file, and then synchronizes the table schema again.

#   This updates the SQL Server database based on the new table schema that is defined by the imported application objects. Data that must be mapped to another table is saved in upgrade tables.

# Delete the tables from the previous version, using SynchronizeSchemaChanges Later.

# The new  objects we import will contain the new version of the tables.

        Delete-NAVApplicationObject `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `

-LogPath $NAVUpgrade_UpgradeLogsDirectory `

-Filter “Type=Table;ID=<2000000000” `

-SynchronizeSchemaChanges “No” `

-NavServerName $NavServerInfo.NavServerName `

-NavServerInstance $NAVServerInfo.NavServerInstance `

-NavServerManagementPort $NavServerInfo.NavServerManagementPort `

-Confirm:$false

# Import all the new objects and the upgrade objects, by delaying the schema synchronization

# If an $UpgradeToolkitObjects value has not been provided, then

#  the assumption is that the upgrade toolkit is within the same .FOB as the new objects

           if(!$UpgradeToolkitObjects)

{

# Import FOB file containing the new version of the application objects, including the upgrade toolkit

Import-NAVApplicationObject `

-Path $NAVUpgrade_NewVersionObjectsFobFilePath `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `

-LogPath $NAVUpgrade_UpgradeLogsDirectory `

-ImportAction “Overwrite” `

-SynchronizeSchemaChanges “No” `

-NavServerName $NavServerInfo.NavServerName `

-NavServerInstance $NAVServerInfo.NavServerInstance `

-NavServerManagementPort $NavServerInfo.NavServerManagementPort `

-Confirm:$false

}

else

{

 

# Import FOB file containing the new version of the application objects

Import-NAVApplicationObject `

-Path $NAVUpgrade_NewVersionObjectsFobFilePath `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `

-LogPath $NAVUpgrade_UpgradeLogsDirectory `

-ImportAction “Overwrite” `

-SynchronizeSchemaChanges “No” `

-Confirm:$false

 

# Import FOB file containing the upgrade codeunit and upgrade tables

Import-NAVApplicationObject `

-Path $NAVUpgrade_UpgradeToolkitObjectsFobFilePath `

-DatabaseName $NAVUpgrade_DatabaseName `

-DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `

-LogPath $NAVUpgrade_UpgradeLogsDirectory `

-ImportAction “Overwrite” `

-SynchronizeSchemaChanges “No” `

-Confirm:$false

}

# Synchronize the metadata changes to SQL

        Sync-NAVTenant -ServerInstance $NAVUpgrade_NAVServerInstance -Mode Sync -Force

#6. Calls the Start-NAVDataUpgrade cmdlet to verify the data upgrade preconditions and transfer data from the upgrade tables to the destination tables.

# Invoke the Data Upgrade process

        Invoke-NAVDataUpgrade -ServerInstance $NAVUpgrade_NAVServerInstance

 

#7. Deletes all obsolete tables and the upgrade toolkit objects.

# Delete the upgrade toolkit objects
Delete-NAVApplicationObject `
    -DatabaseName $NAVUpgrade_DatabaseName `
    -DatabaseServer $NAVUpgrade_DatabaseSQLServerInstance `
    -LogPath $NAVUpgrade_UpgradeLogsDirectory `
    -Filter "$NAVUpgrade_UpgradeObjectsFilter;ID=<2000000000" `
    -SynchronizeSchemaChanges "Force" `
    -NavServerName $NavServerInfo.NavServerName `
    -NavServerInstance $NavServerInfo.NavServerInstance `
    -NavServerManagementPort $NavServerInfo.NavServerManagementPort `
    -Confirm:$false

 

#8. Initializes all companies in the upgraded database. If you specified a RapidStart package in the Set-PartnerSettings.ps1 file, the package is applied to all companies.

# Optionally, run the RapidStart package import
if ($NAVUpgrade_RapidStartPackageFile)
{
    Invoke-NAVRapidStartDataImport -ServerInstance $NAVUpgrade_NAVServerInstance -RapidStartPackageFile $NAVUpgrade_RapidStartPackageFile
}

 

The sample script is intended to be run in the context of a Microsoft Dynamics NAV 2016 deployment, including the Microsoft Dynamics NAV Server instance.

The Microsoft Dynamics NAV Server instance cannot be multitenant. When the sample script runs successfully, the result is a Microsoft Dynamics NAV 2016 database that is connected to a Microsoft Dynamics NAV 2016 Server instance, and which uses a Microsoft Dynamics NAV 2016 license.

You may face some permission-related issues; take the help of your IT person if you are not sure about the nature of the issue, or use an Administrator login.

I will come up with more details in my next posts.

 

 

Upgrading the Data in Navision 2016

Continuing from my earlier post Upgrading the Application Code in Microsoft Dynamics NAV 2016

At the end of the previous post we had imported and compiled the objects, and resolved any conflicts and compilation errors. In the same process we obtained the list of objects with destructive table schema changes, that is, objects whose changes could cause us to lose data. Scan those objects; if we wish to preserve that data, we will need Data Upgrade codeunits to handle the situation.

For how to create them, you can refer to my earlier post Data Upgrade – in Navision 2015; it is still valid for 2016.

Now we have all the upgraded application objects, and the Data Upgrade codeunits if any are required for the upgrade. So we are good to go with upgrading the data of the old database.

We will follow the steps below to continue. Make sure you have followed the process for preparing/converting the database as discussed in my earlier post Upgrade in Microsoft Dynamics NAV 2016.

Import the application objects to the converted database

In the development environment, import into the Microsoft Dynamics NAV 2016 database all the application objects that you extracted in the previous step, as discussed in my earlier post above. This includes the FOB file that contains all the Microsoft Dynamics NAV 2016 objects from the application code upgrade, plus the upgrade toolkit objects, if any.

When you import the FOB file, if you experience metadata conflicts, use the Import Worksheet to handle these conflicts.

Finally, on the dialog box for selecting the schema synchronization, set the Synchronize Schema option to Later.
Upgrade2016-29

If the upgrade toolkit objects are stored in a separate FOB file, then import the upgrade toolkit FOB file after the application objects are imported.
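
If you prefer to script this import rather than click through the Import Worksheet, the development environment also exposes an ImportObjects batch command for finsql.exe. A minimal sketch, run from a command prompt, where the server, database, and file paths are hypothetical placeholders for your own (adjust the finsql.exe path to your installation; synchronizeschemachanges=no corresponds to the Later option in the dialog box):

"C:\Program Files (x86)\Microsoft Dynamics NAV\90\RoleTailored Client\finsql.exe" command=importobjects, servername=.\NAVDEMO, database=Nav2016UpgradedDb, file=C:\Upgrade\NewObjects.fob, importaction=overwrite, synchronizeschemachanges=no, logfile=C:\Upgrade\Logs\import.log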

 

Run the schema synchronization to synchronize the new tables

To publish the data schema changes of the newly imported tables to the SQL tables, run the Sync. Schema For All Tables – With Validation option from the development environment.
Upgrade2016-30

If you are confident that you can accept the loss of data, you can choose the Force option, which will drop the data for deleted fields; otherwise use the Upgrade codeunit prepared for this purpose, as discussed above (see the sketch after the note below).

Alternatively, run the Sync-NAVTenant cmdlet from the Microsoft Dynamics NAV 2016 Administration Shell:

Sync-NAVTenant -ServerInstance UpgradedDBfrom2013R2 (my server instance name)

Upgrade2016-31

Note that this command runs in the Administration Shell, not in the Development Shell that we used in our previous post.
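
For reference, a sketch of the Force variant mentioned above, using my instance name; -Mode ForceSync drops the data for any deleted fields, so run it only once you have accepted that loss, and -Mode CheckOnly lets you preview the outcome first:

Sync-NAVTenant -ServerInstance UpgradedDBfrom2013R2 -Mode CheckOnly

Sync-NAVTenant -ServerInstance UpgradedDBfrom2013R2 -Mode ForceSync -Force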

Run the data upgrade process

A data upgrade runs the upgrade toolkit objects, such as upgrade codeunits and upgrade tables, to migrate business data from the old table structure to the new table structure. You can start the data upgrade from the Microsoft Dynamics NAV Development Environment.
Upgrade2016-32

Upgrade2016-33

Upgrade2016-34

Upgrade2016-35

Ooops………….

Alternatively, run it from the Microsoft Dynamics NAV 2016 Administration Shell.

In the last phase of the data upgrade, all companies are initialized by running codeunit 2 Company Initialization. This is done automatically.

If you want to skip company initialization, use the Start-NAVDataUpgrade cmdlet and set the SkipCompanyInitialization parameter, as in the sketch below.
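
A minimal sketch, using my instance name (the switch name is as exposed by the NAV 2016 management cmdlets):

Start-NAVDataUpgrade -ServerInstance UpgradedDBfrom2013R2 -SkipCompanyInitialization -Force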

Syntax:

Start-NAVDataUpgrade [-ServerInstance] <String> [[-Tenant] <TenantId>] [[-FunctionExecutionMode] <FunctionExecutionModeValue>] [[-ContinueOnError]] [-Force] [-Confirm] [-WhatIf] [<CommonParameters>]

Parameters
-ContinueOnError

Specifies whether the Microsoft Dynamics NAV Server instance continues to execute other upgrade functions when an error occurs while executing an upgrade function.

If you do not set this parameter, then when an error occurs, the Microsoft Dynamics NAV Server instance will suspend the data upgrade process. It will cancel the execution of upgrade functions currently in progress and roll back any changes that were applied. Completed functions will not be rolled back.

The process remains in suspended state until you take one of the following actions:

– Fix the problems in the upgrade functions that failed, and then resume the process by using the Resume-NAVDataUpgrade cmdlet. You should not add new upgrade functions at this time because they will be ignored when you resume the process.

– Stop the data upgrade process by using the Stop-NAVDataUpgrade cmdlet. Stopping the process will not roll back changes made by upgrade functions that have already been executed.

If you set this parameter, then when an error occurs, the Microsoft Dynamics NAV Server instance will continue executing other upgrade functions. At the end of the process, you can use the Get-NAVDataUpgrade cmdlet to see the list of failed upgrade functions. Changes that were applied by completed functions will not be rolled back.

When upgrading a large database, you should increase the SQL Command Timeout setting for the Microsoft Dynamics NAV Server instance that connects to the database, to avoid timeouts during schema synchronization. The default setting is 30 minutes.
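
A sketch of raising that setting from the Administration Shell, assuming a two-hour timeout suits your database size (restart the instance so the change takes effect):

Set-NAVServerConfiguration -ServerInstance UpgradedDBfrom2013R2 -KeyName SqlCommandTimeout -KeyValue "02:00:00"

Set-NAVServerInstance -ServerInstance UpgradedDBfrom2013R2 -Restart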

-ServerInstance<String>

Specifies the Microsoft Dynamics NAV Server instance that the application database and the tenant database are mounted against, such as DynamicsNAV90.

You must include the name within single quotation marks.

-Tenant<TenantId>

Specifies the ID of the tenant that you want to synchronize with the application, such as Tenant1.

This parameter is required if the specified server instance is configured to run multiple tenants.

-Force

Forces the command to run without asking for user confirmation.

-FunctionExecutionMode<FunctionExecutionModeValue>

Specifies whether the Microsoft Dynamics NAV Server instance executes upgrade functions in series or parallel.

-Confirm

Prompts you for confirmation before running the cmdlet.

-WhatIf

Shows what would happen if the cmdlet runs. The cmdlet is not run.

A few examples of usage:

[1] PS C:\> Start-NAVDataUpgrade -ServerInstance DynamicsNAV90 -Force

[2] PS C:\> Start-NAVDataUpgrade -ServerInstance DynamicsNAV90 -ContinueOnError -Force

[3] PS C:\> Start-NAVDataUpgrade -ServerInstance DynamicsNAV90 -FunctionExecutionMode Serial -Force
Upgrade2016-36

Start-NAVDataUpgrade -ServerInstance UpgradedDBfrom2013R2 -Force

Ooops………….

Now what to do?
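
First, you can inspect the suspended process from the Administration Shell and, once the failing functions are fixed, resume or stop it. A hedged sketch, using my instance name:

# Show only the upgrade functions that failed
Get-NAVDataUpgrade -ServerInstance UpgradedDBfrom2013R2 -ErrorOnly

# After fixing the failing functions, resume the suspended process
Resume-NAVDataUpgrade -ServerInstance UpgradedDBfrom2013R2

# Or abandon the run (changes from completed functions are not rolled back)
Stop-NAVDataUpgrade -ServerInstance UpgradedDBfrom2013R2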

To learn how to Create Data Upgrade Codeunits you can see my earlier posts:

Data Upgrade Codeunit in Navision 2015 – Part -1

Data Upgrade Codeunit in Navision 2015 – Part -2

These posts are still valid for 2016; you can follow them to get this task done and resolve the above error.

To resolve the above issue I followed the instructions suggested in the error message, but in a real scenario you will definitely have such codeunits. Since I took the standard 2013 R2 database, I ran into this issue. I created one Upgrade codeunit with an empty upgrade function, as below:
Upgrade2016-37

Let us run the above process again.
Upgrade2016-38

This time I was able to complete the process successfully.

Delete the upgrade objects

At this point, you have upgraded the database to Microsoft Dynamics NAV 2016. Now you can delete the upgrade codeunits and upgrade table objects that you imported in the step above.

When you delete tables, on the Delete dialog box, set the Synchronize Schema option to Force.
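
This step can also be scripted from the Development Shell, mirroring the sample script earlier in this post. A sketch where the database name, server, and Version List tag UPGTK are all hypothetical; substitute whatever identifies your upgrade toolkit objects:

# Hypothetical filter: assumes the upgrade objects carry UPGTK in their Version List
Delete-NAVApplicationObject -DatabaseName Nav2016UpgradedDb -DatabaseServer .\NAVDEMO `
    -Filter "Version List=*UPGTK*" -SynchronizeSchemaChanges "Force" -Confirm:$false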

Import upgraded permission sets and permissions by using the Roles and Permissions XMLports

You import the permission sets and permissions XML files according to the following procedure.

To import the permission sets and permissions

  • Delete all permission sets in the database except the SUPER permission set.

In Object Designer, run page 9802 Permission Sets, and then delete the permission sets.

  • Run XMLport 9171 Import/Export Permission Sets to import the permission sets XML file.

In the request page for the XMLport, in the Direction field, choose Import, choose the OK button, and then specify the permission sets XML file.

  • Run XMLport 9172 Import/Export Permissions to import the permission XML file.

In the request page for the XMLport, in the Direction field, choose Import, choose the OK button, and then specify the permissions XML file.

Set the language of the customer database

In the development environment, choose Tools, choose Language, and then select the language of the original customer database.

Add new control add-ins

The database is now fully upgraded and is ready for use. However, you may want to add the new client control add-ins that are included in Microsoft Dynamics NAV 2016. These are not added by the upgrade process. The following client control add-ins are available from the Microsoft Dynamics NAV product media:

  • Microsoft.Dynamics.Nav.Client.BusinessChart
  • Microsoft.Dynamics.Nav.Client.PageReady
  • Microsoft.Dynamics.Nav.Client.PingPong
  • Microsoft.Dynamics.Nav.Client.VideoPlayer
  • Microsoft.Dynamics.Nav.Client.SocialListening

You can add control add-ins in the Control Add-ins window in the Microsoft Dynamics NAV Windows client.

I will come up with more details on this topic in my upcoming posts.

Important

Most of the content you find in this blog is either inherited from MSDN or from the Navision Developer and IT Pro Help. In some places, images are also taken directly from these sites. The purpose is simply to try those things out and reproduce them, adding a few points as per my understanding, to make them easier for others to understand and to serve as a quick reference.

Nothing here is under my own brand or authorship of the content. At any point of time we are just promoting Microsoft stuff, nothing personal.

I hope the material used here does not violate any copyright agreement. In case it does, by mistake or unintentionally, and Microsoft feels it should not be used, Microsoft has full right to inform me of the same, and I will be glad to take down any such content that may violate the norms.

The purpose is to promote Navision and share with the community.
