Channel: Waldo's Blog

How I upgraded to Microsoft Dynamics NAV 2017


We recently successfully upgraded our product to NAV 2017. In fact, one day after the actual availability of the Belgian download, we had our upgraded product ready.

As promised, here is a short post about the challenges we faced during the upgrade. Probably not all challenges are represented here .. and I might have had issues that you won’t have :-). Keep that in mind.

The Scope

The first step for me is always an “upgrade” in the pure sense of the word. No redesign, no refactoring, no new features. Just the customized code transferred to the new version of NAV. That’s what this post is about. The next step (and release) for us is going to be incorporating the new features that Microsoft brings us.

Just to give you a feeling: this product of ours is quite big:

  • 616 changed default objects – many tables
  • 2845 new objects – which doesn’t really influence the merge process, though these objects might call out to default (redesigned) code or objects ..
  • +900 hooks into code – changed business logic, change of default behaviour of NAV

I was able to do this upgrade in less than 12 hours .. which is 4 times the time I spent last year. This is purely because of the significant refactoring that Microsoft has been doing. In numbers, this is what they did:

  • 437 new objects
  • 96 deleted objects
  • 3286 changed objects

Do your own math ;-). The 96 deleted objects alone indicate a serious refactoring effort. And refactoring usually means: upgrade challenges.

How I did my upgrade

I upgraded our product in these steps:

  1. I created a new VM with the final release of Microsoft Dynamics NAV 2017.
  2. I used my PowerShell scripts, which you can find on my GitHub – namely, the ones in the scripts folder under “NAV Upgrades“. The thing is, the script “UpgradeDatabase.ps1” is going to do a full upgrade. I didn’t want that, so I stripped the script a bit, as I only wanted to merge, because I expected quite some conflicts, just because Microsoft refactored quite a bit.. . So basically, I only executed the ‘Merge-NAVUpgradeObjects’ part. That script gave me a folder with conflicts and merged text files.. .
  3. And indeed: quite some conflicts. This was expected. We will go into the issues later on, but what I did to solve the conflicts is two things:
    1. I solved the majority just in the txt files with VSCode. Editing code in text files with VSCode makes so much sense – read below why.
    2. I created a database with all my merged objects to solve the more complex conflicts, like the completely refactored codeunit 80 and 90. It gives you the “go to definition” stuff and such .. .
  4. Compile code, fix errors and create a fob: Create a full txt file and import it in an environment where you can compile all, and create a fob. I had a few compile errors, but these were easy to solve.
  5. Perform the database upgrade, including data
    1. This one involves the upgrade toolkit. While testing this, it was clear that the default toolkit needed some work, and I also had to include some of my own tables that became redundant thanks to the code redesign.
    2. Thanks to the “Start-NAVUpgradeDatabase” function as part of my scripts, this was quite easy. One tip: always fix the first error, then re-do the entire conversion, because the code in the upgrade toolkit isn’t ready for starting the data conversion multiple times .. .
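
For context: under the hood, a merge like the one in step 2 comes down to Microsoft’s Merge-NAVApplicationObject cmdlet from the NAV model tools. A minimal sketch (the paths are just examples, not my actual folder layout):

```powershell
# Sketch only: the three-way merge behind an upgrade.
# All paths below are example values.
Merge-NAVApplicationObject `
    -OriginalPath 'C:\Upgrade\ORIGINAL\*.txt' `  # default NAV 2016 objects
    -ModifiedPath 'C:\Upgrade\MODIFIED\*.txt' `  # our customized NAV 2016 objects
    -TargetPath   'C:\Upgrade\TARGET\*.txt'   `  # default NAV 2017 objects
    -ResultPath   'C:\Upgrade\RESULT'            # merged files; conflicts end up in .CONFLICT files
```

The result folder is exactly the “folder with conflicts and merged text files” I mentioned above.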

After this final step, we had an upgraded database, including all upgrade code and the upgrade toolkit for both the default application and my product.

Again, this doesn’t involve any significant code re-design to fit the new features. That’s what we will be doing from now on. :-)

So, let’s dive into the things that are worth mentioning (in my book (no, I’m not writing a book …)).

Refactored objects

While solving the conflicts, you run into objects that have been refactored quite a bit by Microsoft. I think I know why they are doing this. Part of the reason must be facilitating more events. This needs to be done in refactored code .. . So let’s be clear: I totally support the fact that Microsoft is redesigning the code .. It’s a step we must take, although it doesn’t make the upgrade-life much easier.. .

These are a few points that I struggled with.. :

COD414 and COD415 Release Sales/Purchase Document

The OnRun code moved to a “Code” trigger. For the rest it’s quite the same – so merging is quite fine .. . Though, I would have expected a bigger change. While you’re already changing the main trigger, which will cause a conflict anyway – why not refactor the whole thing? This is not really refactoring; this is moving code and causing a conflict without any benefit…

COD448 – Job Queue Dispatcher

The Job Queue has totally changed. This is not a surprise, as it is running on a completely different platform, with a completely different concept, having multiple threads (workers?) waiting for the scheduler to drop something on its queue.. . I decided to ignore all my changes to the job queue, and start fresh. This will be something to look at after the upgrade.

COD5982 – Service-Post+Print

Microsoft removed the function “PrintReport” and put it on record-level, which means in this case: 4 places. I had one hook in this function .. and now it had to be placed in 4 places.. . This is the only place where “hooks” didn’t really serve me well, as I don’t want to place one hook function in four places.. .

The redesign does make sense, though, so I’m not complaining. I created new hooks, and off we go :-).

COD82 – Sales-Post + Print

Similar changes (I like the consistency). Again, the PrintReport was moved to rec-level, which didn’t make my hook-moving go smoothly.. but I survived :-).

COD80 – Sales-Post

Together with COD90, this must be the most refactored codeunit. The code seems to still be all there in CU80, only in (local) functions. So this was really just a matter of moving my hooks to the right local function. You could see that the code structure was still quite the same, so it wasn’t that difficult to find out where each hook had to go.. .

A few other things caught my eye though, and they weren’t always that easy ..

  • Changed text constants: all text constants have a decent name now. Instead of something like Text30, the name actually describes the text.. . This does mean a lot more code changes, and some more conflicts.. .
  • Changed Var names: a little more difficult. In some places, I noticed changes like “GenJnlLine2” renamed to “GenJnlLine” or “SalesLine” changed to “SalesOrderLine”. It is refactoring, it is making it more readable .. but in some cases, you really have to keep your head in the game to solve conflicts .. .
  • From Temp to non-temp (and vice versa). This was the hardest thing to do. And I still have to test everything to see if I merged correctly. The thing is: I have the feeling they completely changed the “Preview Posting” thing by doing stuff in Temp Tables and such. This meant that all of a sudden, I was getting Temp variables in my hooks .. . So I really had to pay attention to what to do in those scenarios.. .

I handled CU80 and CU90 manually in the database, and applied all my changes by hand, without automatic merge. It does make sense, because I could navigate through the code to find out what to do in all 24 points where I hooked into this codeunit. Tests will still have to show whether I did a good job or not ;-).

TAB37 – Sales Line

The VAT calculation has moved to the “UpdateLines” function of the “VAT Amount Line” table. Moving this function means moving your hook to another hook codeunit .. . It does make more sense, but in my book, this is one of the more challenging things to do ;-).

ApprovalsMgmt turned into a Variant façade

So, code like:

ApprovalsMgmt.DeleteApprovalEntry(DATABASE::"Sales Header","Document Type","No.");

Needs to be changed to:

ApprovalsMgmt.DeleteApprovalEntry(SalesHeader);

We used this kind of code quite a lot in our product.. . Some work, but the refactoring makes so much sense. Love it!

Item Category

The default fields moved to the “Item Template” .. . This is quite a big change, because we added 6 extra default fields .. and created some functionality on top of this. These default fields being moved to the item template meant we needed to rethink our functionality as well .. . This is for after the upgrade, as in my opinion, “rethinking functionality” should not be part of the upgrade.

Pages

Pages are not easy. This is a refactoring-bit that I don’t like. Microsoft tends to put stuff on pages. Like:

  • ShowMandatory – added in NAV2015, as a property on a field. :(
  • Tooltips – Added in NAV2017: again a page-field-level property to add multi language tooltips :(
  • ApplicationArea – Added in NAV2017: again a page-field-level property to assign fields to certain application areas (to be able to simplify (and hide fields on) pages for users).

I absolutely hate this design principle from the bottom of my heart. It’s unreadable, unmaintainable, unextendable, .. .

So if you have changed or deleted any of these fields on pages, you will have conflicts. You get things like this:

In my case: LOADS of them.. . Thing is: you also get this on moved fields :(. I had conflicts on almost all fields that we changed – just because of the design of these new features (page-field-level…).

This was just one example. Other very recurring conflicts were:

  • Code Modifications
  • Changed properties, like
    • Promoted
    • PromotedCategory
    • PromotedActionCategoryML
    • CaptionML (character problems)

Furthermore, a lot of pages don’t have a closing semicolon in code .. which makes it difficult to compile after you (automatically) merge code after it. Here is an example from Page 30:

A few numbers:

  • I had 13 conflicts in the Customer Card,
  • 57 out of 88 conflicting objects were pages!
  • There was an average of 5 conflicts per page .. .
  • Worst ones:
    • Sales Order / Quote (and any document pages for that matter)
    • Customer Card / List
    • Item Card / List

I was able to handle them all in just text files.. . But no advice here – you’ll just have to work your way through them :(.

Deleted objects

In my case (BE upgrade from 2016 CU9 to 2017), a total of 96 objects were deleted. The chance that you have references, table relations or whatever to any of these objects is quite big. This is something that you’ll need to handle after the upgrade, while compiling your code. It very much depends on what and how you’re using these objects.

Why are there so many deleted objects? Well .. so much has been refactored .. like:

  • reworked notifications
  • reworked CRM wizards
  • reworked job queue
  • …

In my case, there was only one conflict here. Namely, the “Credit limit check” has seriously changed. It works with Notifications now. We tapped into it, so we needed to revise the complete solution (which ended up in us removing it :-))

Irregularities

What really surprised me were some irregularities in Microsoft’s code. Sometimes, this was really confusing. One example that comes to mind is the variable “RowIsNotText” in page 46. This variable is used to handle the “Enabled” and “Editable” properties of some fields. You would expect this variable to exist in all similar “subforms”. Well – no. In fact, all other subforms have the similar but completely opposite variable “RowIsText”. Be careful with this!

Upgrade Toolkit

Once I had compiled objects (all merged, imported and compilation errors solved), I started to work on my data upgrade. Of course I used the upgrade toolkit that comes with the product.

I had two issues with this.

The upgrade toolkit contains an error in the function “UpdateItemCategoryPostingGroups”. It seems that a temp table in combination with AutoIncrement doesn’t go well. When you have multiple Item Categories, this function failed from the second entry onwards .. . I did a dirty fix on this: just making sure the table is empty, so that the OnInsert code is able to run.

Another issue was that Microsoft didn’t do an “INSERT(TRUE)” in “UpdateSMTPMailSetupEncryptPassword”. The OnInsert needs to run, but Microsoft had put just “INSERT”, not “INSERT(TRUE)”, which seemed like a bug (no GUID was created). We changed it to INSERT(TRUE).

Solving conflicts with VSCode

Just for the sake of getting myself familiar with VSCode, I decided to handle the conflicts in this editor. Sure, it’s not code, it’s just TXT files .. but why not :-). It is a code editor .. and text file or not – I’m editing code.

Here is how it looked:

A few things that I really liked about this approach.

  • GitHub integration: I saved everything to GitHub. This made it possible for me to see what I did to solve a conflict. Always good to be able to review everything.
  • File explorer / navigate between files. All files were just there. Very easy.
  • Open window on the side (CTRL + Enter). As you can see on the screenshot, I could show two files at once. Because of the “big” conflict files, this was quite convenient. Also, finding the right spot is just a copy/paste-in-search kind of thing, without too much switching of screens. Very easy.
  • Easy search (select + CTRL+F) and find previous (shift+Enter). As said, it behaves like a code editor. This was one of the places this showed really well. Like when you’re looking for conflicting control ids. Just select the id, press CTRL+F, automatically it will search for what you have selected, and all matches get highlighted.

  • Smart carriage return: the cursor ends up where you want it to be – like you expect from a code editor, but not from a text editor.

I will do this every time from now on :-).

Conclusion

I was happy to be able to do the upgrade in only 12 hours. Taking into account that so much was refactored, plus that these were the most conflicts I had to solve in over 3 years .. it was still “just” 12 hours. Still, it’s 4 times the time that I spent on my previous upgrade .. . I hope this doesn’t mean that someone who spent 1 month last time will spend 4 months ;-).

Microsoft has given us the tools to do this as efficiently as possible. Yes, I’m talking about PowerShell. My scripts on top of these building blocks have helped me tremendously again .. I can only recommend them to you.

“Hooks” helped me a lot again this time as well. Putting them into the refactored code was quite easy. I spent more time on some pages than I did on Codeunit 80.

So .. good luck, hope this helps you a bit.


NAVTechDays 2016


This weekend, while preparing for NAVTechDays .. I realized I hadn’t even blogged about it. And I’m ashamed of it. THE best conference for Microsoft Dynamics NAV developers in the world. And I forgot to blog about it.

Let’s make this right

NAVTechDays 2016 is just around the corner!

Next week, from 15-18th of November, you will be able to enjoy the great conference “NAVTechDays 2016”.

And if you take your job as a Microsoft Dynamics NAV developer or system administrator a little bit seriously – you have already registered. If not (already registered..) – you can still register here: https://www.navtechdays.com//2016/register

2 Pre Conference Days

This proves that Luc (Mr. MiBuSo) listens to you .. . In the evaluations of last year’s edition, the attendees clearly indicated that 2 pre-conference days would be very much appreciated. And so there are lots of workshops for you .. and of course, lots of them are already sold out.

15th November

  • PowerShell Black Belt (Sold out)
  • Deep Dive Eventing & Extension (Sold Out)
  • JavaScript for C/AL Developers (Sold Out)
  • Troubleshooting Essentials for SQL Server and Dynamics NAV (Sold Out)
  • NAV Application Architecture and Design Patterns (Sold Out)
  • Power-BI – Introduction (Sold Out)
  • NAV ALM using Team Foundation Server (Sold Out)
  • Extending the Data Exchange Framework (Seats available)
  • RDLC Advanced Reporting in NAV (Seats available)

16th November

  • RDLC Advanced Reporting in NAV (Sold Out)
  • PowerShell Black Belt (Sold out)
  • Deep Dive Eventing & Extension (Sold Out)
  • C# for C/AL Developers (Sold Out)
  • Troubleshooting Essentials for SQL Server and Dynamics NAV (Sold Out)
  • NAV Application Architecture and Design Patterns (Sold Out)
  • Power-BI – Advanced (Sold Out)
  • NAV ALM using Team Foundation Server (Sold Out)
  • Extending the Data Exchange Framework (Sold Out)
  • An introduction to Scrum (Seats available)

So you see – there is still room for you to take advantage of this great opportunity!

More than 300 people are already registered for these workshops. That’s amazing :-).

Conference Days

On the 17th, the real conference begins. You know, as every year, this conference is just very well organized.

  • lots of interesting content (19 people from Microsoft, and lots and lots of SMEs speaking)
  • 90 minute deep dive sessions in very comfortable seats (don’t worry – the content makes sure you won’t fall asleep :-))
  • humongous screens (it’s a movie theatre.. )
  • Belgian beer (…)
  • Good food
  • Expo with DEV-oriented products
  • .. .

This is the agenda for the sessions:

  • Migrating to Events
  • Building cool experiences with O365, Outlook and more
  • Building extensions for NAV
  • Office 365 for NAV Techies
  • Bad habits of NAV Developers
  • Building the “Dynamics 365 for Financials” service
  • Real life Source Code Management
  • JavaScript Architecture: Turning Pain into Gain
  • Best Practices in developing Microsoft Dynamics NAV 2017 Extensions
  • Get some Smartness into Dynamics NAV / 365 with Notifications and Cortana Intelligence
  • The Power of Power BI and Dynamics NAV
  • Design Patterns in NAV 2017

Lots of interesting sessions .. If you want to know more about them, please go here: https://www.navtechdays.com/2016/sessions.

My NAVTechDays

This is going to be the busiest NAVTechDays for me .. doing 2 days of workshops and 2 sessions (“Bad Habits” and “Best Practices for Extensions”). Wish me luck .. and above all .. join! :-)

Conclusion

Small post for a big event with only one conclusion: if you decided to not go .. you made the wrong decision.

C U there!

The most popular “Hello World” sample ever!


The new development environment that Microsoft is working on for Microsoft Dynamics NAV Extensions is a hot topic for us NAV developers. If you haven’t heard about it: it has been discussed at various conferences during the last “conference season”, and Arend-Jan has blogged about it extensively.

Very recently, Microsoft released a sneak preview of what the new development language (“AL” – although even the future name is not decided yet..) is going to look like. They did that by releasing a piece of code – a “Hello World” extension, what else – on GitHub, which can be found here: https://github.com/Microsoft/AL . From the moment it was published, it was all over social media. Lots of tweets, Facebook messages, blogs, … a lot of excitement … for a hello world example in a development language that doesn’t even exist yet. Go figure. :-)

Some stats

When you think of it .. this particular “hello world” code sample must end up in the Guinness Book of Records! It must be! I just can’t imagine a Hello World sample, in a non-existing language, on GitHub, with more than:

  • 20 forks – people that are actively contributing to the sample
  • 56 commits – updates from various people
  • 25 pull requests

Just have a look at the “Network” (and don’t forget to scroll left and right :-)):

Part 1

Part 2

How insane is that .. for a bloody “Hello World” sample, and I repeat: in a development environment that doesn’t even exist yet (the contributors weren’t even able to compile the code!)?

How come?

Well, it’s not all that surprising. The current language is up for renewal – it has been for a looooong time. And the current path Microsoft is taking (Dynamics 365, AppSource, Extensions, Apps, Solutions Apps, …) is “forcing” them to take this development environment very seriously. And from what we can see – it’s looking good and very promising – though, there’s a LOT of work ahead (as you can imagine)!

It’s a good sign that the development world of Microsoft Dynamics NAV already takes it seriously, and is clearly looking forward to having it. As mentioned in other blogs, there is going to be a preview at Christmas – and I know for a fact that a lot of us can’t wait that long .. which might be the reason why there are already people contributing to a simple sample like this. Me personally, I couldn’t help myself .. and wanted to do my piece :-).

Is this really the most popular “Hello World” on github ever?

Well, no. Not even by far. That was just meant to be a joke. Here is a simple search on GitHub that shows you there are many others: https://github.com/search?utf8=%E2%9C%93&q=hello+world. Like this one with PhoneGap, which has an insane 6895 forks :-).

But let’s not forget – I mentioned it three times – this is about a language that doesn’t exist yet. So in a way, yes, it must be the most popular “Hello World” of a non-existing language on github ever published (if not “the only”) :-). Also don’t forget: The team of Microsoft Dynamics NAV has put it on there for us to evaluate, talk about it, make noise, get people ready .. . Big thanks for that!

Microsoft Dynamics NAV 2017 Cumulative Update 1 is released – and how to download


Good news! The first Cumulative Update for Microsoft Dynamics NAV 2017 is available! So finally, we can start upgrading this wonderful new version (… I know, sounds weird ..).

Now, since this version, Microsoft seems to have seriously changed the way they provide cumulative updates.

How to download

The download is not available anymore like it used to be: providing your email address, waiting for a mail, and downloading from the mail. I guess that’s a good thing. No, now it’s available from the “Microsoft Download Center“. Lots of Microsoft products (and updates) are available there .. so it makes sense that NAV updates become available there too. Here is the link to download Microsoft Dynamics NAV 2017 Cumulative Update 1.

But…

I did have to search a while before I could find my download.. :-/.

You need to select a language. This is strange.. . I should select a country version, not a language. I was searching for “Flemish” – but apparently, Microsoft doesn’t think that’s a language either :-). So, let’s go for “Dutch” .. . All of a sudden, the entire webpage gets reloaded in Dutch .. and on top of that, the bloody BE version is NOT available! So we don’t even speak Dutch in Belgium? (well, no, that’s why I was looking for Flemish, wasn’t I .. :-))

Apparently, the BE version is only available when I select “French” .. so I need to struggle my way through the Download Center in French to be able to download the BE version of NAV .. .

Good thing I don’t have to download the Russian version any time soon… not to mention the Czech version, which seems to be lost (sorry, Kamil) ..

There is room for improvement, if you ask me ;-).

Can I still automate the download process?

If you have followed my sessions at the PowerShots last year, or some of my sessions at Directions and NAVTechDays, you might have noticed that I automated the download process of Cumulative Updates (Kamil Sacek created a PowerShell function for it). Well, this doesn’t work anymore (for now.. ). And I don’t know of a solution yet .. . There is no complete table of all downloads anymore – not visible on the page, nor in the source of the webpage (in some kind of hidden JSON or whatever – like before). Nor can I find a way to generate the URL from whatever info is on the page.

So in short .. a new challenge ahead :-). I do hope the old way of providing CU’s will come back in the near future … because frankly .. this is NOT a good alternative..

The Download

The download itself looks different as well. Instead of a self-extracting zip file (basically, an exe-file), it’s now just a zip-file. No problem for me. I never used the “Self” in “Self-Extracting” zip anyway: I just renamed the exe to zip, and started turning it into an iso. Basically, I can still do it the same way as I always did – automated. The structure of the names is still preserved, which is good. I did however have to make a small change to my function “Get-NAVCumulativeUpdateDownloadVersionInfo” .. which is already on GitHub.
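
Since the CU now comes down as a plain .zip, unpacking it in a script is a one-liner (the file and folder names below are just examples, not the actual download name):

```powershell
# Sketch: extract the CU download - no renaming of a self-extracting .exe needed anymore.
# Path and file name are example values.
Expand-Archive -Path 'C:\_Download\CU01NAV2017BE.zip' `
               -DestinationPath 'C:\_Download\CU01NAV2017BE' -Force
```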

Automate downloads of Microsoft Dynamics NAV Cumulative Updates


Yesterday, I blogged about the availability of the first Cumulative Update of NAV 2017. In that post, I talked about the fact that Microsoft recently changed the place where to download them.

This also means the way of automating the download had to change. In fact, the PowerShell function that was part of my repository (and originally built by Kamil) didn’t work anymore. I didn’t really know how to get around this – but luckily, I have a really clever colleague (well, I do have more than one ;-)) who found a way in .Net. And we managed to transfer the logic to PowerShell .. (literally “transfer”, not rewrite … you’ll see what I mean ;-)).

I’m not going to tell you “how” it works – you can find out in the code. I’ll just show you what we created, and how it works.

Load-NAVCumulativeUpdateHelper

The main functionality is in C# .. and you can make it available by loading and compiling it, simply by calling “Load-NAVCumulativeUpdateHelper”. In C#, you’re just that much more flexible – and since we’re working in HTML, and figuring stuff out in HTML, it’s just much nicer to do so in C#. And .. the fact that this is the preferred environment of my colleague might have something to do with it as well ;-).

The PowerShell function loads (compiles) two functions in memory and makes them available to me:

static System.Collections.Generic.IEnumerable[MicrosoftDownload.NAVDownload] GetDownloadLocales(int productID)
static System.Collections.Generic.IEnumerable[MicrosoftDownload.NAVDownload] GetDownloadDetail(int productID, string language)

Let me show you some examples on how to use them.

This example shows you a gridview of all download URLs, and when you select one and press OK, it starts the download. It uses “GetDownloadLocales” to get all the languages one-by-one:

 

Load-NAVCumulativeUpdateHelper
$Locales = [MicrosoftDownload.MicrosoftDownloadParser]::GetDownloadLocales(54317)
$MyDownload = $Locales | out-gridview -PassThru | foreach {
$_.DownloadUrl
Start-BitsTransfer -Source $_.DownloadUrl -Destination C:\_Download\Filename.zip
}

For the other function, “GetDownloadDetail”, you need to specify the Locale (like “en-US”), which can be used to speed up the process (no need to generate and navigate to all the URLs of all the “Locales”), like:

 

Load-NAVCumulativeUpdateHelper
$MyDownload = [MicrosoftDownload.MicrosoftDownloadParser]::GetDownloadDetail(54317, 'en-US') | Select-Object *
$MyDownload

You also see that the functions need a ProductID. Well, that’s how the Download Center works .. If you have the ID, you know what to download .. and to get that ProductID, let’s look into the next function.

Get-NAVCumulativeUpdateFile

This function is based on the good old function that Kamil Sacek once wrote – combined with the ideas of my colleague. It looks at the Microsoft Dynamics NAV Team Blog to figure out the link to the Download Center – and yes – to figure out the ProductID. And obviously, it uses the helper above to be able to start the download(s). It’s still limited to searching for only the latest Cumulative Update. I’m working on a solution (probably a new function) to start the download from a specific blog article or KB article.

The function above can simply be used in this way (and here are some more examples), which will simply start the download to the specified folder:

Get-NAVCumulativeUpdateFile -versions 2017 -CountryCode W1 -DownloadFolder C:\_Download

The function also creates a JSON with specifics about the download. Not because it’s necessary .. just because we can ;-).
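
If you want to pick that JSON back up in a follow-up script, something like this works (the file name here is an assumption – check the download folder for the actual name the function writes):

```powershell
# Sketch: read the version-info JSON back into a PowerShell object.
# The exact file name is an assumption for illustration.
$info = Get-Content 'C:\_Download\versioninfo.json' -Raw | ConvertFrom-Json
$info
```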

Do I need the complete module to be able to do this?

No. You basically only need these two functions. So go ahead and automate! :-). The result is a much more stable way to automate these downloads, simply because the Download Center doesn’t require you to run through a complicated login process ..

New Developer Experience Preview available!


For all of us NAV developers, Christmas comes early this year. The Microsoft Dynamics NAV Team had promised us a preview of the New Development Experience .. and they delivered. It is available as of today!

All you need to know to get started is documented in their blogpost. It is useless to just repeat what is in there .. so please read it carefully and start playing! Let me just – for your convenience – sum up a few references that might be interesting to you.

As you might have expected, I have set up my own environment already .. :-). And all works like a charm. Please set it up yourself, and be as enthusiastic as I am :-). I have made a few screenshots to clarify a few things – although all went smooth, so you probably don’t have to go through them .. but anyway ..

When you are provisioning your VM, after a while, you’ll get details about your VM. Just navigate to the “PublicIP”, and copy the DNS-name.

Browse to that copied URL. There, you’ll find a “View Installation Status” link on the right side of the page.

This shows you the status of the installation of the VM. Wait until everything is completed.

When completed and you have the same as the screenshot above .. the fun can begin!

Just log into the server (there is a link on the homepage that you can use) and you’ll find everything you want. I’ll leave the actual playing up to you. One thing I noticed is that Microsoft wants us to test with the web client, because the Windows client doesn’t seem to work. You can open the web client with the desktop link.

If you do want to test with the windows client, simply change the ClientConfig like this:

That way, the client will open to the right database, with the right credential type.. (Windows).
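
For reference, the relevant keys in the client’s ClientUserSettings.config look something like this (the server and instance values below are examples, not the actual values from that VM):

```xml
<!-- Sketch: the relevant keys in ClientUserSettings.config; values are examples -->
<add key="Server" value="localhost" />
<add key="ServerInstance" value="NAV" />
<add key="ClientServicesPort" value="7046" />
<add key="ClientServicesCredentialType" value="Windows" />
```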

Enjoy! ( I know I will :-))

PowerShellGet & waldo’s PowerShell Modules


Since Microsoft Dynamics NAV 2017, the VM images that Microsoft (Freddy) makes available on Azure contain the PowerShellGet module. Freddy mentioned that in one of his blogposts about the Azure images. For me this was quite new, but it sounded interesting, so I looked into it.

What is “PowerShellGet”?

Well, PowerShellGet is a module that is now present by default on Windows 10 (and I assume also on Windows Server 2016 – which is, by the way, the new OS for all the upcoming demo environments!) that makes it much easier for you to find, download and install PowerShell modules that are made available in the PowerShell Gallery: https://www.powershellgallery.com/ . If PowerShellGet is not available on your OS, you can download it from that web page.

So, when you have installed the module, you get PowerShell cmdlets that can search for PowerShell modules, install them, load them, and make them available. So in a way, from within a script, you can make PowerShell modules available.

It really makes me think of the way we installed stuff on Linux many years ago, with the apt-get command.

I made my PowerShell modules available on the PowerShell Gallery

Which you might have expected – hence the title :-). You can find them easily when you search for “waldo” on the PowerShell Gallery. But that’s not really how you should use it. You should use it from PowerShell (in my opinion).

Let’s show that in this small walkthrough

I just created myself an image of the NAV Development Tools Preview – January Update (oh yes, there is a new update of the new DEV tools! :-) ). But as said, this should work on any Win10 or WinServer2016 machine – or any system where you installed the PowerShellGet module.

Just open PowerShell as an administrator (since you’re installing modules, it seems like a good idea to do that as admin…). Let’s first search for my modules from within PowerShell:

Find-Module | where Author -eq waldo

The “Find-Module” CmdLet is part of the “PowerShellGet” module, and it searches for modules in the gallery. So in this case, it will search for all my modules in the gallery.

So if we execute this:

Find-Module | where Author -eq waldo | Install-Module

with the “install-module” (also part of the “PowerShellGet” module), it will “install all waldo’s modules :-)”. So when you get this:

You simply say “yes to all”, since you trust me (IF you trust me :-)). It will download and install the modules. You’ll find the downloaded files in “C:\Program Files\WindowsPowerShell\Modules”.

Since it will just have downloaded them, you need to still import them like you have to import any module before you can use them in your runspace .. So therefore:

Import-Module Cloud.Ready.Software.NAV

And with this line, you can list all the functions that you have just made available by downloading the modules from the PowerShell Gallery:

get-command -Module Cloud.Ready.Software.*

So, let’s see if it works

Ok, let’s try some of my functions on this Azure image.

In order to do that, we will first import the standard NAV commandlets. And I made that somewhat easier with this function:

Import-NAVModules

This function puts all the commands that are necessary to load the default NAV CmdLets on your clipboard. Simply paste and execute. I haven’t managed to import the modules into the global context from within the function (seems like a bug in PowerShell) – but this did the trick quite nicely as well :-). So paste the content of the clipboard in the shell, and execute!
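In outline – and this is a sketch of what I assume happens internally, not the exact implementation – the clipboard trick boils down to something like this (the service folder path is an assumption for NAV 2017):

```powershell
# Sketch: collect the NAV module files and put Import-Module lines on the
# clipboard, so you can paste and run them yourself in the *global* scope.
$serviceFolder = 'C:\Program Files\Microsoft Dynamics NAV\100\Service'  # assumption
Get-ChildItem $serviceFolder -Filter 'Microsoft.Dynamics.Nav.*.psm1' -Recurse |
    ForEach-Object { "Import-Module '$($_.FullName)'" } |
    Set-Clipboard
```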

Now, let’s set up a new sandbox environment (database and NST) by simply copy everything from the “NAV” instance:

Get-NAVServerInstance -ServerInstance 'NAV' | Copy-NAVEnvironment -ToServerInstance sandbox

And start the development environment:

Start-NAVIdeClient sandbox

This is how I always build my development environments on dev images – local or in the cloud. To remove, you can simply do:

Remove-NAVEnvironment -ServerInstance sandbox

Be careful with that one. It’s removing the database and server instance – so be sure you have a backup when needed :-).

What if you have updated your modules?

Well, you can imagine there is a PowerShellGet function for that as well: Update-Module. You can simply do:

Find-Module | where author -eq waldo | Update-Module

This PowerShellGet-module is awesome! :-). Thanks, Freddy, for introducing me :-).

NAVTechDays 2016: Final thoughts


It’s been a while since NAV TechDays. And for a while I was thinking to skip the blogpost on how the latest NAV TechDays has been.. . But then again – the conference is without doubt the best one around for NAV developers, while it’s still “just” a private initiative from Luc – so it deserves all the attention it can get. :-).

Every year I write to you how perfect NAVTechDays has been .. . Well, this year, no difference! Basically, Luc did it again and put together a fantastic conference !

For years (6th edition last year) NAV TechDays has been THE must-go-to conference for Microsoft Dynamics NAV developers. And it appears that most of us (except for the French developers – there was only 1 attendee from France (??) …) realize that. This year, Luc managed to get 1044 attendees to Antwerp, Belgium. 1044 of you that faced the awful Belgian traffic to get to NAV TechDays. And as I always say: whoever decided not to go was awfully wrong! It’s a time for you to be able to be “fully” into the product. In an atmosphere that is especially set up for developers, to focus in the most comfortable way possible on the product, and its future.. . An atmosphere you will not be able to get at home .. .

So yeah, basically I’m writing a write-up like I have done for years: It was perfect again! So this write up might be getting boring – but I must say – NAV TechDays most certainly was not!

Yet again, all was very well organized, in my opinion. Great food, beer tasting, perfect session rooms (movie theatre quality), 90-minute deep dive sessions, perfect speaker infrastructure, big screens, … well, all an attendee (and also a speaker) could wish for. Great stuff! And thank you so much Luc, for taking the time, effort and above all the risk and responsibility for putting this together for us each year! Great stuff!

2 Pre-Conference Days

The biggest change with previous years was the extra Pre Conference Day. 2 instead of 1 this year. This way, the attendees had the chance to get even more value out of making the trip. Over 300 attendees decided to take the opportunity to dive into the deep dive workshops during the Pre-Conference Days (which I mentioned in my previous blog). I heard many great things about it, so I’m sure it’s going to be there on the next NAV TechDays as well. Fingers crossed :-).

All sessions are online

A big differentiator with all other conferences is the fact that all these sessions are highly repeatable (aren’t we all trying to get into this repeatability-mode? :-) ). What I mean is: they are all recorded and online, available for everybody – no matter if you attended the conference or not. Now how’s that for a community-initiative? I don’t know one conference that does that. This is a big differentiator, and when you look at the amount of views, it’s highly appreciated by the community.

Luc has put everything on his youtube-channel. Have a look at it. Highly recommended!

My NAVTechDays

For me, it was quite a busy conference. I had two Pre-Conference Days on “PowerShell Black Belt”. Two full days of PowerShell-ing the hell out of ourselves! I always enjoy telling people about PowerShell – I just hope all attendees enjoyed themselves as well :-). The feedback was overall quite OK, so I’m happy with it :-). It’s why I do it.

During the conference, I covered two sessions. That’s quite enough, I must say ;-). For me .. and probably also for you ;-). The first session was together with Vjeko and Gary about “Bad Habits of NAV Developers”. It was meant to be a somewhat playful session, and it ended up being voted the best session of the conference (based on the points you gave on all the different sessions). Very proud of that! You can watch it here:

My other session was somewhat more serious: “Best Practices in developing Microsoft Dynamics NAV 2017 Extensions”. And to be clear: I was talking about Extensions v1: the PowerShell way. I got a comment that there was a bit too much PowerShell in it .. . Sorry for that .. in my opinion PowerShell is one of the “problems” people have with building Extensions .. so yes, I decided to give it a decent focus. Here you can watch that session:

NAVTechDays 2017

And yes, we’ll have a NAVTechDays next year as well! Needless to say that I’m very much looking forward to that! :-) .. . So book your agendas for next year 16th and 17th of November 2017!

Some impressions

I always like to end with some impressions of the conference:

One of the slides during our “Bad Habits” session…

One fun addition: a “throwable” microphone .. to throw to people that had questions. Fun! :-). Until Vjeko throws it and almost puts someone in the hospital ;-).

Me, being serious…

And less serious …


How to find Cumulative Updates for Microsoft Dynamics NAV?


You might have noticed, I stopped updating the platform updates. I hope you don’t mind ;-). It just didn’t make any sense anymore when Microsoft moved it to a different location.. .

The main reason why I was providing that on my blog was the “googleability” of the content of the updates. All was published on PartnerSource, which meant nothing was searchable in Google.. . And we all know the experience of searching for something in PartnerSource ;-). So to help you with that, I copied the content of the updates to my blog, and referenced the KB article. This was quite manual work, and I have spent hours and hours maintaining it.

Good for me .. Microsoft has changed their way of working. Google will actually take you to the exact CU where something was fixed on the Microsoft Support Site. You just have to know how to search.. .

Tip

It’s quite simple, just provide enough information to google for it to find the right stuff, like:

  • The NAV version
  • The error message
  • “cumulative update” if you’re actually searching for one.

A few examples:

What CU should I download?

So IF you find a CU where your specific problem was solved .. I don’t recommend downloading that exact one. Just download the last one available. All updates are cumulative, so your problem will be fixed in any later version as well.

How do you find that latest version?

Aleksander Totovic published a few interesting links for that in a recent blogpost. There you can find all CU updates for all supported versions:

And last but not least …

Of course, there is only one right way to download the latest Cumulative Update: with PowerShell! :-)

I described it here: http://www.waldo.be/2016/12/09/automate-downloads-of-cumulative-updates/.

Make the New Developer Tools available on a local machine


On Twitter, Mark Brummel was wondering how we would be able to get this wonderful new DEV environment locally, to be able to play with it whenever we don’t have an Internet connection. Well .. by coincidence, I had been trying to wrap my head around that a few weeks ago .. but never wrote up a procedure for it. Well .. to help an old friend .. let’s do just that ;-).

Step 1: Install a new cloud-version of the DEV Tools.

You probably did this already. Just go to http://aka.ms/navdeveloperpreview and create yourself a VM.

Step 2: Create iso and zip of the needed resources

The resources that we need are the DVD, the VSCode extension and the sample code. I will make it easy on you. Just execute this script, and you’re good to go:

https://github.com/waldo1001/Cloud.Ready.Software.PowerShell/blob/master/PSScripts/NAV%20Developer%20Tools/CreateISOandZipOnAzureImage.ps1

You’ll see it will find, install and use my PowerShell modules, because it needs them to create the ISO and Zip files. When the script has executed, you’ll end up with two files which you need to copy to your local environment.

This can obviously take a while.. . What I did: I copied them to the “C:\_Installs” folder on the local VM where I wanted to make these new Developer Tools available:

Step 3: Install NAV from the downloaded DVD to your local environment

I am all into local VMs. I never install NAV on my local host, my laptop. I always have an empty VM snapshot with Win2016 ready to install anything on. So that’s what I used here as well. If you do it differently, then the next steps might be different for you.. .

Again, I want to make it easy for you, and I have foreseen this script for you:

https://github.com/waldo1001/Cloud.Ready.Software.PowerShell/blob/master/PSScripts/NAV%20Developer%20Tools/InstallOnLocalEnvironment.ps1

If you do it exactly as I did (Win2016, copied to C:\_Installs, …), then you should be good to go with this script. Otherwise, you can either change the parameters at the beginning of the script, or read the script, figure out these steps yourself, and do them manually (or step-by-step in PowerShell):

  • Install NAV from the ISO file. The config I use is simply a full install of NAV
  • Unzip the NewDEVTools.zip to the folder “C:\DEMO\New Developer Experience”
  • Install VSCode
  • Install the provided al-extension in VSCode (can be found in the folder “C:\DEMO\New Developer Experience”)
  • Create a Navision_main instance (not really necessary, but it just mimics the AzureVM environment)
  • Create a WebServerInstance on Navision_main as well

As said, all these steps are done by the script as well – so if you’re lucky (I always seem to be ;-)), you just need to execute the script, and you should be good to go ;-).
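If you do want to do them by hand, the steps above could roughly translate to something like this – a sketch only, where all paths, file names and the setup command line are assumptions, so check them against your own copy:

```powershell
# Mount the DVD ISO (run NAV setup.exe from the mounted drive with a full-install config)
Mount-DiskImage -ImagePath 'C:\_Installs\NAVDVD.iso'

# Unzip the New Developer Tools
Expand-Archive -Path 'C:\_Installs\NewDEVTools.zip' `
    -DestinationPath 'C:\DEMO\New Developer Experience'

# Install VSCode silently, then add the AL extension from the unzipped folder
# (the installer and .vsix file names are assumptions)
& 'C:\_Installs\VSCodeSetup.exe' /VERYSILENT /MERGETASKS=!runcode
code --install-extension 'C:\DEMO\New Developer Experience\al.vsix'
```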

Enjoy!

Manage Web Services in Extensions


At this moment, I’m working on an Extension for Dynamics 365. One of the requirements is basically “publish an interface to an external party”. In other words: “publish a bunch of web services”.

Now, I’m always trying to find the “best practice” for different circumstances. That’s also why I’m so deep into PowerShell – I think we should automate these “Best Practices” as much as possible ;-).

But in this case – handling web services for Extensions – I still haven’t found one, so it has become kind of a workaround instead of a best practice.

A WebService should be part of the codebase

As said, we are developing an extension. And part of this extension is a bunch of web services. The way to make this possible in an extension is to export the webservices to a package with the PowerShell cmdlet “Export-NAVAppTenantWebService“.

In the background, it’s going to export some data from the “Web Service” table (well .. not really that table, but that’s not important for this blogpost) to an xml file. In short: it’s going to do an export, based on data in the DEV environment.

Now .. this “data” should be part of the code. Why? Well, just assume someone “forks” the codebase: he probably needs to set up a new DEV environment, apply the deltas, .. and generate the Web Services .. because they need to exist at the moment he’s going to build and test the extension.

In a way, I always try to do this with PowerShell (as you might have figured…), because basically I build a DEV environment with PowerShell – and the scripts that are responsible for that, are part of the source code management of my Extension – so the creation of data like this is still “sourcecontrolled” (if that’s a word at all).

How can we generate these web services?

Well, we have the “New-NAVWebService” PowerShell CmdLet that comes out of the box. Unfortunately, this doesn’t give us what we want. Because what we want is to be able to generate a “tenant” web service .. basically a web service where the “All Tenants” flag is “false”, like:

The “New-NAVWebService” cmdlet doesn’t do this.. . It creates a normal webservice. And that’s not good. We would have to manually change the flags to be able to export it to the xml.. .

Is there a solution?

Well, I didn’t solve this in a way I feel comfortable with, because I just can’t think of a decent way to work around this. In C/AL, it is possible to create the Web Services with code like this:
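The code screenshot got lost here, but as a hedged reconstruction (field names are taken from the “Web Service Aggregate” table as I remember them – verify them against your version), the pattern is roughly:

```
// WebServiceAggregate: temporary Record "Web Service Aggregate" (important!)
WebServiceAggregate.INIT;
WebServiceAggregate."Object Type" := WebServiceAggregate."Object Type"::Page;
WebServiceAggregate."Object ID" := PAGE::"Customer Card";
WebServiceAggregate."Service Name" := 'MyCustomerCard';
WebServiceAggregate."All Tenants" := FALSE;
WebServiceAggregate.Published := TRUE;
WebServiceAggregate.INSERT(TRUE);
```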

Where “WebServiceAggregate” is a variable of the record “Web Service Aggregate”, which is temporary (important!)!

So how can I create this in PowerShell? Well, I would be able to:

  • Generate a text-file of a codeunit in PowerShell
  • Import it
  • Compile
  • Run
  • Remove the codeunit

But really .. that’s not cool, right? Anyway, you can find a raw version on my github.. . I tested it .. it does work .. but it really feels like it shouldn’t ;-).

My Workaround

So my actual (current) workaround is a simple one. Publish the web services (based on a certain prefix), and open the web service page. Then you should manually de-select the “All Tenants” checkmark for all the web services concerned.

I made this into a PowerShell script that I include in every single Extension:

https://github.com/waldo1001/Cloud.Ready.Software.PowerShell/blob/master/PSScripts/NAV%20Extensions/3_CreateWebServices.ps1
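In outline, the script does something like this (the prefix, page IDs and service names here are just illustrative assumptions – the linked script is the real thing):

```powershell
# Publish a set of web services with a prefix, then untick "All Tenants"
# manually on the Web Services page afterwards.
$prefix = 'wld'
$webServices = @{ "$($prefix)_Customers" = 21; "$($prefix)_Items" = 30 }
foreach ($name in $webServices.Keys) {
    New-NAVWebService -ServerInstance 'sandbox' -ObjectType Page `
        -ObjectId $webServices[$name] -ServiceName $name -Published
}
```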

Conclusion

In my opinion, this is a part where Microsoft should come in and provide a “New-NAVTenantWebService” CmdLet or whatever.. . It should be easy.

If anyone has a better solution – please share :-).

Microsoft Dynamics NAV Source Code Analysis – LineCount


Recently, I came across this nice graph: http://www.informationisbeautiful.net/visualizations/million-lines-of-code/ . Pretty cool to see the comparison. But – obviously – the next question for us Microsoft Dynamics NAV developers is:

How many lines of code does Microsoft Dynamics NAV contain?

Good question .. and an interesting one. I always told people things like “it’s not easy to navigate in millions of lines of code that you didn’t write yourself”. Assuming “millions” was a safe bet. But is it really? When I talk to people, sometimes I hear 2 million, and others even say “6 million”.

So, that would be about the same number of lines of code as an F-22 Raptor fighter jet .. or even the Mars Curiosity rover. Honestly .. I don’t think so ..

Here are the real stats

This is what I came up with – I’m not saying it’s 100% .. but it should be pretty close to the real factual amount – it’s not an estimate at all. As the application “Microsoft Dynamics NAV” is much more than code alone .. the first thing I did was simply export all objects and count the lines. So, to describe the application of NAV, including everything (C/AL code, metadata, descriptions, properties, .. whatever you can think of), for W1 (only the ENU language), we need 3044075 lines of text.
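Counting those lines yourself is trivial once you have a full text export of the objects; a minimal sketch (the export file name here is an assumption):

```powershell
# Count all lines in a full object export (text format)
(Get-Content 'C:\Export\W1_AllObjects.txt' | Measure-Object -Line).Lines
```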

“Only” 3 million lines, people. And that includes all the stuff we are actually not talking about when we talk “code”.

What about the C/AL code only??

Now THAT is the question I really wanted to answer. But that’s quite difficult to calculate. Because in that case, we need to analyze code as being code, not just text that can be both code or any other metadata.

You might remember this post, where I analyzed the “event publishers” with some ingenious dll that one of my developers has come up with.

Well, this dll has evolved to analyze code as well. Too bad I can’t share it, but what I can share are the results. And apparently, the total number of lines of C/AL code in Microsoft Dynamics NAV is “only” 464447, spread over 47729 places (functions, triggers, …).

When we’re at it…

This is fun stuff. So why not go a little bit further .. . I did some more checks, and here are a few things that I have come up with ..

As said, there are 464447 lines of C/AL code, spread over 47729 places. Basically, this means there is an average of 9.73 lines of code per function.

On top of this, the maximum number of lines in one function is 802. The minimum is 1 (surprise :-)).

The Top 5 Functions with the highest linecount is this:

NAV2017 W1 CU3 – LineCount:

  802  Page<Reservation>.PROCEDURE<DrillDownTotalQuantity>
  734  Codeunit<Intelligent Info Manifest>.PROCEDURE<ManifestText>
  500  Codeunit<Office Add-In Sample Emails>.PROCEDURE<GetHTMLSampleMsg>
  492  Codeunit<Item Tracking Navigate Mgt.>.PROCEDURE<FindTrackingRecords>
  484  Page<Navigate>.PROCEDURE<FindRecords>

You would have expected Codeunit 80 and 90, wouldn’t you? Well, they were redesigned in NAV2017. In NAV2016, this would be the result:

NAV2016 BE – LineCount:

  1286  Codeunit<Sales-Post>.<OnRun>
  1276  Codeunit<Purch.-Post>.<OnRun>
   802  Page<Reservation>.PROCEDURE<DrillDownTotalQuantity>
   492  Codeunit<Item Tracking Navigate Mgt.>.PROCEDURE<FindTrackingRecords>
   484  Page<Navigate>.PROCEDURE<FindRecords>

Going further, what about things we do want to know, but like to avoid, like

Code on pages

Code on pages doesn’t necessarily need to be a bad thing. Pages are UI, and UI can have code (formatting, matrix, wizards, ..) .. so you sure can have code on pages. Though, personally, I think it should always be checked. So, let’s check.

Well, in total, there are a staggering 77762 lines of C/AL code, spread over 17948 procedures and triggers in pages.

Here is an overview of the Top 10 pages with most code in NAV2017:

NAV2017 W1 CU3 – LineCount:

  802  Page<Reservation>.PROCEDURE<DrillDownTotalQuantity>
  484  Page<Navigate>.PROCEDURE<FindRecords>
  227  Page<Inventory – G/L Recon Matrix>.PROCEDURE<Calculate>
  190  Page<Inventory – G/L Recon Matrix>.PROCEDURE<MATRIX_OnDrillDown>
  169  Page<Item Tracking Lines>.PROCEDURE<RegisterChange>
  155  Page<Item Tracking Lines>.PROCEDURE<SetSourceSpec>
  147  Page<Navigate>.PROCEDURE<ShowRecords>
  133  Page<Item Tracking Lines>.PROCEDURE<WriteToDatabase>
  127  Page<Apply Vendor Entries>.PROCEDURE<CalcApplnAmount>
  127  Page<Apply Customer Entries>.PROCEDURE<CalcApplnAmount>

Besides the expected Matrix-pages, there are the expected “legacy”-pages as well: Reservations and Item Tracking – in my book the number one thing in NAV that really needs to be refactored.. .

So, can we conclude that most of the code is on Matrix-pages and Wizards? Well, unfortunately not. Without Matrix and Wizard pages, we still have 63104 lines of code, spread over 15201 places.

So, what is my conclusion?

Code analysis is a good thing to do. Not to point out the bad things, but to be aware of the places that might need attention. All code – the amount, in any place – probably has got its reason and definitely has got its history. The challenge for a Microsoft Dynamics NAV developer is to understand all this, and to make conscious decisions when modifying it.

What I did here is point out the “worst” cases in an old codebase. A codebase that people don’t want to change too much, just because of upgradability. And a lot of these cases are very easily explainable. Only one area really comes to mind that is not: Item Tracking / Reservation (in need of refactoring for years already).

An average of 10 lines is very good. Just think of this – to compensate for functions like the above top 10, you really need a lot of “good” ones to end up with a decent average of 10. To conclude, let’s compare NAV2016 with NAV2017:

                                           NAV 2016   NAV 2017
Total C/AL line count                        444953     464447
Total functions                               44137      47729
Average number of lines per function          10.08       9.73
Maximum number of lines in one function        1286        802
Code lines on pages                           72191      77762
Total page functions that contain code        16998      17948

All good, but what I don’t like is the increasing amount of code on pages. I haven’t looked into that, but I honestly don’t believe it’s all UI formatting… . And in my book .. business logic just doesn’t belong on pages.

More to come

Writing this blog, I got more ideas that I would like to work out. Two of them are “Commits” and “Cyclomatic Complexity”. If you have more stats you’d like to see – just leave a comment, and I’ll see that I get to it!

Disclaimer:

All the stats were made based on an internal tool that is still in beta. So I don’t want to guarantee the exact figures. However, we didn’t find any reason to doubt the output just yet.

Microsoft Dynamics NAV Source Code Analysis – Triggers & Procedures


On my last blogpost on Source Code Analysis (which focused on the line count), I got a remark asking whether it would be interesting to drill into “statements”. I didn’t really know what was meant by it – but it made me think and drill down into the usage of the procedures.

You might remember we talked about the “places” in NAV where code is written, like the OnValidate trigger, custom functions, OnInsert, .. but also OnDrillDown on a control of a page .. these things. And the more you drill into it, the more things you want to find out ;-). So, below is what came to my mind during this quest.

What are the places we write code?

In order to get some stats on that, we first need to define what we mean. Well, I’m interested in the procedure types, like “OnInsert” and “OnValidate”, but also the created functions. Here is the entire table – and you see, most code is in manually created functions, but still, a significant amount is on triggers:

Count  Name
24676  Procedure
 5948  OnValidate
 4465  OnAction
 2218  OnDrillDown
 1907  OnAfterGetRecord
 1281  OnPreDataItem
 1063  OnLookup
  982  OnOpenPage
  476  ExportOnBeforePassVariable
  442  OnPreReport
  440  OnRun
  398  OnDelete
  375  OnAssistEdit
  371  OnInit
  311  OnPostDataItem
  308  OnAfterGetCurrRecord
  303  OnInsert
  244  OnNewRecord
  169  OnModify
  162  OnRename
  146  OnFindRecord
  130  OnQueryClosePage
  122  Event
  114  OnDeleteRecord
  111  OnInitReport
   94  OnNextRecord
   74  OnPostReport
   65  ExportOnBeforePassField
   63  OnInsertRecord
   49  Documentation
   48  ExportOnAfterGetRecord
   44  OnModifyRecord
   24  ExportOnPreXmlItem
   23  OnClosePage
   18  ImportOnBeforeInsertRecord
   16  ImportOnAfterAssignVariable
   13  OnPreXmlPort
    9  ImportOnAfterAssignField
    8  OnPostXmlPort
    7  OnBeforeOpen
    6  ImportOnAfterInsertRecord
    4  OnInitXmlPort
    2  ImportOnAfterInitRecord

The most used functions

Well, to be honest, these aren’t going to be very “true” stats, because I was not able to measure the number of “OnModify” or “OnInsert” calls – or any of these trigger calls for that matter. So I’m just focusing on all created functions/procedures – you know, the ones you can create yourself. Here is the Top 10 of functions most used “somewhere” in C/AL:

UsedByCount  FullNameWithID
  156  Table<Currency Exchange Rate>.PROCEDURE<ExchangeAmtFCYToLCY>
  155  Report<General Journal – Test>.PROCEDURE<AddError>
  150  Codeunit<Reservation Engine Mgt.>.PROCEDURE<InitFilterAndSortingFor>
  148  Table<Currency>.PROCEDURE<InitRoundingPrecision>
  126  Codeunit<Calc. G/L Acc. Where-Used>.PROCEDURE<InsertGroup>
  124  Codeunit<CalendarManagement>.PROCEDURE<TimeFactor>
  120  Codeunit<AccSchedManagement>.PROCEDURE<GetDimTotalingFilter>
  113  Codeunit<XML DOM Management>.PROCEDURE<AddElement>
  111  Report<Vendor Pre-Payment Journal>.PROCEDURE<AddError>
  108  Codeunit<Job Calculate Statistics>.PROCEDURE<ShowLedgEntry>

Note: It does NOT count the OnRun-trigger of a codeunit – so the Sales-Post.OnRun is not taken into account, or any “CODEUNIT.RUN” for that matter. But let’s be honest – we shouldn’t be calling code like that anyway! (no, we shouldn’t!)

Most used function names

So, let’s drill into naming conventions. First, just as a teaser, I like to know what the most used function name is – no – let’s show the Top 10 of most used names:

Count  Name
  238  InitializeRequest
  104  Code
   93  ShowDimensions
   77  GetLocation
   75  Initialize
   69  Caption
   57  ValidateShortcutDimCode
   50  Load
   50  SetUpNewLine
   48  GetItem

Surprised? I asked this question on Twitter, and here you see the results – most of the guesses were completely wrong :-).

To try not to confuse you (too much): there is a difference between the number of times someone used a certain name to create a function (the above table), and the number of times a certain function is called in code.. . Let’s look at the latter. Here are the function names that are most used in C/AL code:

UsedByCount  Name
  869  AddError
  596  MATRIX_OnDrillDown
  454  ValidateShortcutDimCode
  440  MatrixOnDrillDown
  310  ShowDimensions
  267  GetLocation
  264  FilterReservFor
  264  AddElement
  212  MATRIX_GenerateColumnCaptions
  192  UpdateSubform
  185  GetItem
  176  Initialize

Most Used Function prefixes

That made me think – what about prefixes? Well, that’s somewhat more difficult, because function prefixes can consist of a varying number of characters. So I decided to:

  • Only focus on prefixes with 2,3,4,5 and 6 characters
  • Don’t include the events (which basically start with “On”)
  • And only take the ones that make sense to me – e.g. exclude “Ge”, because most likely “Get” is the intended prefix.

Count  Prefix
 3207  Get
 2373  Set
 1233  Update
 1093  Calc
 1079  Check
  869  Show
  845  Insert
  842  Init
  822  Create
  700  Export
  507  Is

This is the top – and I like all of them ;-).

Unused functions

Vjeko came to me and said – hey – you should try to list the functions that are NOT used anymore. Well, yeah, he kind of has a point – but I didn’t really think it was useful to do that, as Microsoft probably does their own analysis .. but still, it triggered me, so I did some queries. And I was amazed by the outcome. Really .. the number of unused functions is a dazzling 1543, spread over 413 Pages, 416 Codeunits, 356 Tables and 358 Reports. And this is what I have taken into account:

  • Only self created functions/procedures
  • Left out the event subscribers
  • Left out all functions from Codeunit 1 (as these are called from the runtime)
  • Left out all the functions from “library-Random”, as these are functions for the testability framework

And still that many. They probably all have their own story, I’m sure, but I’m also sure that a lot of them are quite frankly “boat anchors“. I can’t just print them all here on the blog, but let’s shed a light on two random cases:

Codeunit<Sales-Post>.PROCEDURE<CheckWarehouse>

As said, this function is never used .. although it has quite a lot of code! For me, this is a real boat anchor. Because we can find a “CheckWarehouse” in many locations – also on the Sales Line – so my assumption is that it was redesigned, but left in CU80 .. . We can find more examples in CU80, by the way.. .

Codeunit<G/L Account Category Mgt.>.PROCEDURE<GetIncomeSalesReturns>

This is a type of function that I come across quite frequently. This function only returns the value of a text constant, by:

EXIT(myTextConstant);

Thing is, the function is never used, but the text constant is used all over that codeunit. So – what is the point of the function?

Local or global functions

We always advocate “be as local as possible”. Only methods should be global; all the rest should be local. Well, at least that’s the guideline .. so I thought – let’s see how many are local and how many are global. These are the results:

  • Local Functions: 13481
  • Global Functions: 11195

Download

I exported the object-collection that I created to create this blog. You can find all results in that. So if you’re interested in all unused functions, you can figure that out from this download: NAV_10_15140_BE_ProcsFunctions

Disclaimer

As I put in my previous blogpost, all this data was crunched by an internal tool that I can’t share, and that is still in BETA. Although I have no reason to doubt the outcome (and believe me, when I saw the unused functions, I sure doubted our tool – but after quite a lot of checks, the conclusion was that it has to be right), I just can’t officially guarantee the 100% correctness of the results. If you have any reason to doubt them, please contact me :-).

Directions ASIA and NAVUG Summit EMEA


It’s not conference season, and yet .. I have never been this busy with conferences during this part of the year. That must mean that new stuff (conferences, that is..) is coming up. Let’s spend a very short blogpost on what’s coming up for me..

Directions ASIA

In fact, at this very moment, I’m on a plane to Bangkok. The Directions EMEA team is organizing its event for the first time in ASIA. I don’t know at all what the experience is going to be like – but I’m looking forward to it! It’s also the first time that I will be traveling for 30 hours, with a 6-hour time difference .. only to be there for 72 hours. Let’s just hope Mr. Migraine stays home ;-).

Anyway – I was asked to do some sessions – so who am I to refuse this wonderful opportunity ;-).

My Sessions

The sessions (and workshop) are quite the same as I did for Directions EMEA and US:

  • My First Extension
  • Bad habits of NAV developers
  • Building Extensions with PowerShell (workshop)

After the sessions in the US and EMEA, I should be able to do these with two fingers in my nose (which I don’t know is a valid English expression – but you know what I mean (not literally ;-))).

NAVUG Summit EMEA

And there will be NAVUG Summit EMEA in two weeks. This is going to be a whole different concept and experience for me. NAVUG is a “User Group” – so the focus is not ISVs or partners, but customers! A whole different set of questions, interests, focus, … . This is going to be interesting. I just hope I will be able to meet the expectations with my two sessions:

  • Are Extensions the future of Dynamics NAV? – I will be co-presenting this one with Mark Brummel
  • PowerShell: Keep Calm and Script On

Stored Procedures on a NAV database?


Recently, I came across a customer, where a partner had implemented some Stored Procedures on a Microsoft Dynamics NAV 2017 database.

Now, for any change on SQL level that comes to my ears, my first reaction is always “Nooooooooo“. Or, as a picture tells more than a thousand words:

The first thing I did was google to see if the community shared my opinion. Apparently there is nothing to find about recommendations on this: neither good nor bad. I did find a few blogposts on how to call SPs from NAV (even one of my own ;-)) .. . Well – time for me to start preaching about the disadvantages :-).

My recommendation is quite simple: don’t do any manual changes in SQL Server on a NAV database. And I mean ANY change. Like:

  • Altering indexes
  • Stored Procedures
  • Views
  • Table triggers

Well, maybe not “any” change .. I’m not talking about maintenance jobs, of course – you have to maintain a NAV database like any other.

But let’s focus on Stored Procedures (although a lot of my arguments could apply to any change on SQL). And yes, I expect lots of people not to agree with this post. Before you do .. try to see it from my point of view: being responsible for +300 customer databases in a variety of NAV versions.. . “Repeatability” comes to mind .. . “No exotic, strange, unsupportable, … solutions” as well.

So, why don’t you like Stored Procedures on NAV databases?

For me, a solution based on Stored Procedures smells a lot like “I just do it like that because I can”. It’s the combination with NAV that doesn’t make it really interesting if you ask me. As you know, a lot in SQL Server is managed by the NAV environment, like:

  • Creation of indexes
  • VSIFT views
  • Creating/changing Tables
  • Creating/changing Fields

Doing manual changes on top of that means that your change is NOT managed by NAV, which might result in (1) your change simply disappearing because it’s overwritten by NAV, or (2) your change causing a conflict (error) with the NAV system.

Schema Sync

Remember the PowerShell command “Sync-NAVTenant” that is sometimes needed to sync the metadata with the SQL schema. A sync. Between two things: a metadata table and the actual schema on SQL. Any manual change that disturbs the two might end up in a complete inability to ever sync the schema again. A “disturbance in the force” is easily and unintentionally done, and fixing things like this can be very time-consuming. OK, maybe nothing to do with Stored Procedures – but something to think about nevertheless..

Separation of business logic

Software doesn’t just need to be “working”. Things like “upgradability”, “maintainability”, “readability”, … are equally important! When you introduce Stored Procedures as part of your solution, you definitely are NOT making it more readable or maintainable.. .

Just imagine that in a few years, when you’re enjoying your pension, other people have to extend your solution .. with parts in SPs and parts in C/SIDE. You really think they are going to figure out that cryptic T-SQL? And recognize it as business logic, or part of it?

NST Cache

One of the biggest reasons NOT to use Stored Procedures. The two things you do most when calling an SP: read and/or write.

When you just want to read data – just do it with queries and C/SIDE. You’ll use the NST cache, which might even speed up the process as well.. .

When you want to write data – then you should actually just go on pension right away – you shouldn’t be working on NAV databases. Sorry, but that is my opinion. People lose fingers over this in my company ;-).

Writing on SQL Server without going through the NST means that nothing you touch will update the cache. Even worse: your cache will be out of sync, resulting in NAV showing different data than what is actually in the tables. Things like this lead to a very unstable and unreliable system and should be avoided at all times.

No Flowfields

You won’t have access to flowfields, because these are calculated by the service tier. I have seen people “simulate” the business logic in T-SQL to produce the same result – but then again – what if I change the “business logic” (i.e., the calculation) of the flowfield? What if Microsoft changes it in an upgrade?

Should I say more? Duplicating code or business logic is like code cloning. Don’t do it! It’s paving the road to disaster!

Enough alternatives

There are indeed enough alternatives! We can use queries in combination with OData web services to efficiently read data from SQL Server. We can use web services to invoke business logic from 3rd party applications. We can even call a codeunit from PowerShell! There are many types of decent integrations – so please, always consider one that doesn’t require a change on the SQL Server database!
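To make that concrete, here is a minimal sketch of two of those alternatives from PowerShell. It goes through the service tier, so the NST cache stays in sync. Mind you: the server instance, service name, company and codeunit ID below are purely hypothetical examples – adjust them to your own environment:

```powershell
# Sketch, assuming a page has been published as an OData web service
# called 'CustomerList' on server instance 'NAV' (names are examples).
$uri = "http://localhost:7048/NAV/OData/Company('CRONUS')/CustomerList"
$customers = (Invoke-RestMethod -Uri $uri -UseDefaultCredentials).value

# Calling a codeunit through the NAV administration shell
# (codeunit ID and method name are hypothetical)
Invoke-NAVCodeunit -ServerInstance NAV -CodeunitId 50000 -MethodName 'DoSomething'
```

Both approaches run server-side through the NST, so business logic, permissions and caching behave exactly as they would from any NAV client.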

Only when it’s absolutely necessary AND when you know 100% what you’re doing

I rarely see a situation where there is no way around it. Performance can be a reason. Like when I truncated a table because a DELETEALL just took too long. But I see lots of situations where SPs could have been avoided, and were considered a good solution “because he/she was not familiar with the concept of web services”.

All I can say is: make sure you know what you’re doing. And if you do go ahead with it, make sure you document it for all parties that are involved with the database – because they might not understand why you’re doing it the way you’re doing it .. or not even realize that you did.

Conclusion

In my 15 years in the world of NAV-craft, I can only think of two examples where I have actually implemented something straight onto SQL Server. One of which was because there were no web services just yet.

The first reaction I have when I see solutions with Stored Procedures is “that must be ‘Job Protection'”, because usually, there is a much more elegant solution available. Please don’t make any solution more complex than it already is – unless there is really no other way .. .

Disclaimer

Needless to say that this blogpost is completely and only based on my own opinion.


PowerShots 2017 just kicked off


I’m sitting in the plane, after I just kicked off my first PowerShot for NAV2017 in Sweden. It was not “the first” PowerShot of this year, because Vjeko actually started one day before me in Norway :-).

What are PowerShots?

You might have read from previous years that we (Cloud Ready Software) have been “touring” Europe, advocating various topics on NAV – with a strong focus on “Cloud Development”. You can read some overviews on my blog:

Well, also this year, we’ll be touring Europe to evangelize Microsoft Dynamics NAV. This year we have a very strong focus on “Extensions”. In fact, it is a two-day in-depth training on how to prepare for and move to developing Extensions 2.0 with VS Code for Dynamics NAV and Dynamics 365 for Financials.

Can I join?

Of course! It’s a training we want to offer to the entire community. In fact, you can find all info here. But to make it easier for you, here are the links where you can register:

How was Sweden?

Well, I always look forward to going to Sweden! In all aspects, it’s a great place to deliver the PowerShots:

  • Very nice infrastructure
  • Nice people (and I mean that :-))
  • Nice food

Now, this was my “first” PowerShot, and to be able to offer it in as many countries as possible, we decided this year not to do it with two or three speakers – but with only one. So – lots of pressure :). Is the content OK? Are the labs well prepared? How about timing? Am I going to be able to answer the questions – and which questions will be asked? And above all: are the people satisfied?

Well – to most questions I don’t have an answer just yet .. basically because I haven’t had any feedback from the evaluations. But my feeling was very good. I had good questions, good interactions, and during the labs, some people just didn’t want to take a break :-). Perfect attitude ;-).

I’m looking forward to going back next year for sure!

And I’m looking forward to my other PowerShots in other countries.

Convert C/AL Objects to AL with PowerShell


If you are a NAV developer, you probably have heard about the upcoming “New Developer Tools”. And if you didn’t, it’s very much time for you to catch up with the readings about this, like:

And when you read this blog post from Freddy, you might as well have gotten started with Extensions v1, and wonder how you can convert all your work to Extensions v2 – or basically – from C/AL to AL.

“The Converter” first

Well, the latest update of the NAV Developer Preview has been released (you should read all about it here), and along with that release, we got the very first version of what is going to be the “converter” – the tool that will convert exported txt-files to .al-files.

No PowerShell

I was hoping for a simple extra PowerShell CmdLet, part of the “Microsoft.Dynamics.Nav.Model.Tools” module – or any other. But no .. no PowerShell.

And while some might still think that “PowerShell” is a “step back” – well – for this tool, we actually need to take a step back. This tool has to be executed in .. uhm .. are you sitting down? .. If not .. brace yourself .. In order to use this tool .. we need to use .. the command line.

PowerShell

I actually refuse to take that step back. If you know me a little – I’ll try to make it easy on myself – and make it part of my toolset that I’m comfortable using.

And I really didn’t want to put too much time into it – so I only created a draft version – which tends to work quite well (the reason being that I do expect Microsoft to come up with a PowerShell version of this tool as well).

I remembered that back in NAV 2013, when we didn’t have the out-of-the-box function “Export-NAVApplicationObject”, I created some functions in PowerShell to do just that. It was basically a wrapper around the existing command-line option to export objects, using finsql.exe.

I decided to copy most of that business logic into a new function, and extend it with also calling the new txt2al.exe (which you can find also in the “RoleTailored Client” folder – at least in the current TENERIFE version). The end-result is a somewhat-messy-but-it-does-the-trick-function that I decided to call “Export-NAVALfromNAVApplicationObject“. You can find it on my github.

The flow

And this is basically how it works:

  1. First, I need to export the objects from a certain database, identified by the server instance. But I can’t use the default function for that; I need something else, because Microsoft introduced a new “command” to finsql.exe that I need to call. While the out-of-the-box function uses the “exportobjects” command, in this case I need “ExportToNewSyntax”.
  2. When exported, I need to split the objects – basically because I might have exported multiple objects, and I need to convert all of them to individual .al files :-). But also because txt2al.exe expects a source folder, not a source file.. .
  3. As a last step, I call txt2al.exe, convert all the files to the result directory .. and show this directory (either in VS Code or just in a file explorer).
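For reference, the command-line calls that the function wraps look roughly like this. Mind you, this is a sketch: the install path, server name, database name and filter are examples from my environment – yours will differ:

```powershell
# Example path -- the version folder depends on your NAV installation
$rtcFolder = 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client'

# Step 1: export objects with the new 'ExportToNewSyntax' command of finsql.exe
& "$rtcFolder\finsql.exe" `
    "command=ExportToNewSyntax, servername=localhost, database=DynamicsNAV, file=C:\WorkingFolder\Exported.txt, filter=Name=*Wizard"

# Steps 2 & 3: after splitting the export into one file per object,
# txt2al.exe converts a folder of txt files into .al files
& "$rtcFolder\txt2al.exe" --source=C:\WorkingFolder --target=C:\WorkingFolder\Result
```

My “Export-NAVALfromNAVApplicationObject” function does exactly this, plus the splitting in between.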

And here is how you can use it…

Just try for yourself .. here is an example which you can execute straight on the NAV Developer Preview VM. It installs my modules, and executes the conversion for all “wizards” in the database of the ServerInstance “NAV”, and shows the converted files in VSCode:

 

Find-Module | Where-Object Author -eq 'waldo' | Install-Module

Export-NAVALfromNAVApplicationObject -ServerInstance NAV -WorkingFolder c:\WorkingFolder -TargetPath c:\WorkingFolder\Result -Filter 'Name=*Wizard' -OpenResultFolderInVSCode

 

Happy days :-)

Enable GIT on your NAV Developer Preview


When doing sessions about VSCode, one of the main advantages people always talk about (in comparison with our classic C/AL development environment (aka C/SIDE)) is “Git Integration”. Or “Source Control Management right out-of-the-box”. I would even say that it’s difficult to work with VSCode without actually version-controlling your software.

Out-of-the-box GIT Support

VS Code ships with Git support. So, when you install VSCode, you have a bunch of Git functionality right under your fingertips. Basically everything you expect from a source control management system is just there! Clone a repo, branch, commit, pull/push to remote, … !

Now, this post isn’t about what Git is, nor how it works .. not even about how Git works together with VSCode, because there is plenty out there for you to read about that! A very good starting point is simply the VSCode documentation about “version control”, which you can find here: https://code.visualstudio.com/docs/editor/versioncontrol.

It gives you an overview on what you can do with Git in VSCode – and refers to some other SCM Extensions in VSCode as well. Very interesting!

This post is about how you can get this to work on the “NAV Developer Preview”. You know, the preview of the new developer tools, the new way to build extensions, Extensions V2, .. ? Never heard of it? Read here (and be ashamed! ;-)).

How to get GIT working on the NAV Developer Preview

Well, basically, the support for Git is there, but you need to install the Git binaries for it to work. So that needs to be done first. That’s not easy on the NAV Developer Preview VM on Azure, though, because it basically doesn’t allow you to download files with Internet Explorer.

Basically, you need to get the downloads from http://git-scm.com/. But you will see this error when you try to do that:

Easy, right? Just add that to your trusted sites.. . Well, think again. After some investigation, you’ll see it tries to download from GitHub. Adding both github.com and git-scm.com to your trusted sites will not help. You can, however, start mingling with server settings – well, I mingle with PowerShell. So, you basically have to figure out what the real download is, which you can find here. Just copy the link that says “click here to download manually”. That one should give you a “github” link. The current one is: https://github.com/git-for-windows/git/releases/download/v2.13.0.windows.1/Git-2.13.0-64-bit.exe .

You could download the file on some other machine and copy it to the VM .. but I download the file simply with PowerShell. Here is a script you can just execute, and it will install Git for you as well:

 

# Make sure the download folder exists
New-Item -Path 'C:\DOWNLOAD' -ItemType Directory -Force | Out-Null

# Download the Git installer straight from GitHub
Invoke-WebRequest `
    -Uri 'https://github.com/git-for-windows/git/releases/download/v2.13.0.windows.1/Git-2.13.0-64-bit.exe' `
    -OutFile 'C:\DOWNLOAD\Git.exe'

# Run the installer silently
& 'C:\DOWNLOAD\Git.exe' /VERYSILENT

That’s it! In a way, you’re good to go now. For any new project that you start, just “Initialize Repository”.

Although ..

Your first commit will fail

You’ll see that when you want to commit, you need to set up “yourself” by providing a username and email to Git – which kind of makes sense, since you haven’t identified yourself yet. The easiest way is to execute these commands in the terminal:

git config --global user.email "you@example.com"

git config --global user.name "Your Name"

Like:

And yes, now you ARE good to go. At least for the most basic things. As said, I’m not going into that.

Github

So what about GitHub? Or what some call “remote” repositories. Well – VSCode supports that as well – and it’s easy. What I always do (and was taught by Soren :-)) is to clone a repo from GitHub and go from there. I follow these steps:

  1. CTRL+SHIFT+P
  2. Enter “>git: clone”
  3. Now you need the URL. Let’s just take the very repo that Microsoft uses to get information and such: https://github.com/microsoft/al
  4. Next, you need to give a “Parent Directory” – basically where the files will be stored, like “C:\Repos”

The files will be downloaded to that folder, and you’re good to go to contribute. You can commit, and push your changes to GitHub.
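If you prefer the terminal over the command palette, the same flow boils down to a few plain git commands (repo URL from the steps above; local paths and commit message are just examples):

```powershell
# Clone the repo into a parent directory
git clone https://github.com/microsoft/al C:\Repos\al

# Commit and push your changes back to the remote
cd C:\Repos\al
git add .
git commit -m "my contribution"
git push
```

VSCode picks the repository up automatically when you open the folder, so you can mix the terminal and the Source Control view as you like.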

And that’s how you can get going on the NAV developer preview. So, don’t just try out the new “al” language .. also get familiar with VSCode – and its integration with GIT when you’re at it!

AL with VSCode: Symbol Rename


When you’re a NAV developer, you’re not really spoiled with developer tools. So these are exciting times .. and I think I can speak for many developers when claiming I’m very much looking forward to the upcoming developer tools.

Basically because it brings us VSCode .. and VSCode brings us a grown up development experience.

Symbol Rename

One example is “Symbol Rename”. You can do that by simply pressing F2 (or right-clicking / Rename Symbol), and giving the symbol another name. It will rename that symbol across files – and yes, even when the file is not opened.

Don’t believe me?

Let’s look at this example. It’s the “Hello World” sample on the NAV Development Preview VM, where I’ll rename a few functions – one that is used within the file itself, and one that is used in another file:


Cool stuff!

AL with VSCode: Run Extension Objects (with PowerShell)


Some time ago, Mark blogged an interesting article about running objects from extensions. You can read it here: https://markbrummel.blog/2017/05/20/tip-58-run-extension-objects/ .

It made me wonder .. how would I solve this? And one of the problems of asking myself these kinds of questions is .. the answer is always the same :-/:

Powershell

And apparently, I had already blogged something really similar as well :-):

http://www.waldo.be/2015/07/09/start-nav-object-in-rtc-with-powershell/

I created some functions in my module to be able to run objects in a Windows client. This enables me to do something like this:

Running a codeunit:

Start-NAVApplicationObjectInWindowsClient `
  -ServerInstance NAV `
  -ObjectType Codeunit `
  -ObjectID 70000000

Running a table:

Start-NAVApplicationObjectInWindowsClient `
  -ServerInstance NAV `
  -ObjectType table `
  -ObjectID 70055050

Works like a charm for Extension objects as well :-).

What about the Web Client?

Well .. duh ;-). Only, the Web Client only supports running a Page or a Report. So in the latest version of my PowerShell modules, I included a function to support just that: Start-NAVApplicationObjectInWebClient.

It’s very much the same as the other Start function – here is an example:

 

Start-NAVApplicationObjectInWebClient `
  -WebServerInstance DynamicsNAV100 `
  -WebClientType Tablet `
  -ObjectType Page `
  -ObjectID 700500000

You can see it also supports running the tablet client and the phone client. How easy can it get?

So, do I need a new project with PowerShell stuff to combine with my AL development?

Most certainly not! You can easily combine PowerShell scripts with AL development! Just take this screenshot as an example:


This is just a simple example where I included a PowerShell script in a subfolder of the workspace of an AL project. It remains an AL project (mind the “launch.json”), but I’m able to execute PowerShell as part of the workspace – not with F5, but with F8 (execute the selection).

How do I make your modules available on my system?

You can make my modules available on your machine simply by installing them from the PowerShell Gallery, by executing this:

Find-Module | Where-Object Author -eq 'waldo' | Install-Module

Enjoy!
