Solved – corrupted index in a managed solution

In my last article, I mentioned that we got stuck with a corrupted index in a managed solution. The index became corrupted because the width of the index we defined exceeded the maximum width of an index in SQL Server. The hard lesson we learned is that we should create small keys, in which the combined size of all fields does not exceed 900 bytes (link).
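To get a feeling for how quickly you hit that limit, here is a small back-of-the-envelope sketch. The field sizes are my own example, not our actual key; single-line text fields are stored as nvarchar, which costs 2 bytes per character in SQL Server:

```csharp
// Back-of-the-envelope check for a hypothetical alternate key (my own example,
// not the actual key from our solution). nvarchar takes 2 bytes per character,
// so the maximum lengths of the key fields add up fast.
class KeyWidthCheck
{
    static void Main()
    {
        int accountNumber     = 100 * 2;  // nvarchar(100) -> 200 bytes
        int externalReference = 250 * 2;  // nvarchar(250) -> 500 bytes
        int countryCode       = 150 * 2;  // nvarchar(150) -> 300 bytes

        int indexWidth = accountNumber + externalReference + countryCode;

        // 1000 bytes > 900 bytes: SQL Server cannot create the index and the key ends up "Failed".
        System.Console.WriteLine("Estimated index width: {0} bytes", indexWidth);
    }
}
```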

On the development server we could repair the situation by dropping the keys we had defined. On the target environments to which we had deployed the solution, the faulty key could not be deleted. In fact, we were not able to update the managed solution at all. We got stuck.

We asked Microsoft to alter the created indexes, making the keys smaller. After a couple of requests (one request per environment) the indexes were made smaller (in the case of an on-premise environment we would have done the index repair ourselves). The environments were ready to be fixed…

[Image: Hostingsolution]

From there on, we performed the following steps per environment to get rid of the indexes in the managed solution.

  1. Alter the solution on the development environment by removing the keys we defined.
  2. Make a managed export of the altered solution.
  3. Copy the exported managed solution and rename it.
  4. Open the renamed solution file and alter the unique name of the solution in solution.xml.
  5. Import the altered and renamed solution into the target environment.
  6. Drop the original solution from the target environment.
  7. Import the managed solution (created in step 2).
  8. Drop the altered and renamed solution from the target environment.

This looks like a large number of steps, and in fact it is. But it was the only way we could drop the corrupted index on the target environments while preserving all data!

The trick is that by installing the renamed solution (step 5) we preserved the data (same publisher). By uninstalling the original solution, we dropped all modifications made by that solution (including the corrupted index). Because the renamed solution contains the same entities, the data is kept in the new entities 🙂

By repeating the trick and installing the altered solution from step 2 over the solution imported in step 5, the data is preserved in the entities defined in the altered solution. By removing the renamed solution, the system reverts to the state it should be in.
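For those who prefer to script steps 5 to 8 instead of clicking through the UI, a minimal sketch with the SDK could look like the snippet below. The unique names and file paths are made up for the example, and error handling and publishing are left out:

```csharp
using System;
using System.IO;
using System.Linq;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class SolutionSwap
{
    // Imports a managed solution file into the connected organization.
    public static void Import(IOrganizationService service, string zipPath)
    {
        service.Execute(new ImportSolutionRequest
        {
            CustomizationFile = File.ReadAllBytes(zipPath),
            OverwriteUnmanagedCustomizations = false
        });
    }

    // Deletes (uninstalls) a solution by its unique name.
    public static void Delete(IOrganizationService service, string uniqueName)
    {
        var query = new QueryExpression("solution") { ColumnSet = new ColumnSet("solutionid") };
        query.Criteria.AddCondition("uniquename", ConditionOperator.Equal, uniqueName);

        var solution = service.RetrieveMultiple(query).Entities.Single();
        service.Delete("solution", solution.Id);
    }
}

// The swap itself, following steps 5-8 (names and paths are hypothetical):
// SolutionSwap.Import(service, @"C:\temp\MyProduct_renamed_managed.zip"); // step 5
// SolutionSwap.Delete(service, "MyProduct");                              // step 6
// SolutionSwap.Import(service, @"C:\temp\MyProduct_managed.zip");         // step 7
// SolutionSwap.Delete(service, "MyProduct_renamed");                      // step 8
```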

Case closed!

Stuck – corrupted index in a managed solution

Sometimes there are these big walls you run into. Today, it was my turn.

In the project I’m working on, I finished a user story in which I had to add a new action to one of the products we are building. After having tested the action, it was time to export the solution in order to import it as a managed solution on the target environment.
On the target environment we are building a custom solution for one of our customers. This custom solution requires the action I implemented in the standard product.

I opened up the solutions page in the target environment in order to update the solution. The import wizard started and ended pretty soon with an exception.

[Image: Stuck2]

The current version of the product contains a key definition that is too big to be stored as an index on SQL Server (more than 900 bytes). *ouch*

When I look at the already installed managed solution, I see that there was a problem creating the keys (Status: Failed). Someone must have overlooked this.

[Image: Stuck3]

When I try to delete them, I get an error message indicating that I cannot remove the key because it is a managed key.

[Image: Stuck4]

When I try to remove the keys using the object model, I get the same message…

[Image: Stuck5]
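For completeness, something along these lines is what I mean by "the object model"; the entity and key names are made up for the example. Against a key that belongs to a managed solution, this throws the same fault as the UI does:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

static class KeyCleanup
{
    // Attempt to drop an alternate key through the SDK (entity and key names are hypothetical).
    // On a key owned by a managed solution this fails with the same
    // "cannot delete a managed key" message as the UI.
    public static void DeleteKey(IOrganizationService service)
    {
        service.Execute(new DeleteEntityKeyRequest
        {
            EntityLogicalName = "account",
            Name = "new_accountkey"
        });
    }
}
```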

Whatever I do, I’m stuck!

It turns out that in the System tasks there is actually a large number of failed index creation tasks:

[Image: Stuck6]

I stumbled on an old blog post on MSDN by Gold Coast – dated 29 May 2012 – in which he described a walkthrough to remove items from a managed solution.
This unsupported walkthrough involved quite some juggling with solutions, but in the end you were supposed to end up with a fixed environment.

Unfortunately for me, this didn’t work out on CRM Online 2015 Update 1. The import process stops at the point where it is stacking the managed solutions, as there is an error in the existing index definition.

What makes me wonder: how was it possible in the first place that the index was added to CRM? Shouldn’t there have been a verification that the size of the key does not exceed the index width limit in SQL Server?

Some of you probably think: “Big deal! Just uninstall the managed solution and install the updated one.”

Normally I would perfectly agree with you on this matter…  Normally….

But the solution has already been deployed to 10 different online environments, including the production environment (which contains actual data).

I guess it is time to call Microsoft Support to see if they can come up with a solution. In the meantime we will look for a way to get rid of the corrupted indexes in a more or less supported way. As soon as we find a solution, I’ll inform you about it.

If any of you have a suggestion, please feel free to share it!

I would really appreciate it.

Debugging 101: The Pitfall of the plugin profiler

When developing for CRM, you sometimes have to debug your code. With the CRM on-premise versions, you could simply attach the Visual Studio debugger to your CRM server.

You only had to install the Visual Studio remote debugging tools, update the plugin, copy the corresponding plugin .pdb file to the assembly bin folder on the server, and attach the Visual Studio debugger to the CRM processes on the remote server (either the W3WP, sandbox or async processes), and you were good to go.

Back then, debugging was a breeze! Nowadays, using CRM Online, debugging is much harder.
Instead of connecting the Visual Studio debugger directly to one of the CRM processes on the remote server, we attach the debugger to the Plugin Registration Tool, in which we replay a profile recorded with the profiler. Before we can do that, we have to install the profiler on the CRM organization to which we are connected, and after that attach the profiler to the plugin steps we want to debug.

[Image: Profiler1]

Then we have to perform the desired actions in CRM in order to record the profile, before we can play back the recorded profile. The recorded profile is a sort of snapshot containing all the calls and values of the activity recorded in the plugin.

From the blog of the great Hosk, I got this quote:

Replaying plug-in execution does not require a connection to a Microsoft Dynamics CRM server and organization. The advantage of this method is that you can obtain the plug-in execution profile from a customer and debug the plug-in remotely. A restriction of the replay feature is that you cannot change the sequence of calls your plug-in code makes in the debugger while you are debugging.

The replay feature provides the plug-in with a snapshot of the call data and event execution context from the Microsoft Dynamics CRM server. You are getting the same events, GUIDs, and so on from calls to the Organization service but no data is being modified on the server as you debug the plug-in. During the debugging procedure in the previous section, the plug-in actually connects to the server and makes calls in real time.

I hear you think!  This is awesome!

Well, sort of; actually, in a lot of cases it is. You can replay the actions as often as you want. No data is saved to CRM (this is great), however… the data during the replay is fetched in real time from CRM. * uh oh *

This caused me a big headache last week, when I had to resolve a bug in a plugin that updates a field of an entity.
During the replay of the profile I saw data being fetched from CRM and put into a field of the entity. Once the plugin finished executing, the data I had put in the field was not saved. I was baffled!

How was this possible?

Putting trace statements in the plugin finally led me to the solution. It turned out that I was assuming that an association between two records was already in place, while in fact it still needed to be created at that point in time. While replaying the profile, the association was already in place, resulting in a value (while at the real point in time, there was no value).
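A stripped-down sketch of the pattern that bit me is shown below. The entity and field names are made up, but the shape is the same; the important part is the query in the middle, which runs against the live organization even while the profile is being replayed:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class SetRelatedValuePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        var service = factory.CreateOrganizationService(context.UserId);

        var target = (Entity)context.InputParameters["Target"];

        // This query runs in real time, even when the profile is being replayed.
        // At replay time the association already existed, so I saw a value here,
        // while at the original execution time the query came back empty.
        var query = new QueryExpression("new_membership") { ColumnSet = new ColumnSet("new_discount") };
        query.Criteria.AddCondition("new_accountid", ConditionOperator.Equal, target.Id);

        var memberships = service.RetrieveMultiple(query).Entities;
        tracing.Trace("Memberships found: {0}", memberships.Count);

        if (memberships.Count > 0)
        {
            target["new_discount"] = memberships[0].GetAttributeValue<decimal>("new_discount");
        }
    }
}
```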

In my eyes, the behaviour of the plugin profiler is not correct. The plugin profiler should never make calls to CRM in real time. By making those calls, the data you see during the replay of the profile does not represent the data in CRM at the point in time of the plugin execution. This can be a big pitfall!

I learned a valuable lesson the hard way: never blindly trust the data you see in the plugin profiler! The data is being fetched in real time!

As long as you keep that in mind, the tool can be pretty useful!  😛

Using Azure Table Storage from within CRM

In my last article I described a scenario in which large amounts of data are moved from CRM to Azure Table Storage. The idea behind this operation is to offload inactive data to alternate storage, keeping it accessible while keeping CRM lean and mean.

Today I had too much time to waste, so I started to experiment. The goal of the experiment was to find out if Azure Table Storage can be hooked up easily to CRM.

In this article I’ll share the things I learned from this exercise.

For this experiment I decided to write a useless plugin that is triggered by the RetrieveMultiple message on the account entity. All records fetched are written to Azure Table Storage. * why?  don’t ask *

The things that I had to investigate were:

  • How do I manage my Azure account?
  • What tools can I use?
  • How do I write to Azure Table Storage?
  • How can I retrieve data from it?
  • How do I integrate the Azure code in my plugin?
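To give away a little bit already: the "write to Azure Table Storage" part boils down to something like the sketch below, using the classic WindowsAzure.Storage client library. The connection string, the table name and the shape of the row are my own assumptions for this example:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;        // classic Azure Storage client library
using Microsoft.WindowsAzure.Storage.Table;

// The row we store per account record (the properties are an example, not the full account).
public class AccountRow : TableEntity
{
    public AccountRow() { }

    public AccountRow(Guid accountId, string name)
    {
        PartitionKey = "account";            // one partition per entity type (assumption)
        RowKey = accountId.ToString("D");    // the CRM record id keeps rows unique
        Name = name;
    }

    public string Name { get; set; }
}

public static class AccountArchive
{
    public static void Write(string connectionString, Guid accountId, string name)
    {
        var storageAccount = CloudStorageAccount.Parse(connectionString);
        var table = storageAccount.CreateCloudTableClient().GetTableReference("accounts");
        table.CreateIfNotExists();

        // InsertOrReplace makes the operation idempotent, which is handy when
        // RetrieveMultiple keeps handing us the same records over and over again.
        table.Execute(TableOperation.InsertOrReplace(new AccountRow(accountId, name)));
    }
}
```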

Continue reading

Food for thought: A scenario for handling massive amounts of data – part 2

In my last article I described a scenario for handling massive amounts of data. The scenario intrigued me. How could I implement a scenario like this?

On one side you find Dynamics CRM, a massive online environment. On the other side you find Azure, an ever growing and maturing cloud platform. The challenge we are facing is to move data from one (cloud) platform to the other.

In this article I’ll describe how I would implement this scenario.

[Image: Ats1]

Continue reading

Food for thought: A scenario for handling massive amounts of data

Today I overheard a very interesting discussion at the office.

For one of our customers we’ve built a massive CRM implementation based on CRM 2015 Online, an implementation in which a large number of invoices is going to be created: at least 500.000 invoices per year.

In order to create those invoices, a massive amount of data in a large number of entities is required. Information that remains inactive in the system once the invoices have been created.

Millions and millions of inactive records will reside in CRM’s database: In my eyes a waste of resources.

The inactive data will clutter up the system, making it harder to navigate, slowing down processes and eating up storage space. My prediction is that the performance of the invoice generation process will degrade seriously over time.

Continue reading

A case of inspiration: Locking down CRM for maintenance

Today we had a discussion at the office about an upcoming migration at a customer. Their Dynamics CRM environment is getting a complete overhaul, and a large number of external systems need to be hooked up. A migration for which we are preparing multiple scenarios.

While the migration is going on, it would be really nice to lock out all users in a friendly way, preferably with a nice message indicating that the system is offline for maintenance.

[Image: News_maintenance201302]

Dynamics CRM offers some methods to restrict user access while updating.

  • CRM On-premise/Online: Disable almost all users
    Out of the box and simple, but prone to errors: you could lock out your own account (a scripted variant is sketched below this list).
    Locking out users can be done under Settings > Security > Users.

[Image: Lockout1]

  • CRM Online: Administration mode
    When you place a Sandbox instance in administration mode only users with CRM System Administrator or System Customizer security roles will be able to sign in to that instance. Administration mode is useful when you want to make operational changes and not have regular users affect your work, and not have your work affect regular users.
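The clicking in the first method can also be scripted, which makes it easy to skip your own account. Below is a minimal sketch with the SDK; it is just an illustration, without error handling (and note that on CRM Online, users are also managed through the Office 365 portal):

```csharp
using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class MaintenanceLockout
{
    // Disables every enabled user except the (admin) account performing the lockout.
    public static void DisableAllUsersExcept(IOrganizationService service, Guid keepUserId)
    {
        var query = new QueryExpression("systemuser") { ColumnSet = new ColumnSet("systemuserid") };
        query.Criteria.AddCondition("isdisabled", ConditionOperator.Equal, false);

        foreach (var user in service.RetrieveMultiple(query).Entities)
        {
            if (user.Id == keepUserId)
                continue; // do not lock yourself out

            service.Execute(new SetStateRequest
            {
                EntityMoniker = new EntityReference("systemuser", user.Id),
                State = new OptionSetValue(1),   // 1 = disabled for systemuser
                Status = new OptionSetValue(-1)  // -1 = let CRM pick the default status reason
            });
        }
    }
}
```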

The second method is the most efficient one and the least prone to errors. However, this functionality is only available in CRM Online, and regular users may get confused about why they no longer have access. * And this made me wonder *

Why does CRM not offer a mechanism with which an administrator can lock the system for maintenance, and which informs the users in a nice way that the system is down for maintenance?

While driving back home, some ideas popped up in my head. Ideas that I want to share with you.

Continue reading

Hardcore CRM: Using C# to modify entity forms

I thought this year would start at a normal pace. Unfortunately, that was wishful thinking. Right now we are in the middle of finishing up a large project. In this phase of the project the two products we developed have to integrate with a tailor-made solution for the customer.

In the process of deploying the two managed solutions and the unmanaged custom solution (with the customer-specific implementation), we sometimes run into the problem that some customizations (e.g. form event scripts) are not wired up correctly.

For that problem we wrote a manual with a large number of post-configuration steps: steps that describe which entity forms we have to check, which events should be wired up, etc. A tedious process if you have to do it multiple times.

At the office we discussed whether we should build a tool that handles the post-configuration steps for us automatically. Unfortunately, our project manager didn’t think it was a good idea at this point in time (writing such a tool can cost a lot of time, and time is the resource we are lacking at the moment).

The idea kept haunting me. Tonight I couldn’t resist, so I fired up my Visual Studio environment and started to experiment.
On my CRM environment I have a solution with a simple form. On that form I wired two events (form onload, field onchange) to a JavaScript function.
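A good first step in such an experiment is to read the form definition straight from the systemform entity and list the events that are wired up. Below is a minimal sketch; the entity name is a placeholder and the exact FormXml structure is best checked on your own form before writing anything back:

```csharp
using System;
using System.Xml.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

static class FormInspector
{
    // Lists the script handlers wired up on the forms of a (hypothetical) entity.
    public static void ListHandlers(IOrganizationService service, string entityLogicalName)
    {
        var query = new QueryExpression("systemform") { ColumnSet = new ColumnSet("name", "formxml") };
        query.Criteria.AddCondition("objecttypecode", ConditionOperator.Equal, entityLogicalName);

        foreach (var form in service.RetrieveMultiple(query).Entities)
        {
            var formXml = XDocument.Parse(form.GetAttributeValue<string>("formxml"));

            // Every <Handler> element under the form events represents a wired-up script function.
            foreach (var handler in formXml.Descendants("Handler"))
            {
                Console.WriteLine("{0}: {1} -> {2}",
                    form.GetAttributeValue<string>("name"),
                    handler.Parent?.Parent?.Attribute("name")?.Value,   // the event, e.g. onload
                    handler.Attribute("functionName")?.Value);          // the JavaScript function
            }

            // Modifying the XML, calling service.Update(form) and publishing the entity
            // (PublishXmlRequest) is where the real experiment starts.
        }
    }
}
```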

Continue reading

Hardcore CRM: Birth of a C# CRM code generator

Almost a year ago I started this blog. Today I’m writing the 100th article on this blog. For me this is a moment worth celebrating.

Looking back at last year, I dug up some statistics on JourneyIntoCRM.com that I want to share with you:

  • 1 year of blogging
  • 1 Blogger
  • drinking at least 34 litres of coffee while blogging
  • spending over 400 hours on 100 blog articles
  • articles containing over 70.000 words with an average of 125 Kilobytes of data per page
  • serving over 125.000 visitors and more than 600.000 pages
  • resulting in more than 72 Gigabytes of data traffic
  • consuming 192 KWh of power

It was definitely worth it…

For me a celebration is no celebration without any gifts. Therefore, inspired by my last article, I decided to create a small gift for you.

Over the last few days I’ve been working on an intelligent code generator that you can use to speed up your software development projects. And the great thing is that I’m offering it as a free-to-use download (no strings attached).

Continue reading

Hardcore CRM: Playing with meta data

One of the advantages of being a software engineer is that you can automate your job. Let some handy tooling do a lot of the dirty work, allowing you to focus on stuff that matters.

Back in the day I used metadata to automate my job, which is writing code (below is an example of one of my old generators).

[Image: Codegenerator]

As soon as I could lay my hands on a solid relational data model, I would write code to generate code. This resulted in solid and easy to maintain software projects.

A few weeks ago I was training some developers in extending Dynamics CRM. One of the tools discussed in the training is the SDK tool CrmSvcUtil.exe, a command-line tool that you can use to generate C# code for your projects. A nice tool, but it generates C# code for every entity available in CRM, creating a lot of overhead.
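What I was after instead: pull the metadata yourself and only generate code for what you actually need. A minimal sketch of the metadata side could look like this; the whitelist is just an example, and the actual code generation is left to your own templates:

```csharp
using System;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

static class MetadataFeed
{
    // Pulls entity and attribute metadata and hands only the entities we care about
    // to our own code generation (the whitelist is just an example).
    public static void Generate(IOrganizationService service)
    {
        var response = (RetrieveAllEntitiesResponse)service.Execute(new RetrieveAllEntitiesRequest
        {
            EntityFilters = EntityFilters.Entity | EntityFilters.Attributes,
            RetrieveAsIfPublished = true
        });

        var wanted = new[] { "account", "contact", "invoice" };

        foreach (EntityMetadata entity in response.EntityMetadata.Where(e => wanted.Contains(e.LogicalName)))
        {
            foreach (AttributeMetadata attribute in entity.Attributes)
            {
                // Feed LogicalName, AttributeType, required level, etc. into your own templates here.
                Console.WriteLine("{0}.{1} ({2})", entity.LogicalName, attribute.LogicalName, attribute.AttributeType);
            }
        }
    }
}
```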

Continue reading