Sonoma Partners Microsoft CRM and Salesforce Blog

The 10 Dashboards Every Project Manager Should Use

Today's blog post was written by Brian Kasic, Principal Consultant at Sonoma Partners.

In many business environments, using dashboards is a common theme for daily management. But have you ever used them to manage your team or group? I like the saying, “You can’t manage what you don’t measure!” When I became a program manager and supervisor of a large team, I really took this concept to heart and have used it ever since. However, managing via dashboards successfully is harder than it seems to pull off. Today’s blog is about the key elements I use on a daily basis to manage a multi-million-dollar project with a 75-person project team dispersed across multiple time zones around the world.

The ideal scenario is looking at dashboards targeted for each individual.

They show the most important items to accomplish for that day, tell you what to expect in the future, easily demonstrate progress on what’s been accomplished, and can also be run to create an instant status report.

CRM systems are ideal for managing in this fashion. They are easily adjustable, intuitive, and make it simple to export data for deeper Excel analysis. You can also build out the underlying dashboard data so your dashboards can be drilled into when you want to see a specific team’s or individual’s progress. They can be secured to target various groups of executive management or be made available to the entire team.

Here are my top 10 dashboards, organized from the viewpoint of a program manager. They can be used by different members of the team; everyone from the individuals in the nitty-gritty details all the way up to executive management has a varying interest in each of these themes, and each person plays a part in supporting the details that make them usable. The dashboards only work well if you have a process to consistently and accurately supply them with data and act on what they are telling you. If not, you are out of luck. So before you start dashing around spinning up dashboards, please make sure your underlying data is consistent and clean. Also ask yourself, “Once I have this dashboard, how am I going to use it?” You should be able to take action on every dashboard you use. If not, question its value.

In the list below are key dashboard themes and the underlying ways they are used.

DASHBOARD TYPE #1: Resources

USE: Shows future staffing needs and if the team size should be increased or reduced.

ACTION: Moves resources from various areas of the project to maximize utilization.

DASHBOARD TYPE #2: Workload / Capacity

USE: Measures time estimates vs. time available.

ACTION: Adds more capacity to teams that need the assistance.

DASHBOARD TYPE #3:  Detailed Progress

USE: Shows past due tasks and ownership.

ACTION: Assists team members who have questions and identifies problem areas within the project.

DASHBOARD TYPE #4: Project Blockers

USE: Assists in running the daily stand-up meeting to highlight items hindering progress.

ACTION: Identifies next steps to clear the impediments.

DASHBOARD TYPE #5: Communication / Status Reporting

USE: Generates a weekly status report with items accomplished for the past week and planned for the future week.

ACTION: Engages the team if areas are behind schedule or milestones were accomplished.

DASHBOARD TYPE #6: Escalations

USE: Highlights problem areas for upper management.

ACTION: Shifts priorities to address the items requiring management attention.

DASHBOARD TYPE #7: Daily Meetings

USE: Runs daily meetings with real-time data.

ACTION: Makes edits on the spot. Eliminates the need to take detailed meeting notes if all the data exists in the dashboards and is real-time within the system.

DASHBOARD TYPE #8: Variance Analysis

USE: Explains why variations happened.

ACTION: Takes action on variances that seem unusual.

DASHBOARD TYPE #9: Risk / Mitigation

USE: Shows risks.

ACTION: Documents plan to mitigate or reduce the risk area.

DASHBOARD TYPE #10: Roadmap / Project Plan

USE: Visualizes what is upcoming for the project team and gives visibility into the schedule.

ACTION: Updates the plan if circumstances change.

Hopefully, after seeing my top 10 dashboards, you can more effectively manage your projects and teams. If you ever have questions on this or other CRM management challenges, drop us a line!


Topics: CRM Best Practices

Import and Export Better than Art Vandelay

Today's blog post was written by Nick Costanzo (Principal Consultant) and Nathan Williams (Consultant) at Sonoma Partners.

If you've ever had to use the native import tool for Dynamics 365, you've more than likely had the experience of running into import errors of some sort. These errors are not always easy to resolve, and if you're importing large volumes of data, sorting through the errors can be very time consuming. Here at Sonoma Partners, we've had situations where client import files have 50k+ records and have resulted in thousands of errors on the initial import into a test environment. Dynamics 365 offers the ability to export the errored rows, but it doesn’t include the error codes. The export only includes the rows with the data you had already included in your import, which is not very helpful. Furthermore, you cannot access the error logs through Advanced Find.

Our team set out to find a better way to tackle this situation using Power BI.

Through our efforts, we came up with the following approach to better analyze these errors and resolve them more quickly. After all, we don’t want you to start yelling, “George is getting angry!” while dealing with import errors.

Here’s the approach we took:

1. First connect to CRM by choosing Get Data > OData Feed:

Nick c 1

2. Then choose the Import Logs and Import Files entities.

3. Next pull in the Web Service Error Codes published on MSDN, by choosing Get Data > Web:

Nick c 2

a. Note: Power BI will recognize the table of error codes on this page, but you will need to massage the data to get the Error IDs and text into their own columns:

Nick c 3

4. Now you can create your data model with relationships between these 3 tables:

Nick c 4

5. With your relationships in place, you can now create a report with visualizations to categorize your errors:

Nick c 5

  1. Create a slicer for the Import Date.
  2. Create a slicer for the File Name, in the event you have multiple files to import.
  3. Create a slicer for the Target Entity.
  4. Create a bar chart to count the errors per Field and Error Name.
  5. Create a bar chart to group the error by the field Value (i.e.  GUID from the source system).
  6. Create a table to display the record(s) based on which slicers have been selected.

6. The report now allows you to easily focus on which errors need to be fixed. In this case, we can see that 2 records were responsible for 1468 and 305 errors where the lookup could not be found. By fixing these 2 values, we’re much closer to a clean data set and can move on to the next ones.

7. Once you have resolved all errors in your source files, you can now reimport with a much higher level of confidence that the job will be successful.

If you wanted to take this a step further, you could set this up to analyze your data before importing to make sure it's clean. You would need to set up your lookup tables as data sources and update the data model with those as well. If you’d like help with this, feel free to contact us, and our Power BI team would be glad to help! Either way, you can certainly do more importing and exporting than Art Vandelay ever did!
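That pre-import validation idea can be sketched in a few lines of plain Python outside Power BI: check each source row's lookup value against the set of IDs that already exist in the target system, so "lookup could not be found" errors surface before the import runs. The field names and IDs below are hypothetical, not from an actual Dynamics 365 schema.

```python
# Sketch: pre-validate source rows against known lookup values before importing,
# so lookup-not-found errors are caught ahead of time. Names are hypothetical.

def find_bad_lookups(rows, lookup_field, valid_ids):
    """Return the rows whose lookup value is not present in the target system."""
    return [r for r in rows if r[lookup_field] not in valid_ids]

# GUIDs already present in the target system (hypothetical)
valid_account_ids = {"A-001", "A-002", "A-003"}

source_rows = [
    {"name": "Contact 1", "accountid": "A-001"},
    {"name": "Contact 2", "accountid": "A-999"},  # broken lookup
]

bad = find_bad_lookups(source_rows, "accountid", valid_account_ids)
print(len(bad), "row(s) would fail the import")
```

Fixing those rows (or loading the missing parent records first) is exactly the kind of cleanup that eliminated the 1468- and 305-error clusters above.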


Topics: Microsoft Dynamics 365

Data Migration Testing 101

Today's blog post was written by Sid Thakkar, Senior QA at Sonoma Partners.

The concept of data migration testing is very simple: testing is conducted to compare the source data to the migrated data. In other words, we try to discover any discrepancies that arise when moving data from one database system to another. As simple as it might sound, the testing effort involved in a data migration project is enormous, and it often ends up taking a lot of time.

A well-defined testing strategy is essential for delivering a successful data migration.

One of the most important aspects of a successful data migration test can be achieved by using an automated approach to testing. Automation also saves significant time, minimizes the typical iterative testing approach, and gives us the ability to test 100% of the migrated data. The phases of data migration testing include:

  1. Data Migration Design Review
  2. Pre-Data Migration Testing
  3. Post-Data Migration Testing

Data Migration Design Review

It is important for a quality analyst to review the design of the migration specification during the early stages of the migration implementation/configuration. The QA should go through a detailed analysis of the data mapping requirements document prior to the start of any testing. Ideally, we want to note whether any of the columns or fields meet the criteria below.

  1. Change in data type from source to target (e.g. data in source may be represented as a character but in target table the same is represented as an integer)
  2. Modifying the existing data (e.g.  requirement of migrating “status = in progress” in source system to be migrated as “Status = lost” or “telephone = 1234567890” to be migrated as “telephone = 123-456-7890”)
  3. Document all Option Set values, lookups, and user mappings

Pre-Data Migration Testing

Before we jump into any kind of data testing, one should test source and target system connection from the migration platform.

Pre-data migration testing can also be called definition testing, and it takes place before the data migration itself. During definition testing, we check the data type and length of every field in the source database table against the target. For example, the Address_line1 field in the source is of data type Varchar with a length of 50, whereas Address_line1 in the target is listed as Varchar(30). This means there is a potential issue with any data longer than 30 characters in the source table.

For each entity, run a SQL query similar to the one listed below against both the source and target tables to confirm that the field definitions in the two tables are consistent.

Sid 1
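The same definition check can be sketched in a few lines of script once you've pulled the column metadata (in SQL Server, from the INFORMATION_SCHEMA.COLUMNS view) for each side. The column dictionaries below are hypothetical stand-ins for that metadata:

```python
# Sketch: flag fields whose target definition differs from, or is narrower
# than, the source. Metadata is hypothetical; in practice it would come
# from querying INFORMATION_SCHEMA.COLUMNS on each database.

source_cols = {"Address_line1": ("varchar", 50), "City": ("varchar", 30)}
target_cols = {"Address_line1": ("varchar", 30), "City": ("varchar", 30)}

def definition_mismatches(source, target):
    issues = []
    for col, (stype, slen) in source.items():
        ttype, tlen = target.get(col, (None, 0))
        # A type change or a shorter target length is a potential truncation issue
        if stype != ttype or tlen < slen:
            issues.append(col)
    return issues

print(definition_mismatches(source_cols, target_cols))
```

Here Address_line1 would be flagged, matching the Varchar(50) vs. Varchar(30) example above.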

Post-Data Migration Testing

Post-data migration testing is by far the most important phase of migration testing. In situations where we do not have enough time assigned for testing, we can jump directly into this phase. The testing is divided into two parts:

  1. Record Counts
  2. Data Mapping
    1. Unmapped Record Counts
    2. Unmapped Record Values

This can be really easy to test once you understand the data structure of the migration process. In order to automate some of the testing, you will need to find the database names, table names, and primary keys for the entity you are testing. For example, let’s assume you are testing the account migration, the source table name is “Source_Accounts,” the target table name is “Target_Accounts,” and the primary key for both tables is “Account_ID.”

Record Counts

I prefer using Microsoft Excel to automate some of the testing, but you can write programs to do the same. As you can see in the image, I have listed the source and target table names, columns, and primary key in “Sheet1” of an Excel file.

Sid 2
Image 1

You can create a new Excel sheet and write this formula to auto-generate record count queries (see image below).

="select "&Sheet1!B5&" = count ("&Sheet1!B5&") From "&Sheet1!$A$5&" where "&Sheet1!B5 &" is not null"

Sid 3
Image 2

select Address1_AddressId = count (Address1_AddressId)
From Project_Database.[dbo].[Source_Accounts]
where Address1_AddressId is not null

The next step is to run these queries in a SQL window and store the results. Once you repeat the same process for the target table, you should be able to compare record counts for all fields between the source and target tables.
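If Excel isn't handy, the same query generation can be done in a short script. This is a sketch of what the formula above produces, using the table and column names from this post's example:

```python
# Sketch: generate the per-field record-count queries that the Excel
# formula above builds, one query per column.

def count_query(table, column):
    # Mirrors: select <col> = count (<col>) From <table> where <col> is not null
    return (f"select {column} = count ({column}) "
            f"From {table} where {column} is not null")

table = "Project_Database.[dbo].[Source_Accounts]"
for col in ["Account_ID", "Address1_AddressId"]:
    print(count_query(table, col))
```

Run the generated queries against both the source and target tables and diff the counts, exactly as described above.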

Data Mapping

Once we have done the row count testing, we can go one step further and verify that the content matches as well. This phase of testing essentially covers all the testing we have done so far (which is one of the reasons we jump directly to data mapping testing in time-crunch situations).

Unmapped Record Counts

Using Image 1, create a new tab in the same Excel file and write the formula listed below to auto-generate data mapping queries. It’s easier and safer to first find the record counts that did not match and then dive into finding those records. Counting unmapped records is the first step in this process.

="select count(*) From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <>  t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 4

Sid 5

Unmapped Record Values

If the above query for unmapped record counts returns zero for all fields, then the possibility of a successful migration is greater. But it isn’t wise to end the testing effort just yet. I highly recommend that, regardless of the results of the above queries, you go a step further and run the query below to verify the exact value mapping between the source and target tables.

Let’s use Image 1 again and create a new tab in the same Excel file to auto-generate the query for unmapped record values.

="select t1."&Sheet1!B5&" , t2."&Sheet1!E5&" From "&Sheet1!$A$5&" t1 join "&Sheet1!$D$5&" t2 on t1."&Sheet1!C$5&"= t2."&Sheet1!$F$5&" where t1."&Sheet1!B5& " <> t2."&Sheet1!E5& " and t2."&Sheet1!E5& " is not null"

Sid 6
Sid 7
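For completeness, here is the scripted equivalent of that last Excel formula, again only a sketch using this post's example table, key, and a hypothetical Name column:

```python
# Sketch: generate the unmapped-values query the Excel formula above builds,
# joining source and target on their primary keys and listing mismatches.

def unmapped_values_query(src, tgt, src_key, tgt_key, src_col, tgt_col):
    return (f"select t1.{src_col} , t2.{tgt_col} From {src} t1 "
            f"join {tgt} t2 on t1.{src_key}= t2.{tgt_key} "
            f"where t1.{src_col} <> t2.{tgt_col} "
            f"and t2.{tgt_col} is not null")

q = unmapped_values_query("Source_Accounts", "Target_Accounts",
                          "Account_ID", "Account_ID", "Name", "Name")
print(q)
```

Any rows returned are the exact source/target value pairs that disagree, which makes the root cause much easier to spot than a bare count.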


In the next blog, I will discuss how a QA can get involved in writing SSIS packages to be more self-sufficient during any sort of data migration project.

Topics: Microsoft Dynamics 365

Customer Success Story: Building A Stronger Foundation with Salesforce

Before deploying Salesforce, Feralloy lacked a centralized method for tracking and managing existing client relationships. As one of the country’s largest high volume steel processing plants, Feralloy needed a customer relationship management system that could measure up in equal strength to the materials they manufacture.

Feralloy

Who is Feralloy?

Feralloy Corporation consists of an extensive network of plants throughout the United States and Mexico. They bring over 60 years of experience to the table as they deliver high quality processed flat rolled steel.

Supporting Customization

Before CRM, Feralloy managed business operations through a myriad of Word documents, Excel spreadsheets, and verbal and email communication. This system created many problems for their organization, one of which was maintaining data after an employee departure. When a member of their sales team left, so would their customers’ information. Furthermore, without a formal process for managing customers, sales processes differed between business units, making communication and collaboration very challenging. Overall, the sales organization lacked the tools needed to adequately manage existing client relationships and nurture prospective business.

Building Systematic Inventory Management

Sonoma Partners helped Feralloy get up and running with a custom Salesforce deployment and an integration between CRM and their ERP System, STAR (AS 400). The integration allows Feralloy to more accurately document their inventory and assist customers in finding what they need when they need it. By more effectively overseeing their assets, Feralloy drives profitability for not only themselves but also their clients.

Facilitating Employee Management

With improved visibility into team performance, CRM allows Feralloy to more productively manage their employees. Knowledge management tools and a streamlined onboarding process help employees work strategically in their new environment, no matter their experience in the industry.

Going Mobile

Feralloy’s outside sales team can be on the road up to four days a week, working multi-state territories. By leveraging Salesforce1’s native capabilities, Feralloy’s sales teams can handle account and territory management in the field. Since investing in a mobile strategy, they’ve seen an improvement in the accuracy of their data, thanks to their team having a more efficient way to enter it on the go.

Interested in a custom CRM system to call your own? Let us know.


Topics: CRM for Manufacturing Salesforce

Tips on Submitting a Dynamics 365 App to Microsoft AppSource

Microsoft's updated business app store, AppSource, has launched and, as we mentioned before, has been steadily gaining momentum. We’ve submitted a few apps to the store for the Dynamics 365 CRM product, and I want to share some tips to get through the evolving process more efficiently.

Note: Developing an app for AppSource is outside the scope of this article. Instead, I will focus on the submission process once you have a managed solution developed and ready to submit.

 

Process Overview

The app submission process encompasses more than just your Dynamics 365 managed solution file. Microsoft AppSource expects you to have your marketing, support, and image files ready for submission in addition to your solution. Because of this, please consider the following tips:

  • Start your marketing efforts in parallel with your solution packaging efforts; this includes the creation of marketing data sheets, product images, and application icons.
  • Application Icons
    • Icon & image sizing should exactly match the sizes the submission process requests.
    • Try to create all of the app icons requested.
    • Your solution package requires a 32x32 .png logo file. Don't forget to get this completed; otherwise, you can't complete the submission process.
  • You will need an Azure subscription to store your solution package for the submission process to retrieve and test it. Use this handy tool for this process.
    • Note: Microsoft hosts your final solution file for AppSource. This is a temporary location for the submission process to evaluate your solution prior to publishing.
  • You will need to have a license, privacy statement, and supporting marketing data sheet documentation.
  • Don't select CSV for your lead source. This will create daily Excel files and they end up being difficult to manage. Since you have an Azure subscription for the file storage, you can use Azure table or select your cloud-based CRM system.
  • The AppSource review team will send you a document to complete the end-to-end testing steps. This will happen during the process, so be prepared to see it and send to them when requested.

Solution Packaging

You need to take your managed solution file and 'package' it using the solution packaging tool. Follow the article's steps for more detail; the part that might confuse a Visual Studio novice (like me) is updating your references. Here are the minimum steps needed to get a packaged file ready to zip.

  1. Assuming you have installed the package from the link above, create a new CRM project
  2. Click References, right-click and select Manage NuGet Packages
  3. Click Updates and select all and update (this will update your references with the latest files from NuGet)
  4. Copy your managed solution to PkgFolder
  5. Update ImportConfig.xml with package name (and any other settings necessary)
  6. Build and note the location of your debug output file

AppSource Packaging

Microsoft provides you with detailed instructions for this process. This is a lengthy document, so here are the steps I take when preparing an app for the store submission.

First, the sample template zip file originally sent to me was incorrect. It fails to include the required ImportConfig.xml in the PkgFolder. And, while not a mistake, you don't need the privacy.htm file included. Here are the steps I take AFTER I have the solution package built from Visual Studio.

  1. You need a 32x32 logo file! Be sure to create it ahead of time; the size must be exactly 32x32.
  2. Create a folder called Package.
  3. Copy the dll and PkgFolder files from your debug build.
  4. Inside the PkgFolder, delete the /en folder. The only two files necessary are the managed solution zip and the ImportConfig.xml file.
    BE SURE THE ImportConfig.xml file is properly updated with your values.

    image2017-2-16 16-12-43
  5. IMPORTANT: Add the content_types xml file! Grab this file from the template folder (or a previous submission).
    image2017-2-15 14-58-28
  6. Zip this and call it Package.zip. Be careful when you zip you don't get the parent folder. The inside of the zip should match the screenshot of step 5 exactly.
  7. Create another folder (I usually name it AppSource_<AppName>).
  8. In this folder, copy your Package.zip file you just created.
  9. Add the content types xml again, a license file, input.xml file, and the logo. All of these files are required.
  10. Be sure to update the input.xml file with your specific settings.

    image2017-2-15 15-0-33

    image2017-2-15 15-0-57
  11. Zip up the contents of this folder. This zip file needs to be placed on Azure, and a URL to that Azure storage location created, which is then entered into the AppSource submission request.
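The warning in step 6 about not capturing the parent folder is one of the most common missteps. If you want to script the zipping instead of doing it by hand, here is one hedged way to do it, sketched in Python with the folder names from the steps above; the point is that the archive entries sit at the root, with no parent-folder prefix:

```python
# Sketch: zip the *contents* of a folder (not the folder itself), so entries
# like ImportConfig.xml land at the root of the archive, as AppSource expects.
import os
import zipfile

def zip_folder_contents(folder, zip_path):
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                # arcname is relative to the folder, so no parent prefix appears
                zf.write(full, arcname=os.path.relpath(full, folder))

# Example (folder names from the steps above):
# zip_folder_contents("Package", "Package.zip")
```

Opening the resulting zip should show the same layout as the screenshot in step 5, with no extra top-level directory.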

Wrap Up

The process of loading your app to AppSource may appear intimidating at first. However, Microsoft and your service partner can assist you throughout, and Microsoft is continuing to improve the entire submission experience. While these tips don't cover every step required, hopefully they provide a jumpstart past some of the more common missteps we see.

Topics: Microsoft Dynamics 365

Luxury in the Cloud: Preferred Hotels & Resorts℠ Moves to CRM Online

Preferred Hotels & Resorts markets over 650 distinctive hotels, resorts, residences, and unique hotel groups across 85 countries. They work with independent hotels to help them market both to individual travelers as well as organizations looking for locations to host large events and conferences.

Preferred-hotels

Preferred Hotels & Resorts used Microsoft Dynamics 4.0 On-Premise for many years. Michelle Woodley, Executive Vice President, Distribution & Revenue Management at Preferred, and team worked with Sonoma Partners to upgrade the long-outdated system. With an upgrade to Microsoft Dynamics 2016 Online, Preferred Hotels improved the user experience and interface of their system, took advantage of native functionality in place of overly complex customizations, and incorporated mobile functionality.

It was our pleasure to sit down with Michelle for a Q&A session on the progress of the project as Preferred Hotels & Resorts continues to roll out their new CRM solution to their users.

What motivated you to upgrade from Dynamics 4.0 On-Premise to CRM 2016 Online?

Michelle: Dynamics 4.0 On-Premise was simply not cutting it for us. It was very outdated, and we hadn’t touched it since its implementation in 2010. We recently upgraded many of our other systems, the biggest of which was the move to Office 365. Upgrading our CRM system to the cloud fit nicely within that plan.

Why is CRM important to your sales process?

We use CRM on our sales side to target multiple audiences. Hotel Development – selling to hotels – is one of our main sources of business. The second is Group Sales, who targets meeting planners to help them book hotels for their events. The third target includes the transient sales people for corporate and leisure, who work with accounts to bring further business to the hotels. These three different groups are fairly separate in who they’re targeting and in their day-to-day operations; however, to have all three in the same platform is incredibly important from a reporting standpoint and for integrations into other systems, such as billing and commission structures.

“Without a doubt, there is no better tool than CRM for our sales organization to use to do their work effectively. CRM is integral to our operations.”

From an organizational perspective, we use CRM as a tool to protect ourselves. When someone leaves our organization, we know that their data stays with us in our system. Additionally, workflows in CRM help our sales team track projects throughout their lifecycle and inform management on the progress of these opportunities. Within each account, we can track related opportunities to give management and sales a complete picture of each customer in every step of the process.

What steps have you taken to ensure successful user adoption as you continue to roll out the new system?

We’ve found that a big part of ensuring successful user adoption is in paying attention to who is and isn’t adopting. We have some more senior sales people who are stuck in the way things used to be done, and we’re making sure we provide the training and resources they need to feel confident in using the new platform. From the beginning, it was crucial that we got our executive team fully onboard with the upgrade. This continues to be helpful as we roll out our new solution.

That being said, our biggest teaser to encourage our sales team to adopt the new system was mobility. They love having the ability to utilize CRM easily from their mobile device. We encourage them to look forward to future integrations as we continue to get off the ground and into the cloud.

“Mobile was our teaser to encourage people to adopt the new CRM system. People were very excited to have on-the-go access to the system and their account data.”

What is the biggest value-add of CRM to your business?

By far, the biggest value-add of CRM for us is the 360-degree view of the customer. We are a global company with over 250 employees around the world and offices in 20+ countries. No matter which office you’re in or where in the world you are, you can have complete visibility of a customer through CRM. I’ve had it happen to me when someone calls and asks me about an account. I can quickly go into the system and identify what role they play at the hotel, which events they’re going to, which marketing campaigns they’re on, etc. It’s a huge value-add and allows us to work more effectively together as a cohesive organization and better serve our customers.

What is it like working with Sonoma Partners?

We had a great team. I felt Sonoma Partners really understood our business and our needs. I always had the impression that they were on the same wavelength as us, knowing our pace and who to go to with questions. It was a fantastic partnership, and we look forward to continuing to work with them post-implementation.

Our many thanks to Michelle and Preferred Hotels & Resorts for sharing how the upgrade went and their progress thus far. We look forward to reporting back once they finish rolling out their new system!

Are you considering an upgrade to the cloud? Please contact us


Topics: CRM Best Practices

Introduction to Dynamics CRM and Salesforce Data Migrations Using SQL Server Integration Services

Today's blog post was written by Rob Jasinski, Principal Developer at Sonoma Partners.

There are many occasions when you’ll need to migrate a large amount of data from an outside data source into your Dynamics CRM or Salesforce environment. Both systems have native import tools, but those have many limitations and can be difficult to use with large amounts of data. If your company owns SQL Server licenses, those also include SSIS, which can be a viable alternative for migrating data into CRM. This article gives you an introduction to setting up a simple data migration to get you started creating your own CRM migration solutions.

First you’ll need SQL Server Data Tools, which can be downloaded from this link. You should have Visual Studio installed, but if you don’t, this will also install a light version, enough to run Data Tools and allow you to create SSIS integration packages.

Next, since you’ll be migrating data to CRM, you’ll need a third-party destination component. You could write your own custom code, in C# or VB, directly against the API of the destination CRM system, but this can be tedious and very time consuming. Third-party tools encapsulate all of that code into easy-to-use drag-and-drop components. They are very reasonably priced and will save a lot of time and frustration. There are many third-party components available from vendors such as CozyRoc, CData, and KingswaySoft. For this example, we’ll be using KingswaySoft, but the other vendors would work fine as well. You can download the component used in this example from this link.

Once you have data tools installed, launch Visual Studio and choose new solution and choose “New SSIS Package."

Rob j 1

This will create an empty canvas ready for you to start creating your first SSIS data migration.

Rob j 2

Next you’ll need to create a data source. This is the location of the source data that exists outside of your current CRM solution. SSIS has options to pull data from many sources, including Excel, SQL Server, ODBC, CSV files, etc. For this example, we’ll assume our data is located in a SQL Server database, so right-click in the Connection Managers area and choose “New OLE DB Connection.” Configure the server name, credentials, and database name, and click OK. In this example, I’ve renamed the connection to “Source.”

Rob j 3

Next drag and drop the “Data Flow Task” from the SSIS Toolbox onto the Control Flow canvas.

Rob j 4

Double-click the Data Flow Task just created to open the Data Flow canvas. This is where you’ll create the actual migration using source, destination, and transformation components (if needed).

Next drag and drop the OLE DB Source component from the SSIS Toolbox onto the Data Flow canvas and configure the component along with the query to pull the data that you need. In this example, we’ll be pulling basic account information from some outside system into our Dynamics CRM environment.

Rob j 5
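The query configured in the OLE DB Source is just an ordinary SELECT over the fields you intend to migrate. As a minimal sketch of that idea, here is the same kind of query run against sqlite (standing in for SQL Server, since it ships with Python); the table and column names are hypothetical:

```python
# Sketch: the kind of source query the OLE DB Source component executes.
# sqlite3 stands in for SQL Server here; table and columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table accounts (name text, city text, phone text)")
conn.execute("insert into accounts values ('Acme Corp', 'Chicago', '312-555-0100')")

# The component would pull exactly these rows and feed them down the data flow
rows = conn.execute("select name, city, phone from accounts").fetchall()
print(rows)
```

Whatever columns this query returns are what you'll map to CRM fields in the destination component later on.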

Now you’ll need to configure a destination connection. In this example, we’ll be using the KingswaySoft Dynamics CRM adapter to connect to our CRM environment. Right-click in the Connection Managers area and choose “New Connection.” From the connection list, choose DynamicsCRM and click Add. Note that you must have already installed the KingswaySoft Dynamics CRM adapter for it to appear in this list.

Rob j 6

Next you’ll need to configure the CRM connection. Choose the appropriate authentication type, depending on whether you’re online or on-premise. Then enter the CRM Discovery URL and any credentials, and finally choose the appropriate organization name from the drop-down list. Since there are many environment configurations you may be using, please refer to this KingswaySoft documentation for further details.

You should now have two connection managers. In this example, I’ve renamed the new one to DynamicsCRM. Most migrations you create will most likely have just two data sources: one source and one destination. But there may be cases where you have multiple sources that are merged together within the Data Flow process before migrating to CRM.

Rob j 7

Next drag and drop the Dynamics CRM Destination component from the SSIS Toolbox onto the Data Flow canvas. Then manually connect the arrow from the OLE DB Source to the new CRM Destination. The component displays a little red X because it still needs to be configured in the next step.

Rob j 8

Next double click on the Destination component to bring up the configuration screen. Choose the CRM connection manager you just created. For this example, we’re migrating data into accounts, so choose that as the destination entity.

Rob j 9

Next select the Columns tab and map the fields from the source query to the fields in CRM. With a third-party component like KingswaySoft, it’s that easy: no custom coding is required.

[Screenshot: Columns tab field mapping]

Press OK and you’re done. You’ve created your first, albeit very simple, migration of data from outside CRM into CRM. A nice thing about SSIS is that it comes with many transformation tools out of the box that allow you to clean up, manipulate, or even de-dupe data. For example, here is a link to a more complicated example of an SSIS data migration that also de-duplicates data. From this simple example as a starting point, you can expand to perform almost any kind of complicated data migration your business may require.
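As one illustration, if you’d rather de-dupe in the source query itself than with SSIS transformation components, a common T-SQL pattern (again using a hypothetical dbo.SourceAccounts table) is:

```sql
-- Keep only the most recently modified row per account name.
WITH Ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY AccountName
               ORDER BY ModifiedDate DESC
           ) AS rn
    FROM dbo.SourceAccounts
)
SELECT AccountName, MainPhone, Street1, City
FROM Ranked
WHERE rn = 1;
```

Pushing de-duplication into the source query keeps the Data Flow simpler, at the cost of burying logic in SQL where it is harder to see in the package design.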

Topics: Integrations

Support: Keep Away from the Runaround, Sue

Today's blog post was written by Jeff Meister, Principal Consultant at Sonoma Partners.

Recently, I've had conversations with several customers around frustrations when it comes to dealing with the various support channels related to their CRM platform. Whether it's core platform support, partner support, or third-party support, there always seems to be a great deal of runaround in getting the basics communicated before the real support can begin.

Here at Sonoma Partners, we share those frustrations, but we have worked to formalize our support process to ensure all questions are answered upfront and avoid the back and forth. Below is a rough template we follow; it covers a lot of the basics and will hopefully help get you to the right people, right away.

[Image: “Service Reality Check” support request template]

Additionally, if you feel that the “runaround” is starting, don't be afraid to ask for a phone call. Oftentimes a quick phone call can be much more efficient than going back and forth over email. One other suggestion is to include all parties involved in the communications. If you are working with a partner, you should keep them CC’d throughout the life of the case, as they might have additional input into the issue.

Unfortunately, we can neither confirm nor deny that this will make your support request process more efficient, but I can tell you that it has improved our process and provided efficiencies in areas where we have historically struggled. Happy supporting!

Topics: CRM Best Practices

How to: Migrating Unified Service Desk Configuration Data

Today's blog post was written by Michael Maloney, Principal Developer at Sonoma Partners.

As with many projects, we typically follow a development, staging, and production model of deployments. On larger projects, it’s not unheard of to have four, five, or even more environments. When it comes to deploying Unified Service Desk, this can be a challenge due to the heavy reliance on data as configuration. Today, we are going to walk through how you can easily migrate this configuration data from one environment to another. For the purposes of this walk-through, we will assume the environment(s) already have the required USD solutions installed. If not, take a look at one of our previous posts on how to get Unified Service Desk up and running.

Before getting started, be sure to download the latest version of the Dynamics CRM and UII SDK from here and extract each to a designated folder, e.g., D365\SDK and D365\UII.

Exporting Unified Service Desk Configuration Data from the Source Environment

To export the configuration data, run the DataMigrationUtility.exe file found in the D365\SDK\Tools\ConfigurationManager folder and choose Export Data on the main screen, then click Continue.

[Screenshot: Configuration Migration tool main screen]

Enter credentials for the organization you would like to export data from and click Login.

On the next screen, select the default Unified Service Desk configuration data schema file (USDDefaultSchema.xml) to be used for the data export. This is found in the UII\USD Developer Assets\USD Configuration Tool Schema folder.

Specify the name and location of the data file to be exported.

[Screenshot: schema file and data file selection]

Click Export Data. The screen displays the export progress and the location of the exported file at the bottom of the screen once the export is complete.

[Screenshot: export progress]

Click Exit to return to the main menu.

Importing Unified Service Desk Configuration Data to the Target Environment

Before importing the USD configuration data to the target environment, be sure to import the necessary packages and/or solutions first.

From the main screen of the CRM Configuration Manager, select Import Data then click Continue.

[Screenshot: Import Data option on the main screen]

Enter credentials for the organization you would like to import data into and click Login.

The next screen prompts you to provide the data file (.zip) to be imported. Browse to the data file, select it, and then click Import Data.

The next screen displays the import status of your records. The data import is done in multiple passes to first import the foundation data while queuing up the dependent data, and then import the dependent data in the subsequent passes to handle any data dependencies or linkages. This ensures clean and consistent data import.

[Screenshot: import status]

Click Exit to close the tool.

To verify the changes in the target environment, open up the Unified Service Desk app and click the “Change Credentials” link on the loading screen.

[Screenshot: Unified Service Desk “Change Credentials” link]

If you have more complex customizations involving many solutions and configuration data, you can opt to create a custom package instead. These packages bundle everything up so that you can then run them from the Package Deployer Tool, just like the original Unified Service Desk packages you see when setting up for the first time. We’ve written in the past on how to get started creating your own package, and you can find more detail on MSDN on how to include your configuration data with the package.


Topics: Microsoft Dynamics 365

SystemForm with Id Does Not Exist

Today's blog post was written by Matt Dearing, Principal Developer at Sonoma Partners.

I had a customer reach out recently saying they were trying to open contact records from one of their sandbox Dynamics 2016 online instances and were getting the following popup:

[Screenshot: error popup]

The log file showed the following:

"systemform With Id = 04238d8a-dbf8-467c-805f-4af4b757870f Does Not Exist"

I asked the user if they had deleted any forms recently. They said they had deleted a secondary "test" form in that org. My thought was that something had cached the old form id and CRM was continuing to try to load it even though it no longer existed. I asked the user to clear their browser cache, but they still received the same error. I asked them to try to load the same record in a secondary browser while I queried "userentityuisettings.lastviewedformxml" via a FetchXML query, and noticed that the old form's id was still there.
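That query can be sketched in FetchXML roughly like this; the attribute names come from the standard userentityuisettings entity, and the current-user filter is just one way to scope it:

```xml
<fetch>
  <entity name="userentityuisettings">
    <attribute name="lastviewedformxml" />
    <attribute name="objecttypecode" />
    <filter>
      <condition attribute="ownerid" operator="eq-userid" />
    </filter>
  </entity>
</fetch>
```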

lastviewedformxml <MRUForm><Form Type="Main" Id="04238d8a-dbf8-467c-805f-4af4b757870f" /></MRUForm>


I did a "publish all" and queried again and saw that the correct form id was now stored.

lastviewedformxml <MRUForm><Form Type="Main" Id="1fed44d1-ae68-4a41-bd2b-f13acac4acfa" /></MRUForm>


This meant the publish all may have triggered a refresh, or it was a coincidence and what actually refreshed "lastviewedformxml" was the user's secondary browser. Either way, I asked the user to try again in the primary browser, expecting everything to work, but they still received the same error. I navigated to the same record myself, which loaded fine, so I decided to take a quick look at local storage via the dev tools. I noticed form ids were cached there.

I had the user run "localStorage.clear()" from the console window of the dev tools instance on their primary browser, then reload the page, and everything loaded correctly. Although the user had cleared their cache, it appears some browsers tie local storage to clearing cookies, so depending on what your cache clear is actually doing, it may not be clearing local storage.
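If wiping all of local storage feels too heavy-handed, a more surgical option is to remove only the entries that reference the stale form id. The sketch below is illustrative only: the key and value shapes CRM actually caches vary by version, so inspect them in dev tools first.

```javascript
// The stale form id from the error log (example value).
const STALE_FORM_ID = "04238d8a-dbf8-467c-805f-4af4b757870f";

// Remove every storage entry whose key or value mentions the stale form id.
// Works against any Storage-like object (length, key, getItem, removeItem).
function purgeStaleFormEntries(storage, formId) {
  const doomed = [];
  // Collect matching keys first so we don't mutate while iterating.
  for (let i = 0; i < storage.length; i++) {
    const key = storage.key(i);
    const value = storage.getItem(key);
    if (key.includes(formId) || (typeof value === "string" && value.includes(formId))) {
      doomed.push(key);
    }
  }
  doomed.forEach((key) => storage.removeItem(key));
  return doomed; // the keys that were removed
}
```

In a browser console you would call it as `purgeStaleFormEntries(window.localStorage, STALE_FORM_ID)`, leaving unrelated cached data intact.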

The need to delete a form rarely arises, but if you find yourself in a similar situation, be very careful. If the form must be deleted and users have been using it, you may need them to fully clear their browser cache, including local storage, in order to get the correct form to load.

Topics: Microsoft Dynamics 365