The Big Data & Integration Summit was a success and our presentations are now available to the public for viewing. http://ow.ly/q64hz
Are you the type of person who easily assesses all angles of a decision and calmly arrives at the point of clarity? Or are you the type of person who is overwhelmed by all of the information you need to consider, becoming frozen by indecision, as if you are a deer in the headlights? Does how well you navigate decision-making depend on the type of decision you need to make? Maybe you find making big decisions easy, but smaller ones, like what to order for dinner, leave you stymied.
Effective decision-making requires much more than just the ability to gather and process information. It requires focusing on the very core of the decision, rather than getting mired in the details that can so often derail good decision-making.
Jill Johnson, MBA, is an award-winning management consultant who has influenced nearly $2.5 billion worth of business decisions. She spoke on this topic at ACA International’s 74th Annual Convention & Expo in San Diego, CA, last week.
What impact does clear decision-making have on companies in the collections business? Let’s start with the decision of which collection software to use. Artiva, DAKCS, CollectOne, Windebt, Titanium ORE (DM9) and FACS are some of the most frequently used credit and collections software in the industry. Which one is best for your company? Let’s answer that with a question. What is the single most important thing your business needs this software to do? Is it:
2. Process automation
3. Vendor integrations
4. User-friendliness
The key to making the right technology decision is to focus on the mission-critical business outcome.
Once you’ve identified the primary business goal for purchasing collections software, evaluate each product’s ability to achieve that goal. Software bells and whistles that don’t help your company achieve the primary outcome are extraneous details that should be tossed out. Next, look at other key factors that will affect your company’s ability to execute on your core business. What resources does your company have available to integrate, implement and maintain the software? Which software syncs most closely with your team’s capabilities?
Your company may have a few other key factors to include in the software selection process. Prioritize them and then score each software solution for effectiveness with those factors.
Finally, there’s budget. It’s last because addressing the primary goal and key factors is mission critical to a clear decision-making process. Without information about the implementation and the resources required to maintain the new software, total cost of ownership (TCO) cannot be determined. The TCO of software is a far more accurate measure than the purchase price alone. Focusing on gathering the best information about the primary goals and key factors will provide the path to crystal clear decision-making.
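The prioritize-and-score process described above can be sketched in a few lines of code. This is a minimal illustration, not a recommendation: the factor weights, product names, ratings, and dollar figures below are all hypothetical placeholders.

```python
# A sketch of weighted scoring plus TCO, per the selection process above.
# All names, weights, ratings, and costs are hypothetical examples.

def total_cost_of_ownership(purchase, implementation, annual_maintenance, years=3):
    """TCO = purchase price + implementation cost + maintenance over the ownership period."""
    return purchase + implementation + annual_maintenance * years

def weighted_score(scores, weights):
    """Sum of (factor rating * factor priority weight) for one product."""
    return sum(scores[factor] * weight for factor, weight in weights.items())

# Prioritized key factors (weights sum to 1.0) -- hypothetical priorities.
weights = {"primary_goal_fit": 0.5, "team_capability_match": 0.3, "vendor_integrations": 0.2}

# Hypothetical 1-10 ratings for two generic candidate products.
candidates = {
    "Product A": {"primary_goal_fit": 9, "team_capability_match": 6, "vendor_integrations": 7},
    "Product B": {"primary_goal_fit": 7, "team_capability_match": 9, "vendor_integrations": 8},
}

for name, scores in candidates.items():
    print(name, round(weighted_score(scores, weights), 2))

# Purchase price alone (50,000) understates the real three-year cost.
print("3-year TCO:", total_cost_of_ownership(purchase=50_000, implementation=20_000,
                                             annual_maintenance=10_000, years=3))
```

Note how the product with the higher purchase-price appeal can still lose once team fit and TCO are weighed in.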
Do more, with fewer resources, and do it faster.
Almost every CIO I talk with tells me that they face these challenges. They also tell me that dealing with ever-changing compliance requirements adds another layer of difficulty to their jobs. So, what exactly does an IT executive do to tackle these issues and win CIO of the Year?
To find out, let’s take a look at the 2013 Denver Business Journal CIO of the Year award winner William A. Weeks, CIO of SquareTwo Financial. InsideArm published an article about Weeks’ award on July 2, 2013. What did he do so well?
“Bill completely repositioned our IT department as a business differentiator, and increased our technology capabilities so we can lead our industry in data reliability, availability, analysis, decisioning, security and regulatory compliance,” said Paul A. Larkins, president and CEO of SquareTwo Financial.
Selected for the midmarket category of companies with $50 million to $500 million in annual revenue, Weeks stood out for:
- Transforming an IT department and plugging IT into the business
- Re-engineering and stabilizing legacy systems
- Reducing costs
- Delivering numerous automation benefits
- Raising the industry bar on data security and collection industry regulatory compliance
Specifically, Weeks accomplished these goals by improving data quality and increasing data and application integration, while strengthening security and compliance. At the top of the list is that he “plugged IT into the business”: he aligned the IT group with the business and improved the quality of, and access to, the data assets the business needs in order to perform more efficiently. That, I believe, is the secret recipe for an award-winning CIO.
“Data is the most valuable asset of any business and is the foundation for building lifetime customer relationships.” That means data accuracy is mission critical to building strong, healthy relationships with customers. Julie Hunt’s blog post on Hub Design Magazine, “Pitney Bowes Spectrum Future Proofing,” provides keen insight into how a 93-year-old company uses master data management to innovate for the future.
We all know the kind of profiling that is completely unacceptable and that’s not what I’m talking about here. I neither condone nor practice any kind of socially unacceptable profiling. But there IS one type of profiling that I strongly recommend: Data Profiling. Especially before you migrate your data.
If you think that sounds like a luxury you don’t have the time to fit into your project’s schedule, consider this: Bloor Research conducted a study and found that data migration projects that used data profiling best practices were significantly more likely (72% compared to 52%) to come in on time and on budget. That’s a big difference, and there are many more benefits organizations realize when they use data profiling in their projects.
Data profiling enables better regulatory compliance, more efficient master data management and better data governance. It all goes back to the old adage that “You have to measure what you want to manage.” Profiling is the first step in measuring the quality of your data before you migrate or integrate it, and it allows you to monitor that quality throughout the life of the data. Data deteriorates at around 1.25-1.5% per month, which adds up to a lot of bad data over the course of a year or two. The lower your data quality is, the lower your process and project efficiencies will be. No one wants that. Download the Bloor Research “Data Profiling – The Business Case” white paper to learn more about the results of this study.
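To make the idea of “measuring before you migrate” concrete, here is a minimal profiling sketch using only the Python standard library. The field names and sample records are hypothetical; real profiling tools measure far more (patterns, ranges, referential integrity), but completeness and cardinality are the typical starting point.

```python
# A minimal data-profiling sketch: measure completeness and distinct
# values per field before a migration. Records and fields are hypothetical.

records = [
    {"account_id": "A100", "phone": "555-0101", "state": "CA"},
    {"account_id": "A101", "phone": None,       "state": "CA"},
    {"account_id": "A102", "phone": "555-0103", "state": ""},
]

def profile(rows):
    """Return per-field completeness (share of non-empty values) and cardinality."""
    report = {}
    for field in rows[0].keys():
        values = [row[field] for row in rows]
        filled = [v for v in values if v not in (None, "")]
        report[field] = {
            "completeness": len(filled) / len(values),  # 1.0 means no missing values
            "distinct": len(set(filled)),               # count of unique non-empty values
        }
    return report

for field, stats in profile(records).items():
    print(field, stats)
```

Run against a sample of the source system, a report like this quickly flags which fields need cleansing before migration rather than after.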
Pervasive Data Integrator can be a powerful tool, enabling multiple connections between a wide variety of systems and data points. The Repository Explorer is the starting point for every Data Integrator project. To get started on your first project, you must understand how to configure your Workspaces and Repositories through Repository Explorer.
What is the Repository Explorer?
The Repository Explorer is the starting point for all Pervasive Data Integrator projects. From this one application, developers can navigate to projects, open existing Pervasive Data Integrator elements (Maps, Processes, Structured Schemas, etc), or create new instances of those elements.
Who uses the Repository Explorer?
The Repository Explorer is used almost exclusively by developers, but it can also be used by quality assurance resources to access and review code that has already been developed.
How do you configure the Repository Explorer?
Opening Repository Explorer
Once installed, Repository Explorer can be accessed like any other program on Windows.
- Open the Start menu
- Select ‘All Programs’
- Select Pervasive folder
- Select Data Integrator 9 folder
- Select Repository Explorer 9 program
Setting up a Workspace
The Repository Explorer organizes files using two methods. The first method is via a Workspace. A Workspace is a collection of one or more repositories and a single Macrodef.xml file that is specific to the Workspace. (Note: For further information on the Macrodef.xml file, check out our two videos on Macro Definition Variables). At Emprise, our best practice is to create one Workspace for each project. This allows us to specify a unique Macrodef file for each project.
To create a new Workspace, just follow a few simple steps.
- Select File from the menu bar
- Select Manage Workspaces…
- Click the drop-down to the right of Workspace Root Directory and navigate to the location where you would like to save your Workspace. Click OK. The Workspace Root Directory is the location where the Workspace folder will be created. Inside this folder, a set of mandatory default files and folders will be created.
- Xmldb – This is the folder used for the default repository when creating a new Workspace.
- Fileref.xml – A list of file references used by the workspace.
- Macrodef.xml – The macrodef file. For further information see our video.
- Repositories.xml – A list of all repositories for the Workspace. This file is rarely changed after initial setup.
- Select Add
- To add an existing workspace, check the box for the proper workspace.
- If adding a new Workspace, which will often be the case, click the “Create New Workspace” button. You will be prompted for a name. Give your Workspace a descriptive name and then click “OK” to return to the previous screen.
- Click OK
- Find the Workspace you just added, click the name to highlight it, then click the Set Current button on the right. This activates the Workspace, allowing you access to its repositories.
*Note: You can also right-click on the white space to the left of the screen that displays your current repository and select the ‘Manage Workspaces’ option from there. Also, when creating a new Workspace using the “Create New Workspace” option, the Macrodef.xml file from the current Workspace will be copied into the directory of the new Workspace, including the Macro names and values. This is helpful when standard Macros are used, but pay attention to directory paths and file names.
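Putting the steps above together, the folder created under the Workspace Root Directory holds the default files and folders listed earlier. A rough sketch (the Workspace name here is a hypothetical example):

```
<Workspace Root Directory>\
└── MyProjectWorkspace\        <- hypothetical Workspace name
    ├── Xmldb\                 <- default repository folder
    ├── Fileref.xml            <- file references used by the Workspace
    ├── Macrodef.xml           <- Workspace-specific macro definitions
    └── Repositories.xml       <- list of repositories for the Workspace
```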
Setting up a Repository
Repositories are directory paths pointing to where the Pervasive DI project files will be located. A Workspace can have any number of Repositories, which are displayed in a tree view on the left side of the Repository Explorer. Only Pervasive files located in one of the Repositories for the Workspace can be opened and edited from the Repository Explorer.
- If you are not currently working in the workspace within which you want to create a repository, navigate to that workspace using File -> Manage Workspaces or right click on the white space on the left that displays the file directory structure and select Manage Workspaces. Then, click the text of the Workspace you would like to use to highlight it before clicking the Set Current button on the right.
- Select File from the menu bar
- Select Manage Repositories…
- Click the Add… button to create a new repository. At this point you can either navigate to the folder you would like to select as the Repository or paste in the file path as copied from Windows Explorer.
Note: You can also right click on the white space to the left of the screen that displays your current repository and select the ‘Manage Repositories’ option from there.
- Use a standard naming convention for the Workspaces. This allows for easy identification of what project the Workspace is for.
- Use a standard data structure and directory path for the Repository folder. This prevents issues that may arise when multiple developers work on the same code base. Pervasive Data Integrator projects use a series of pointers within the files, and standardizing repository paths prevents these pointers from being corrupted when moving code between developers.
- After creating a new Workspace, it is prudent to open the Macrodef.xml file, remove unneeded Macros, and change others to match your new project. Please see our video on the Macrodef for further information.
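As an illustration of the naming and path conventions above, a team-wide standard might look like the following. These names and paths are purely hypothetical examples, not a Pervasive requirement; the point is that every developer uses the same pattern on every machine.

```
Workspace name:   ACME_CollectionsMigration                 <- ClientName_ProjectName
Repository path:  C:\PervasiveRepos\ACME_CollectionsMigration\Repository\
                  (identical path on every developer machine, so internal
                  file pointers resolve correctly when code moves between them)
```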