Almost every CIO I talk with tells me they face these challenges. They also tell me that dealing with ever-changing compliance requirements adds another layer of difficulty to their jobs. So what exactly does an IT executive do to tackle these issues and win CIO of the Year?
To find out, let's take a look at the 2013 Denver Business Journal CIO of the Year award winner, William A. Weeks, CIO of SquareTwo Financial. InsideArm published an article about Weeks' award on July 2, 2013. What did he do so well?
“Bill completely repositioned our IT department as a business differentiator, and increased our technology capabilities so we can lead our industry in data reliability, availability, analysis, decisioning, security and regulatory compliance,” said Paul A. Larkins, president and CEO of SquareTwo Financial.
Selected for the midmarket category of companies with $50 million to $500 million in annual revenue, Weeks stood out for:
Transforming an IT department and plugging IT into the business
Raising the industry bar on data security and collection industry regulatory compliance
Specifically, Weeks accomplished these goals by improving data quality, increasing data and application integration, and strengthening security and compliance. And at the top of the list: he "plugged IT into the business." He aligned the IT group with the business and improved the quality of, and access to, the data assets the business needs to perform more efficiently. That, I believe, is the secret recipe for an award-winning CIO.
It is surprising that data quality is still viewed as a luxury rather than a necessity. As an unapologetic data quality advocate, I've written white papers and blog posts about the value of good data management; it takes the efforts of many to change habits. In her blog post, The Costs of Poor Data Management, on the Data Integration Blog, Julie Hunt breaks down the impact data quality has on business.
Here's an infographic on the costs poor data quality can have on a business.
She points out that the areas of data quality deserving the greatest focus are specific to each organization. If you read my post, "Avoiding Data Quality Pitfalls", you know that I'm a proponent of good data governance: update early and often. My top four suggestions follow.
If you haven't experienced the frustration of trying to wade through duplicate and incorrect data, you're one of the very few. Dirty data clogs our databases and integration projects, and it creates obstacles to getting the information we need from the data. It can be like trying to paddle through a sea of junk.
The value of our data lies in accurate reporting and business intelligence that enable good business decisions. Good data governance is critical to business success as well as to meeting compliance requirements.
So how do we avoid the pitfalls of poor data quality?
Perform quality assurance activities for each step of the process. Data quality results from frequent and ongoing efforts to reduce duplication and update information. If that sounds like a daunting task, remember that using the right tools can save substantial time and money, as well as create better results.
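To make "quality assurance at each step" concrete, here is a minimal sketch of an automated duplicate check in Python. The pandas library and the column names are my assumptions for illustration, not tied to any particular system:

```python
# Minimal duplicate-detection sketch using pandas.
# The column names ("account_id", "email") are hypothetical examples.
import pandas as pd

def find_duplicates(df: pd.DataFrame, key_columns: list) -> pd.DataFrame:
    """Return rows that share a normalized key with another row."""
    normalized = df[key_columns].apply(
        lambda col: col.astype(str).str.strip().str.lower()
    )
    # keep=False marks every member of a duplicate group, not just the repeats.
    return df[normalized.duplicated(keep=False)]

records = pd.DataFrame({
    "account_id": [101, 102, 103],
    "email": ["jane@example.com", "JANE@example.com ", "bob@example.com"],
})
dupes = find_duplicates(records, ["email"])
print(dupes)  # flags rows 0 and 1: same address after normalization
```

Even a check this simple, run at each pipeline step, catches the near-duplicates (case and whitespace variants) that exact-match constraints miss.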
Take the time to set clear and consistent rules for how your data is set up. If you inherited a database, you can still update the governance to improve your data quality.
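One way to keep rules clear and consistent is to codify them declaratively, so every load enforces the same conventions. Here is a minimal sketch; the field names and rules are illustrative assumptions:

```python
# Sketch of codifying setup rules in one place so every load follows
# the same conventions. Fields and rules are illustrative assumptions.
RULES = {
    "account_id": {"required": True, "type": int},
    "state":      {"required": True, "type": str, "allowed": {"CO", "TX", "NY"}},
    "email":      {"required": False, "type": str},
}

def violations(record: dict) -> list:
    problems = []
    for field, rule in RULES.items():
        value = record.get(field)
        if value is None:
            if rule["required"]:
                problems.append(f"{field}: missing required value")
            continue
        if not isinstance(value, rule["type"]):
            problems.append(f"{field}: expected {rule['type'].__name__}")
        elif "allowed" in rule and value not in rule["allowed"]:
            problems.append(f"{field}: {value!r} not in allowed set")
    return problems

print(violations({"account_id": 7, "state": "ZZ"}))
# -> ["state: 'ZZ' not in allowed set"]
```

With the rules in one table, updating governance later means editing data, not hunting through scattered scripts.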
How to update data governance?
Recommendation: Updating data governance almost always means adding new code segments to existing data import/scrub/validation processes. A side effect of adding those segments is a "cleanup". When code is updated to enforce data governance, it is usually applied only to new data entering the system. What about the data that was in the system before the governance update? We want the new rules to hit existing data as well as new data. You'll need to build the new code segments into separate processes for a (hopefully) one-time cleanup of the existing data. Applying the updated governance code in conjunction with executing the cleanup brings data governance current, updates existing data, and maintains a uniform dataset.
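Here is a minimal sketch of that pattern: one shared scrub routine feeding both the ingest path and a one-time backfill. The contacts table, its columns, and the phone rule are hypothetical stand-ins, and SQLite stands in for whatever store you actually use:

```python
# Sketch: one shared governance rule used by both the ingest path and a
# one-time backfill, so old and new data meet the same standard.
# Table, columns, and the phone rule are hypothetical examples.
import sqlite3

def scrub_phone(raw):
    """Example governance rule: keep digits only, require exactly 10."""
    if raw is None:
        return None
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits if len(digits) == 10 else None

def ingest(conn, account_id, phone):
    # New data passes through the rule on its way in.
    conn.execute("INSERT INTO contacts (account_id, phone) VALUES (?, ?)",
                 (account_id, scrub_phone(phone)))

def backfill(conn):
    # One-time cleanup: run the SAME rule over rows loaded before the
    # governance update.
    for rowid, phone in conn.execute("SELECT rowid, phone FROM contacts").fetchall():
        conn.execute("UPDATE contacts SET phone = ? WHERE rowid = ?",
                     (scrub_phone(phone), rowid))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (account_id INTEGER, phone TEXT)")
conn.execute("INSERT INTO contacts VALUES (1, '(303) 555-0142')")  # legacy row
backfill(conn)                    # cleans the legacy row
ingest(conn, 2, "720-555-0199")   # new rows are cleaned on entry
print(conn.execute("SELECT account_id, phone FROM contacts").fetchall())
# -> [(1, '3035550142'), (2, '7205550199')]
```

Because both paths call the same routine, the cleaned legacy rows and the newly ingested rows end up meeting a single standard.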
Which are the most important things to update?
Validation Lookups, Tables, and Rules
GIGO: garbage in = garbage out. Rid your data of the garbage early and avoid a massive cleanup later. The C-suite will appreciate that you run more efficient projects and processes, too.
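For example, an early garbage filter might check incoming rows against a validation lookup and quarantine anything that fails, rather than loading it and cleaning it up later. A sketch, with an invented status-code lookup:

```python
# Sketch of an early "garbage filter": validate incoming rows against a
# lookup table before they reach the database, routing failures to a
# reject list for review. The lookup values are illustrative assumptions.
VALID_STATUS_CODES = {"ACT", "CLS", "SUS"}   # hypothetical lookup table

def split_good_bad(rows):
    good, rejects = [], []
    for row in rows:
        if row.get("status") in VALID_STATUS_CODES:
            good.append(row)
        else:
            rejects.append(row)   # quarantine; don't load
    return good, rejects

incoming = [{"id": 1, "status": "ACT"}, {"id": 2, "status": "???"}]
good, rejects = split_good_bad(incoming)
print(len(good), "loaded,", len(rejects), "quarantined")  # 1 loaded, 1 quarantined
```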
We all know the kind of profiling that is completely unacceptable and that’s not what I’m talking about here. I neither condone nor practice any kind of socially unacceptable profiling. But there IS one type of profiling that I strongly recommend: Data Profiling. Especially before you migrate your data.
If you think that sounds like a luxury you don't have time to fit into your project's schedule, consider this: a Bloor Research study found that data migration projects using data profiling best practices were significantly more likely to come in on time and on budget (72% versus 52%). That's a big difference, and organizations realize many more benefits when they use data profiling in their projects.
Data Profiling enables better regulatory compliance, more efficient master data management, and better data governance. It all goes back to the old adage that "you have to measure what you want to manage." Profiling is the first step in measuring the quality of your data before you migrate or integrate it, and it lets you monitor that quality throughout the data's life. Data deteriorates at around 1.25 to 1.5 percent per month; compounded, that is roughly 14 to 17 percent of your records going stale within a year, and a quarter to nearly a third within two. The lower your data quality, the lower your process and project efficiencies will be. No one wants that. Download the Bloor Research "Data Profiling – The Business Case" white paper to learn more about the results of this study.
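To give a feel for what a first profiling pass measures, here is a minimal sketch using pandas: per-column null rate, distinct count, and most frequent value. The column names are illustrative, and dedicated profiling tools go much further, but this is the "measure before you manage" idea:

```python
# Minimal profiling pass with pandas: per-column null rate, distinct
# count, and most frequent value. Column names are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "null_pct": df.isna().mean() * 100,   # how much is missing
        "distinct": df.nunique(),             # cardinality per column
        "top_value": df.mode().iloc[0],       # most frequent value
    })

data = pd.DataFrame({
    "state": ["CO", "CO", None, "TX"],
    "balance": [125.0, None, None, 80.5],
})
print(profile(data))
# state:   25% null, 2 distinct, top value "CO"
# balance: 50% null, 2 distinct
```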
Pervasive has recently developed an effective utility for migrating Data Integrator v9 projects to Pervasive Data Integrator v10. The process is quick and relatively smooth; however, challenges can arise because of the complex nature of most DI projects. If you are thinking about transitioning from v9 to v10, please reach out to Emprise to learn how our team of Certified Pervasive Developers can help make your transition to v10 successful.
Emprise Technologies is proud to be a Platinum sponsor of Pervasive IntegrationWorld 2013. We are also sponsoring the Data Clinic. If you will be at IntegrationWorld, come by the Data Clinic and ask one of our Pervasive-certified consultants your questions about Data Integrator. Bring your toughest data integration questions: the Emprise team has a collective 30,000 hours of Pervasive work behind it, so we doubt you'll be able to stump us. But we're happy to let you try! See you at IntegrationWorld 2013. We'll be in the Hyatt Hill Country Ballroom A-C from 10:15 a.m. until 4:00 p.m. on Monday, April 15, and again on Tuesday, April 16, from 9:20 a.m. until 12:00 p.m.