The Big Data & Integration Summit was a Success

The Big Data & Integration Summit was a success, and our presentations are now available to the public for viewing: http://ow.ly/q64hz


Emerging IT Trends – The Age of Data

Big Data is Big Business

These are truly exciting times! The volume and velocity of data available to every business are astounding and continue to grow. IT industry leaders are talking about where technology is going, what the future holds, and the impact all of this will have on the world.

Robin Bloor took a minute to review the path to the present in his guest post, “The Age of Data”, on the Actian blog this week, before revealing the vision that he and IT industry thought leader Mike Hoskins share for the future of data.

“Mike Hoskins, CTO of Actian (formerly Pervasive Software) suggested to me in a recent conversation that we have entered the Age of Data. Is this the case?” Bloor begins his post with a review of history. “The dawn of the IT industry could certainly be described as the Age of Iron. Even in the mainframe days there were many hardware companies.” I agree. In the past, the focus was on the machines and what they could do for humans.

Bloor continues, “Despite the fact that computers are only useful if you have applications, the money was made primarily from selling hardware, and the large and growing companies in that Age of Iron made money from the machines.” You can guess the moniker Bloor gives the next phase of IT history: “The Age of Software”. The volume of databases and applications available for organizations to buy exploded, and that got messy. A proliferation of file types, formats, languages, and programs led to multiple versions of records and interoperability nightmares.

What’s next? Bloor suggests it’s the Age of Data. It’s about the data and the analytics it can provide us. This is the Cambrian explosion that will be one of the primary topics discussed at the Big Data & Integration Summit NYC 2013. Actian Chief Technologist Jim Falgout and I will present our views on emerging trends and lead a roundtable discussion with other industry leaders about the impact all of this will have on business. I invite you to attend the Summit and join what promises to be a lively conversation.

Based on feedback from industry leaders and customers, the Emprise Technologies and Actian teams have created a handful of sessions designed to deliver best practices that IT professionals can take home and use immediately to improve IT project success. These include “How to Win Business Buy-in for IT Projects”, “Avoiding the Pitfalls of Data Quality” and “Creating Workflows That Ensure Project Success”. I hope you’ll come join us. If you can’t make it to New York, we’re planning to take the Big Data & Integration Summit on the road, so leave us your requested cities and topics in the comments below. We look forward to hearing from you.

How to Win “CIO of the Year Award”

Winner of the CIO of the Year Award

Do more, with fewer resources, and do it faster.

Almost every CIO I talk with tells me that they face these challenges. They also tell me that dealing with ever-changing compliance requirements adds another layer of difficulty to their jobs. So, what exactly does an IT executive do to tackle these issues and win CIO of the Year?

To find out, let’s take a look at the 2013 Denver Business Journal CIO of the Year award winner, William A. Weeks, CIO of SquareTwo Financial. InsideArm published an article about Weeks’ award on July 2, 2013. What did he do so well?

“Bill completely repositioned our IT department as a business differentiator, and increased our technology capabilities so we can lead our industry in data reliability, availability, analysis, decisioning, security and regulatory compliance,” said Paul A. Larkins, president and CEO of SquareTwo Financial.

Selected for the midmarket category of companies with $50 million to $500 million in annual revenue, Weeks stood out for:

  • Transforming an IT department and plugging IT into the business
  • Re-engineering and stabilizing legacy systems
  • Reducing costs
  • Delivering numerous automation benefits
  • Raising the industry bar on data security and collection industry regulatory compliance

Specifically, Weeks accomplished these goals by improving data quality and increasing data and application integration while strengthening security and compliance. And at the top of the list is that he “plugged IT into the business”. He aligned the IT group with the business and improved the quality of, and access to, the data assets that the business needs in order to perform more efficiently. That, I believe, is the secret recipe for an award-winning CIO.

“The Costs of Poor Data Management”

It is surprising that data quality is still viewed as a luxury rather than a necessity. As an unapologetic data quality advocate, I’ve written white papers and blog posts about the value of good data management. It takes the efforts of many to change habits. In her blog post, The Costs of Poor Data Management, on the Data Integration Blog, Julie Hunt breaks down the impact data quality has on business.

Here’s an infographic on the costs poor data quality can impose on a business.

Infographic: Global research – bad customer data costs you millions

She points out that the areas of data quality deserving the greatest focus are specific to each organization. If you read my post, “Avoiding Data Quality Pitfalls”, you know that I’m a proponent of good data governance. Update early and often. My top four suggestions are listed below, with a short sketch of the first one after the list:

  • Translation Tables
  • Stored Procedures
  • Database Views
  • Validation Lookups, Tables, and Rules
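
To make the first suggestion concrete, here is a minimal sketch in Python of how a translation table works; the state codes, spellings, and fallback behavior are illustrative assumptions of mine, not anything from Julie’s post.

```python
# A translation table maps the many spellings arriving from source
# systems onto one canonical value, so downstream reports agree.
STATE_TRANSLATION = {
    "tx": "TX", "tex.": "TX", "texas": "TX",
    "ny": "NY", "n.y.": "NY", "new york": "NY",
}

def translate_state(raw: str) -> str:
    """Return the canonical state code, or flag the value for review."""
    key = raw.strip().lower()
    # Unknown values are flagged rather than guessed, so the table
    # (not ad hoc code) stays the single source of truth.
    return STATE_TRANSLATION.get(key, f"UNRESOLVED:{raw.strip()}")

print(translate_state(" Texas "))  # -> TX
print(translate_state("Texs"))     # -> UNRESOLVED:Texs
```

Because the mapping lives in data rather than in code, adding a newly discovered spelling is a one-row update instead of a code release.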

What are yours? Read Julie’s post, and send me your comments.

Pitney Bowes Spectrum: Future-Proofing MDM, by Julie Hunt

“Data is the most valuable asset of any business and is the foundation for building lifetime customer relationships.” That means the accuracy of the data is mission-critical to building strong, healthy relationships with customers. Julie Hunt’s blog post on Hub Designs Magazine, “Pitney Bowes Spectrum: Future-Proofing MDM”, provides keen insight into how a 93-year-old company uses master data management to innovate for the future.

 


A briefing by Pitney Bowes Software for the Hub Designs MDM Think Tank


Avoid Data Quality Pitfalls

If you haven’t experienced the frustration of trying to wade through duplicate and incorrect data, you’re one of the very few. Dirty data clogs up our databases and integration projects, and it creates obstacles to getting the information we need from the data. It can be like trying to paddle through a sea of junk.

The value of our data lies in accurate reporting and in business intelligence that enables good business decisions. Good data governance is critical to successful business as well as to meeting compliance requirements.


So how do we avoid the pitfalls of poor data quality?

Perform quality assurance activities for each step of the process. Data quality results from frequent and ongoing efforts to reduce duplication and update information. If that sounds like a daunting task, remember that using the right tools can save substantial time and money, as well as create better results.
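
As a minimal illustration of that ongoing de-duplication effort, here is a Python sketch that collapses customer records onto a normalized key and keeps the most recently updated copy; the record shape and field names are hypothetical.

```python
from datetime import date

# Toy customer records; in practice these come from your database.
records = [
    {"email": "Ann@Example.com ", "name": "Ann Lee",    "updated": date(2013, 5, 1)},
    {"email": "ann@example.com",  "name": "Ann B. Lee", "updated": date(2013, 7, 1)},
    {"email": "bob@example.com",  "name": "Bob Cho",    "updated": date(2013, 6, 15)},
]

def dedupe(rows):
    """Keep one record per normalized email, preferring the newest."""
    best = {}
    for row in rows:
        key = row["email"].strip().lower()  # normalize before comparing
        if key not in best or row["updated"] > best[key]["updated"]:
            best[key] = row
    return list(best.values())

for row in dedupe(records):
    print(row["name"])  # Ann B. Lee, Bob Cho
```

The normalization step matters as much as the comparison: “Ann@Example.com ” and “ann@example.com” are the same customer, but only after trimming and lower-casing.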

Take the time to set clear and consistent rules for setting up your data. If you inherited a database, you can still update its governance to improve your data quality.

How to update data governance?

Recommendation: Updating data governance will almost always require new code segments being added to existing data import/scrub/validation processes. A side effect of adding new code segments is a “cleanup”. When code is updated to promote data governance, it is usually applied only to new data entering the system. What about the data that was in the system before the new data governance code? We want the new data governance rules to hit new data as well as existing data. You’ll need to build the new code segments into separate processes for (hopefully) a one-time cleanup of the existing data. Applying the updated data governance code in conjunction with executing the “cleanup” will bring data governance current, update existing data, and maintain a uniform dataset.
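
Here is one way that recommendation can look in practice, as a minimal Python sketch; the specific rules (digits-only phone numbers, upper-case state codes) and the in-memory “table” are illustrative assumptions. The point is that the import path and the one-time cleanup share the same governance function, so old and new data converge on the same rules.

```python
def apply_governance(record: dict) -> dict:
    """The new governance rules: digits-only phone, upper-case state."""
    record = dict(record)  # work on a copy
    record["phone"] = "".join(ch for ch in record.get("phone", "") if ch.isdigit())
    record["state"] = record.get("state", "").strip().upper()
    return record

def import_record(record: dict, table: list) -> None:
    """Import path: every NEW record passes through the rules."""
    table.append(apply_governance(record))

def cleanup_existing(table: list) -> None:
    """One-time backfill: run the SAME rules over data already in the system."""
    table[:] = [apply_governance(r) for r in table]

table = [{"phone": "(512) 555-0101", "state": " tx"}]  # pre-existing dirty row
cleanup_existing(table)                                # one-time cleanup
import_record({"phone": "212.555.0102", "state": "ny "}, table)  # new data
print(table)
# [{'phone': '5125550101', 'state': 'TX'}, {'phone': '2125550102', 'state': 'NY'}]
```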

Which are the most important things to update? My top four are below, with a short sketch of the last one after the list:

  • Translation Tables
  • Stored Procedures
  • Database Views
  • Validation Lookups, Tables, and Rules
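
As a minimal sketch of that last item, here is what a validation lookup plus a couple of rules might look like in Python; the lookup contents and the rules themselves are illustrative assumptions.

```python
import re

# Validation lookup table: the only values this field may take.
VALID_COUNTRY = {"US", "CA", "MX"}

# Validation rules: a name and a predicate over the whole record.
RULES = {
    "country in lookup": lambda r: r.get("country") in VALID_COUNTRY,
    "zip is 5 digits":   lambda r: bool(re.fullmatch(r"\d{5}", r.get("zip", ""))),
    "email has @":       lambda r: "@" in r.get("email", ""),
}

def validate(record: dict) -> list:
    """Return the name of every rule the record fails."""
    return [name for name, rule in RULES.items() if not rule(record)]

bad = {"country": "USA", "zip": "7870", "email": "ann.example.com"}
print(validate(bad))
# ['country in lookup', 'zip is 5 digits', 'email has @']
```

Keeping the rules in a table, like the lookups, means a new governance requirement becomes a new entry rather than a new branch of code.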

GIGO – garbage in = garbage out. Rid your data of the garbage early and avoid a massive cleanup later. The C-suite will appreciate the more efficient projects and processes, too.

When Profiling Is A Good Thing

We all know the kind of profiling that is completely unacceptable and that’s not what I’m talking about here. I neither condone nor practice any kind of socially unacceptable profiling. But there IS one type of profiling that I strongly recommend: Data Profiling. Especially before you migrate your data.

If you think that sounds like a luxury you don’t have time to fit into your project’s schedule, consider this: Bloor Research conducted a study and found that data migration projects that used data profiling best practices were significantly more likely (72% compared to 52%) to come in on time and on budget. That’s a big difference, and organizations realize many more benefits when they use data profiling in their projects.

Data profiling enables better regulatory compliance, more efficient master data management and better data governance. It all goes back to the old adage that “you have to measure what you want to manage.” Profiling your data is the first step in measuring its quality before you migrate or integrate it, and it lets you monitor that quality throughout the life of the data. Data deteriorates at around 1.25-1.5% per month; compounded, that is roughly 14-17% of your records going bad over a single year, and far more over two. The lower your data quality is, the lower your process and project efficiencies will be. No one wants that. Download the Bloor Research “Data Profiling – The Business Case” white paper to learn more about the results of this study.
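
To show what that first measurement step can look like, here is a minimal data profiling sketch in Python: it computes a fill rate, a distinct count, and the most common value for each column, which is enough to spot empty and suspect fields before a migration. The sample rows are hypothetical.

```python
from collections import Counter

rows = [
    {"email": "ann@example.com", "state": "TX", "phone": ""},
    {"email": "bob@example.com", "state": "tx", "phone": "5125550101"},
    {"email": "",                "state": "TX", "phone": "5125550101"},
]

def profile(rows):
    """Per column: fill rate, distinct non-empty values, top value."""
    for col in rows[0]:
        values = [r[col] for r in rows]
        filled = [v for v in values if v]      # non-empty values only
        counts = Counter(filled)
        top = counts.most_common(1)[0] if counts else ("-", 0)
        print(f"{col:6} fill={len(filled)/len(values):.0%} "
              f"distinct={len(counts)} top={top}")

profile(rows)
# email  fill=67% distinct=2 top=('ann@example.com', 1)
# state  fill=100% distinct=2 top=('TX', 2)
# phone  fill=67% distinct=1 top=('5125550101', 2)
```

Even on three toy rows the profile surfaces the classic problems: missing emails, inconsistent casing in state codes, and a duplicated phone number.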
