The Big Data & Integration Summit was a success, and our presentations are now available to the public for viewing. http://ow.ly/q64hz
These are truly exciting times! The volume and velocity of data available to every business is astounding and continues to grow. IT industry leaders are talking about where technology is going, what the future holds and the impact all of this will have on the world.
Robin Bloor took a minute to review the path to the present in his guest post “The Age of Data” on the Actian blog this week, before revealing the vision he and IT industry thought leader Mike Hoskins have of the future for data.
“Mike Hoskins, CTO of Actian (formerly Pervasive Software) suggested to me in a recent conversation that we have entered the Age of Data. Is this the case?” Bloor begins his post with a review of history. “The dawn of the IT industry could certainly be described as the Age of Iron. Even in the mainframe days there were many hardware companies.” I agree. In the past, the focus was on the machines and what they could do for humans.
Bloor continues, “Despite the fact that computers are only useful if you have applications, the money was made primarily from selling hardware, and the large and growing companies in that Age of Iron made money from the machines.” You can guess the moniker Bloor gives the next phase of IT history: “The Age of Software”. The volume of databases and applications available for organizations to buy exploded. And that got messy. Lots of file types, formats, languages, and programs led to multiple versions of records and interoperability nightmares.
What’s next? Bloor suggests it’s the Age of Data. It’s about the data and the analytics it can provide us. This is the Cambrian explosion that will be one of the primary topics discussed at the Big Data & Integration Summit NYC 2013. Actian Chief Technologist Jim Falgout and I will present our views on emerging trends and lead a roundtable discussion with other industry leaders about the impact all of this will have on business. I invite you to join what promises to be a lively conversation and attend the Summit.
Based on feedback from industry leaders and customers, the Emprise Technologies and Actian teams have created a handful of sessions designed to deliver best practices that IT professionals can take home and use immediately to improve IT project success. These include “How to Win Business Buy-in for IT Projects”, “Avoiding the Pitfalls of Data Quality” and “Creating Workflows That Ensure Project Success”. I hope you’ll come join us. If you can’t make it to New York, we’re planning to take the Big Data & Integration Summit on the road, so leave us your requested cities and topics in the comments below. We look forward to hearing from you.
Actian Corporation and Emprise Technologies are co-hosting The Big Data & Integration Summit on September 26, 2013 in NYC and invite CIOs and IT Directors to attend and join in the conversations. #BDISNYC13 This event is free and features a fast-paced agenda that includes these topics and more:
- Win business buy-in for your IT projects
- Specific workflows that ensure project success
- Avoid data quality pitfalls
- Big data and integration trends
Additionally, attendees will join our panel of experts for a round-table discussion on the Big Data & Integration challenges facing CIOs now. Talk with Actian Chief Technologist Jim Falgout about Hadoop, Big Data Analytics, and more.
As CEO of Emprise Technologies, I’ve seen just about every cause there is for integration project failure. Often, there is more than one issue slowing down the project; sometimes a confluence of events – a periodic “perfect storm” – develops, derailing integration projects and causing failure. I’m teaming up with Actian’s Chief Technologist, Jim Falgout, to share the secrets we’ve learned for ensuring data integration and big data project success.
Don’t miss out on the opportunity to be part of the Big Data & Integration Summit NYC 2013. Register Now! Do you have any topics to suggest for the Summit? Provide us with your comments below. This is YOUR Summit!
Are you the type of person who easily assesses all angles of a decision and calmly arrives at the point of clarity? Or are you the type of person who is overwhelmed by all of the information you need to consider, becoming frozen by indecision, as if you are a deer in the headlights? Does how well you navigate decision-making depend on the type of decision you need to make? Maybe you find making big decisions easy, but smaller ones, like what to order for dinner, leave you stymied.
Effective decision-making requires much more than just the ability to gather and process information. It requires focusing on the very core of the decision, rather than getting mired in the details that can so often derail good decision-making.
Jill Johnson, MBA, is an award-winning management consultant who has impacted nearly $2.5 billion worth of business decisions. She spoke on this topic at ACA International’s 74th Annual Convention & Expo in San Diego, CA last week.
What impact does clear decision-making have on companies in the collections business? Let’s start with the decision of which collection software to use. Artiva, DAKCS, CollectOne, Windebt, Titanium ORE (DM9) and FACS are some of the most frequently used credit and collections software used in the industry. Which one is best for your company? Let’s answer that with a question. What is the single most important thing your business needs this software to do? Is it:
1. Process automation
2. Vendor integrations
3. User friendliness
The key to making the right technology decision is to focus on the mission-critical business outcome.
Once you’ve identified the primary business goal for purchasing collections software, you evaluate each product’s ability to achieve that goal. Software bells and whistles that don’t help your company achieve the primary outcome are extraneous details that should be tossed out. Next, look at other key factors that will affect your company’s ability to execute on your core business. What resources does your company have available to integrate, implement, and maintain the software? Which software syncs most closely with your team’s capabilities?
Your company may have a few other key factors to include in the software selection process. Prioritize them and then score each software solution for effectiveness with those factors.
Finally, there’s budget. It comes last because addressing the primary goal and key factors is mission critical to a clear decision-making process. Without information about the implementation and the resources required to maintain the new software, total cost of ownership (TCO) cannot be determined, and TCO is a far more accurate measure of cost than the purchase price alone. Focusing on gathering the best information about the primary goals and key factors will provide the path to crystal clear decision-making.
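The prioritize-and-score approach above can be sketched as a simple weighted decision matrix. Everything here – the factor names, the weights, and the candidate scores – is illustrative, not real evaluation data; the point is only the mechanics of weighting key factors (with TCO scored as a factor rather than treated as a raw price tag).

```python
# Hypothetical weighted decision matrix for scoring software candidates.
# Factor names, weights, and scores are made up for illustration.

FACTORS = {                           # factor -> weight (weights sum to 1.0)
    "primary_goal_fit": 0.40,         # e.g. process automation
    "team_capability_match": 0.25,
    "vendor_integrations": 0.20,
    "total_cost_of_ownership": 0.15,  # scored 1-5, where 5 = lowest TCO
}

candidates = {
    # Scores on a 1-5 scale for each factor (illustrative values).
    "Software A": {"primary_goal_fit": 5, "team_capability_match": 3,
                   "vendor_integrations": 4, "total_cost_of_ownership": 2},
    "Software B": {"primary_goal_fit": 4, "team_capability_match": 5,
                   "vendor_integrations": 3, "total_cost_of_ownership": 4},
}

def weighted_score(scores: dict) -> float:
    """Sum of (weight * score) across all key factors."""
    return sum(FACTORS[f] * scores[f] for f in FACTORS)

# Rank candidates from best to worst overall fit.
for name, scores in sorted(candidates.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The design choice worth noting is that the weights force the “single most important thing” question: if the primary goal carries 40% of the weight, a product that misses it can’t win on bells and whistles.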
It is surprising that data quality is still viewed as a luxury rather than a necessity. As an unapologetic data quality advocate, I’ve written white papers and blog posts about the value of good data management. It takes the efforts of many to change habits. In her blog post, The Costs of Poor Data Management, on the Data Integration Blog, Julie Hunt breaks down the impact data quality has on business.
Here’s an infographic on the cost poor data quality can have on business.
She points out that the areas of data quality deserving the greatest focus are specific to each organization. If you read my post, “Avoiding Data Quality Pitfalls”, you know that I’m a proponent of good data governance. Update early and often. My top four suggestions are:
- Translation Tables
- Stored Procedures
- Database Views
- Validation Lookups, Tables, and Rules
What are yours? Read Julie’s post, and send me your comments.
“Data is the most valuable asset of any business and is the foundation for building lifetime customer relationships.” Which means that accuracy of the data is mission critical to building strong healthy relationships with customers. Julie Hunt’s blog post on Hub Design Magazine “Pitney Bowes Spectrum Future Proofing” provides keen insight to how a 93-year-old company uses master data management to innovate for the future.
If you haven’t experienced the frustration of trying to wade through duplicate and incorrect data, you’re one of the very few. Dirty data clogs up our databases and integration projects, and creates obstacles to getting the information we need from the data. It can be like trying to paddle through a sea of junk.
The value of our data lies in accurate reporting and business intelligence that enables good business decisions. Good data governance is critical to successful business as well as to meeting compliance requirements.
So how do we avoid the pitfalls of poor data quality?
Perform quality assurance activities for each step of the process. Data quality results from frequent and ongoing efforts to reduce duplication and update information. If that sounds like a daunting task, remember that using the right tools can save substantial time and money, as well as create better results.
Take the time to set clear and consistent rules for setting up your data. If you inherited a database, then you can still update the governance to improve your data quality.
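As a concrete illustration of “clear and consistent rules” plus ongoing deduplication, here is a minimal sketch. The field names (`name`, `email`) and the choice of email as the matching key are assumptions for the example, not a prescription – in a real collections database the match key would be chosen per your own governance rules.

```python
# Minimal sketch of rule-based normalization followed by deduplication.
# Field names and the email match key are illustrative assumptions.

def normalize(record: dict) -> dict:
    """Apply consistent formatting rules before any comparison."""
    return {
        "name": " ".join(record["name"].split()).title(),  # collapse spaces, title-case
        "email": record["email"].strip().lower(),           # canonical email form
    }

def deduplicate(records: list) -> list:
    """Keep the first occurrence of each match key, drop the rest."""
    seen, clean = set(), []
    for rec in map(normalize, records):
        key = rec["email"]          # matching key: an assumption for this sketch
        if key not in seen:
            seen.add(key)
            clean.append(rec)
    return clean

raw = [
    {"name": "jane  doe", "email": "Jane.Doe@example.com "},
    {"name": "Jane Doe",  "email": "jane.doe@example.com"},
]
print(deduplicate(raw))  # the two entries collapse to one record
```

Note that normalization runs before comparison: without it, the two “Jane Doe” rows would look distinct and both would survive.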
How do you update data governance?
Recommendation: Updating data governance will almost always require adding new code segments to existing data import/scrub/validation processes. A side effect of adding new code segments is a “cleanup”. When code is updated to promote data governance, it is usually applied only to new data entering the system. What about the data that was in the system prior to the new data governance code? We want all the new data governance rules to hit existing data as well as new data, so you’ll need to build the new code segments into separate processes for (hopefully) a one-time cleanup of the existing data. Applying the updated data governance code in conjunction with executing the “cleanup” will bring data governance current, update existing data, and maintain a uniform dataset.
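The two-pronged approach – the same rule on the import path and in a one-time backfill – can be sketched as follows. The table, column, and rule (digits-only phone numbers) are hypothetical stand-ins for whatever your governance update actually mandates; SQLite stands in for the production database.

```python
import sqlite3

def apply_rule(phone: str) -> str:
    """Hypothetical governance rule: store phone numbers as digits only."""
    return "".join(ch for ch in phone if ch.isdigit())

# In-memory database with pre-existing, pre-governance data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, phone TEXT)")
conn.executemany("INSERT INTO contacts (phone) VALUES (?)",
                 [("(555) 123-4567",), ("555.987.6543",)])

# 1) New-data path: the rule runs on every record entering the system.
def insert_contact(phone: str) -> None:
    conn.execute("INSERT INTO contacts (phone) VALUES (?)", (apply_rule(phone),))

insert_contact("(555) 000-1111")   # arrives already clean

# 2) One-time backfill: the SAME rule runs once over the existing rows,
#    so old and new data end up uniform.
rows = conn.execute("SELECT id, phone FROM contacts").fetchall()
for row_id, phone in rows:
    conn.execute("UPDATE contacts SET phone = ? WHERE id = ?",
                 (apply_rule(phone), row_id))
conn.commit()
```

The key design point is that both paths call the identical `apply_rule` function, so the cleanup can never drift out of sync with the import-time validation.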
Which are the most important things to update?
- Translation Tables
- Stored Procedures
- Database Views
- Validation Lookups, Tables, and Rules
GIGO – garbage in, garbage out. Rid your data of the garbage early and avoid a massive cleanup later. The C-suite will appreciate that you run more efficient projects and processes as well.