Emerging IT Trends – The Age of Data

Big Data is Big Business

These are truly exciting times! The volume and velocity of data available to every business are astounding and continue to grow. IT industry leaders are talking about where technology is going, what the future holds, and the impact all of this will have on the world.

Robin Bloor took a minute to review the path to the present in his guest post, “The Age of Data”, on the Actian blog this week, before revealing the vision he and IT industry thought leader Mike Hoskins have of the future of data.

“Mike Hoskins, CTO of Actian (formerly Pervasive Software), suggested to me in a recent conversation that we have entered the Age of Data. Is this the case?” Bloor begins his post with a review of history. “The dawn of the IT industry could certainly be described as the Age of Iron. Even in the mainframe days there were many hardware companies.” I agree. In the past, the focus was on the machines and what they could do for humans.

Bloor continues, “Despite the fact that computers are only useful if you have applications, the money was made primarily from selling hardware, and the large and growing companies in that Age of Iron made money from the machines.” You can guess the moniker Bloor gives the next phase of IT history: “The Age of Software”. The volume of databases and applications available for organizations to buy exploded, and that got messy. Lots and lots of file types, formats, languages, and programs led to multiple versions of records and interoperability nightmares.

What’s next? Bloor suggests it’s the Age of Data. It’s about the data and the analytics it can provide us. This is the Cambrian explosion that will be one of the primary topics discussed at the Big Data & Integration Summit NYC 2013. Actian Chief Technologist Jim Falgout and I will present our views on emerging trends and lead a roundtable discussion with other industry leaders about the impact all of this will have on business. I invite you to attend the Summit and join what promises to be a lively conversation.

Based on feedback from industry leaders and customers, the Emprise Technologies and Actian teams have created a handful of sessions designed to deliver best practices that IT professionals can take home and use immediately to improve IT project success. These include “How to Win Business Buy-in for IT Projects”, “Avoiding the Pitfalls of Data Quality” and “Creating Workflows That Ensure Project Success”. I hope you’ll come join us. If you can’t make it to New York, we’re planning to take the Big Data & Integration Summit on the road, so leave us your requested cities and topics in the comments below. We look forward to hearing from you.

The Data Flow Architecture Two-Step

In his latest post on the Actian-hosted Data Integration blog, data management industry analyst Robin Bloor laid out his vision of data flow architecture. He wrote, “We organize software within networks of computers to run applications (i.e., provide capability) for the benefit of users and the organization as a whole. Exactly how we do this is determined by the workloads and the service levels we try to meet. Different applications have different workloads. This whole activity is complicated by the fact that, nowadays, most of these applications pass information or even commands to each other. For that reason, even though the computer hardware needed for most applications is not particularly expensive, we cannot build applications within silos in the way that we once did. It’s now about networks and grids of computers.”
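
To ground that picture, here’s a minimal sketch in Python (my own illustration, not Bloor’s) of two cooperating applications that pass records to each other over a shared channel rather than running in silos. The service names and events are entirely hypothetical.

```python
import queue
import threading

# A shared channel standing in for the network link between two applications.
channel: "queue.Queue[dict]" = queue.Queue()

def order_service() -> None:
    """Producer application: emits order events for downstream consumers."""
    for order_id in range(3):
        channel.put({"order_id": order_id, "amount": 10.0 * (order_id + 1)})
    channel.put(None)  # sentinel: no more events

def billing_service() -> None:
    """Consumer application: reacts to events it receives from the producer."""
    while (event := channel.get()) is not None:
        print(f"billing order {event['order_id']} for ${event['amount']:.2f}")

producer = threading.Thread(target=order_service)
consumer = threading.Thread(target=billing_service)
producer.start(); consumer.start()
producer.join(); consumer.join()
```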

Bloor said, “The natural outcome of successfully analyzing a collection of event data is the discovery of actionable knowledge.” He went on to say, “Data analysis is thus a two-step activity. The first step is knowledge discovery, which involves iterative analysis on mountains of data to discover useful knowledge. The second step is knowledge implementation, which may also involve on-going analytical activity on critical data but also involves the implementation of the knowledge.” Read more →
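
Bloor’s two steps map naturally onto code. The sketch below, in Python with made-up event data and a 3-sigma rule chosen purely for illustration, mines historical events for a threshold (knowledge discovery) and then applies that threshold to incoming events (knowledge implementation).

```python
from statistics import mean, stdev

# Step 1: knowledge discovery -- analyze historical event data for a rule.
# The events and the 3-sigma threshold are illustrative assumptions.
history = [102.0, 98.5, 101.2, 99.8, 100.4, 97.9, 103.1, 100.0]
mu, sigma = mean(history), stdev(history)
threshold = mu + 3 * sigma  # discovered knowledge: values above this are unusual

# Step 2: knowledge implementation -- apply the discovered rule to live data.
def flag_unusual(event_value: float) -> bool:
    """Return True when an incoming event exceeds the learned threshold."""
    return event_value > threshold

incoming = [100.7, 99.2, 131.6]
for value in incoming:
    print(value, "unusual" if flag_unusual(value) else "normal")
```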

How to Win a “CIO of the Year” Award

Do more, with fewer resources, and do it faster.

Almost every CIO I talk with tells me that they face these challenges. They also tell me that dealing with ever-changing compliance requirements adds another layer of difficulty to their jobs. So, what exactly does an IT executive do to tackle these issues and win CIO of the Year?

To find out, let’s take a look at William A. Weeks, CIO of SquareTwo Financial and winner of the 2013 Denver Business Journal CIO of the Year award. InsideArm published an article about Weeks’ award on July 2, 2013. What did he do so well?

“Bill completely repositioned our IT department as a business differentiator, and increased our technology capabilities so we can lead our industry in data reliability, availability, analysis, decisioning, security and regulatory compliance,” said Paul A. Larkins, president and CEO of SquareTwo Financial.

Selected for the midmarket category of companies with $50 million to $500 million in annual revenue, Weeks stood out for:

  • Transforming an IT department and plugging IT into the business
  • Re-engineering and stabilizing legacy systems
  • Reducing costs
  • Delivering numerous automation benefits
  • Raising the industry bar on data security and collection industry regulatory compliance

Specifically, Weeks accomplished these goals by improving data quality and increasing data and application integration, all while strengthening security and compliance. And at the top of the list is that he “plugged IT into the business”. He aligned the IT group with the business and improved the quality of, and access to, the data assets the business needs in order to perform more efficiently. That, I believe, is the secret recipe for an award-winning CIO.

The Cost of Poor Data Management

It is surprising that data quality is still viewed as a luxury rather than a necessity. As an unapologetic data quality advocate, I’ve written white papers and blog posts about the value of good data management. It takes the efforts of many to change habits. In her blog post, The Costs of Poor Data Management, on the Data Integration Blog, Julie Hunt breaks down the impact data quality has on business.

Here’s an infographic on the costs poor data quality can impose on a business.

[Infographic: Global research - bad customer data costs you millions]

She points out that the areas of data quality deserving the greatest focus are specific to each organization. If you read my post, “Avoiding Data Quality Pitfalls”, you know that I’m a proponent of good data governance. Update early and often. My top four suggestions are listed below, with a brief sketch of the last two after the list:

  • Translation Tables
  • Stored Procedures
  • Database Views
  • Validation Lookups, Tables, and Rules
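
To make the last two items concrete, here’s a minimal sketch using SQLite through Python’s sqlite3 module. The customers table, the state-code lookup, and the view are all hypothetical; a real deployment would use its own schema and database engine.

```python
import sqlite3

# Validation lookup table plus a database view, sketched in SQLite.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce the validation lookup

# Validation lookup: only codes present here are accepted downstream.
con.execute("CREATE TABLE valid_states (code TEXT PRIMARY KEY)")
con.executemany("INSERT INTO valid_states VALUES (?)", [("TX",), ("NY",)])

con.execute("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        state TEXT NOT NULL REFERENCES valid_states(code)
    )
""")
con.execute("INSERT INTO customers (name, state) VALUES ('Acme', 'TX')")

# Database view: a governed, reusable picture of the clean data.
con.execute("""
    CREATE VIEW customers_by_state AS
    SELECT state, COUNT(*) AS n FROM customers GROUP BY state
""")
print(con.execute("SELECT * FROM customers_by_state").fetchall())

# A bad code is rejected at write time instead of polluting reports later.
try:
    con.execute("INSERT INTO customers (name, state) VALUES ('Bogus', 'ZZ')")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```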

What are yours? Read Julie’s post, and send me your comments.