
How & Why IT Needs to Dance with Data

BetterCloud

January 30, 2017



Not a week seems to go by without some “data” event hitting the headlines.

For example: A theft. An innovation (think artificial intelligence). Something “driven by Big Data.” A regulation or regime defining new rules about its use and protection. Data Scientists have become the de rigueur role, and DBAs and DevOps are moving into another phase of refinement and development.

Behind the scenes, though, there is a group of people trying to make sure the platforms, engines, pipes, and security wrappers are there to make all this data useful, manageable, and governable.

That would be us in IT.

Partners in Time

Data teams and IT teams are the original partners. In the days of mainframes, the guardians of digitized information were called DP (Data Processing). And as the use of this hardware broadened beyond purely manipulating data, the focus shifted to the technology itself. And so the basis of what is now called IT was born.

Data: “The New Oil”

Technology development, in all its streams, is expanding what we can do with data and how we solve all kinds of problems. For instance, Microsoft has nailed its colors to the mast when it comes to finding a cure for cancer using the applications and platforms it is developing. Google has long been known as a data company, not an applications one. And free online services have never been “free”; they are offered in exchange for the use of your personal data, a trade that, on the whole, we have been happy to make.

As such, data is said to be “the new oil.” Data (i.e., data collection and processing) is going to be a huge source of wealth generation in the future. And the regulatory frameworks around the world are changing as a result. Europe’s GDPR (General Data Protection Regulation), the raft of federal and state regulations in the US, Privacy Shield, T-TIP, and the growing number of data localization rules cropping up in more defensive nation states all point to a very complex trading environment.

And at the center of all of this, within our companies and organizations, are the teams not only charged with making the most of the opportunities that these new technologies provide, but also doing so safely. If ever there was a need for teams to work more harmoniously, Data and IT have to do so now.

Turning “DevOops” into DevOps

The rise of a new slew of disciplines is proof that progress is being made. Seven years ago I created a position called Hosting Manager, a hybrid role that sat between our data and development teams and my IT team. This was really just a formalization of what was happening anyway. Data and Dev teams needed hosting environments with particular specifications, and my team had the skills not only to commission those environments but also to build them so they could be easily fixed when (note I said when, not if!) those teams broke them.

That role is now broadly known as DevOps, and it has an expanded mandate to not just make those hosting environments well managed, but also secure. And in the world of expanding threats, that is a fearsome responsibility.

IT is well placed to manage these new hybrid roles. For years we’ve been protecting our company’s digital assets as guardians of content, networks, and endpoints. Our challenge now is to increase our knowledge of, and integration with, our colleagues on the Data and Dev teams.

As well as providing the mechanical support they need, with the best cloud and hybrid environments that integrate with our on-premises IT, we can also help design and implement these in a way that prevents siloing, duplication, unnecessary access restrictions, and wasted resources. But we need to do all of this whilst protecting the data from unsanctioned eyes.

(Where) Are you looking?

The monitoring and security capabilities now being built into cloud and hosted platforms provide a level of automation and intelligence that will be essential to combat the broadening threat landscape. With traditional on-prem security solutions, the onus was on the sysadmins to collect and analyze syslog and event data from firewalls, switches, routers, and servers. The resulting data sets were often huge, and the applications needed to manage them were very expensive and complex to install, manage, and use.

Hand on heart, how many of us can say we did this with any real quality and consistency? If one of your users downloaded 20GB of data from a server to their laptop, for example, would you have known about it? Are you actively monitoring or blocking known Tor network exits and other high-threat hosts to spot unusual traffic and prevent a ransomware attack? Are your high-value assets not only set up with minimum-privilege access but also monitored for unusual access requests?
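None of those checks needs an exotic toolchain to prototype. As a minimal sketch (not any particular product’s feature): assume you can export flow records as a CSV with src_ip, dst_ip, and bytes columns, and that you keep a locally cached list of Tor exit-node IPs. The file names, columns, and threshold below are assumptions for illustration.

```python
import csv

# Hypothetical threshold: flag any single transfer of 20GB or more.
LARGE_TRANSFER_BYTES = 20 * 1024**3

def load_tor_exits(path):
    """Load a locally cached list of Tor exit-node IPs, one per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def scan_flows(flow_csv, tor_exits):
    """Yield alerts for oversized transfers and traffic touching Tor exits."""
    with open(flow_csv) as f:
        for row in csv.DictReader(f):  # assumed columns: src_ip, dst_ip, bytes
            if int(row["bytes"]) >= LARGE_TRANSFER_BYTES:
                yield ("large-transfer", row)
            if row["src_ip"] in tor_exits or row["dst_ip"] in tor_exits:
                yield ("tor-exit-traffic", row)

if __name__ == "__main__":
    for alert, row in scan_flows("flows.csv", load_tor_exits("tor_exits.txt")):
        print(alert, row["src_ip"], "->", row["dst_ip"], row["bytes"], "bytes")
```

The point is not the twenty lines of Python; it is that the platform has to be designed so this flow data exists and IT can see it.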

We can provide these capabilities with all the relevant dashboards to surface that information quickly, but only if we are involved in the design and implementation of the services collecting and processing the data we have a responsibility to protect. And we can only do that if we are in bed with the people designing those data repositories and applications.

Furthermore, as we sit in the center of most operations in our organizations, we should also be able to share these capabilities with other parts of the business. The Data and Dev departments are not likely to have a mandate, or inclination, to do the same.

Outside In, and Inside In

Before, we adhered to the adage: “Outside bad, inside good.” We can’t make that assumption anymore. The biggest risk factor is our people. User compromise via phishing and horizontal scanning is the quickest and most effective way for someone to harvest valuable data from our systems. And whilst user awareness and education can reduce the risk, we should also be compartmentalizing our networks so that if one part is compromised, the attacker cannot move very far within them.
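To make “cannot move very far” concrete, here is a minimal sketch of a deny-by-default segmentation policy. The three segments, their address ranges, and the allowed flows are hypothetical; in practice the enforcement lives in firewalls and VLAN ACLs, not in application code.

```python
import ipaddress

# Hypothetical segments; real layouts will differ.
SEGMENTS = {
    "users":   ipaddress.ip_network("10.1.0.0/16"),
    "servers": ipaddress.ip_network("10.2.0.0/16"),
    "data":    ipaddress.ip_network("10.3.0.0/16"),
}

# Deny by default: list only the cross-segment flows the business needs.
ALLOWED_FLOWS = {("users", "servers"), ("servers", "data")}

def segment_of(ip):
    """Return the segment name an address belongs to, or None."""
    addr = ipaddress.ip_address(ip)
    return next((name for name, net in SEGMENTS.items() if addr in net), None)

def is_allowed(src_ip, dst_ip):
    src, dst = segment_of(src_ip), segment_of(dst_ip)
    if src is None or dst is None:
        return False  # unknown address: deny
    return src == dst or (src, dst) in ALLOWED_FLOWS

print(is_allowed("10.1.5.9", "10.2.0.4"))  # True: users -> servers
print(is_allowed("10.1.5.9", "10.3.0.4"))  # False: users can't reach data directly
```

A compromised machine in “users” can reach “servers” but not “data”: that is the containment described above.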

However, when I’ve tried to do this in the past, we ended up creating as much harm as we protected against by restricting the ability to move data around or to allow cross-border access. We created a very safe network, with highly secure segmentation, making a lateral breach almost impossible. What we also did, though, was hamstring a new hybrid team, drawn from across business units, by preventing its members from working with each other. If we had been talking more closely with our Data and Dev colleagues, as well as senior management, we could have prevented a lot of this pain.

Big Data is Only Getting Bigger

In the realm of Big Data, the challenges are increasing. Data sets are now of a size that makes transporting them impractical. So applications, not the data, need to be portable. And if the value of all this data is to be properly realized, it needs to be accessible to many different “customers” in many different ways. These customers can be internal, external, or a combination of the two. Data needs to talk to data, and to be queried and referenced by multiple applications from multiple sources, 24×7, all the while retaining its CIA (confidentiality, integrity, availability).
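What “portable applications” means in practice is shipping the question to the data and bringing back only the answer. A toy sketch, with sqlite3 standing in for whatever engine actually hosts the data set (the table and figures are invented for illustration):

```python
import sqlite3

# Stand-in data store; in practice this is the remote engine holding the data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (amount REAL)")
conn.executemany("INSERT INTO transactions VALUES (?)",
                 [(10.0,), (25.5,), (7.25,)])

# The "portable application": an aggregate query executed where the data
# lives, so only a tiny result set travels, not the table itself.
count, avg = conn.execute(
    "SELECT COUNT(*), AVG(amount) FROM transactions"
).fetchone()
print(f"{count} rows summarized in place, average {avg:.2f}")
```

The same principle scales up: the terabytes never leave the repository; the few bytes of answer do.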

Develop Prescience by Dancing in Step

The challenges are big, but so are the opportunities. And we are only going to realize those opportunities if the main dancers at this party are in step with each other.

We can’t have one doing the tango and the other a waltz. IT needs to know where Data is going next, but Data needs to know what’s possible and how best to get there. And to do this well, we need to develop a “prescience” that comes from working very closely together.

Of all the partnerships within organizations, this should be one of the easier ones. We are both technical disciplines and have an intimate relationship with the technology we wield. And historically, we share a common parent. For the benefit of our organizations, it’s critical that we leverage this commonality.

About the Author

Gavin Whatrup (@gwhatrup) started out helping people do innovative things with data. Nearly 30 years later, he’s now helping organizations protect that data, take advantage of cloud-based opportunities, and reimagine the role IT can play in the new age. An early adopter of virtualization and hybrid cloud, Gavin recently managed his organization’s migration to Office 365, across 12 companies and 1,000 users. From a small data-analysis company, via a marketing start-up and media and advertising agencies, to a marketing communications group, Gavin has tracked the rise and rise of IT as a core corporate function, which at its heart is a people-based service, doing amazing things on a daily basis.