Setting up a scalable data strategy early and sticking to it is critical for the success of your analysis efforts. Without high-quality data, delivering high-fidelity insight is an insurmountable task for any analytics desk.
This means having a road map and processes in place to accomplish the following goals:
- Capture Incoming Data
  - Prices from suppliers, data from web services, etc.
- Utilize Internal Data
  - Unfettered access to data warehouses that harmonize ERP data, quality data, etc.
- Capture Outgoing Data
  - Sales information, payment terms, volumes, contracts, etc.
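To make the first goal, capturing incoming data, concrete, here is a minimal sketch in Python, using an in-memory SQLite database as a stand-in for your warehouse's staging layer. The table, column, and supplier names are all hypothetical; in practice the records would arrive from a supplier feed or web-service call.

```python
import sqlite3

def land_supplier_prices(conn, records):
    """Land raw supplier price records in a staging table, keyed so that
    re-running the same feed does not create duplicates."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS stg_supplier_prices (
               supplier TEXT, sku TEXT, price REAL, quoted_on TEXT,
               PRIMARY KEY (supplier, sku, quoted_on))"""
    )
    conn.executemany(
        """INSERT OR REPLACE INTO stg_supplier_prices
           (supplier, sku, price, quoted_on) VALUES (?, ?, ?, ?)""",
        [(r["supplier"], r["sku"], r["price"], r["quoted_on"]) for r in records],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
# Hypothetical records, as if parsed from a supplier's price feed
land_supplier_prices(conn, [
    {"supplier": "Acme", "sku": "A-100", "price": 12.50, "quoted_on": "2018-06-01"},
    {"supplier": "Acme", "sku": "A-100", "price": 12.10, "quoted_on": "2018-06-08"},
])
count = conn.execute("SELECT COUNT(*) FROM stg_supplier_prices").fetchone()[0]
print(count)  # 2
```

The point is not the specific tool but the pattern: incoming data lands in a known, keyed, re-runnable staging step before anyone analyzes it.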
An effective data architecture gives analysts access to detailed information on your organization's revenue (e.g., sales) and expenditures, access to external market data, and the flexibility to intertwine these sources. ERPs have long been in place in most large companies, and these systems are important for regulatory compliance and financial operations. However, they are almost uniformly unfit for analysis beyond simple reporting. Creating, maintaining, and regularly updating a unified data warehouse is paramount, and a prerequisite to a fully operational analytics team.
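As an illustration of what "harmonize" can mean in practice, the sketch below builds a single view over two hypothetical source tables, an ERP sales extract and a quality log, again using SQLite as a stand-in. Every table and column name here is an assumption, not a prescription; the idea is that analysts query one unified layer without needing to know the source systems.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical source tables, as extracted from two separate systems
    CREATE TABLE erp_sales (order_id TEXT, sku TEXT, revenue REAL);
    CREATE TABLE quality (sku TEXT, defect_rate REAL);
    INSERT INTO erp_sales VALUES ('O-1', 'A-100', 500.0), ('O-2', 'B-200', 300.0);
    INSERT INTO quality VALUES ('A-100', 0.02), ('B-200', 0.05);

    -- The "unified" layer: one view joining the sources on a shared key
    CREATE VIEW unified_sales AS
    SELECT s.order_id, s.sku, s.revenue, q.defect_rate
    FROM erp_sales s LEFT JOIN quality q ON q.sku = s.sku;
""")

rows = conn.execute(
    "SELECT order_id, revenue, defect_rate FROM unified_sales ORDER BY order_id"
).fetchall()
print(rows)  # [('O-1', 500.0, 0.02), ('O-2', 300.0, 0.05)]
```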
Importantly, the creation of your data systems is not something to outsource. I repeat: This is not the step to outsource.
The people who outline your data roadmap should also implement it. Building institutional knowledge about your data, and about the processes by which it is updated, audited, and expanded, is essential. Without that knowledge base, your organization is on the path to redundant work, organizationally fractious data hoarding, and localized, non-scalable solutions to enterprise-wide problems.
Here’s the part where many would-be organizational success stories end: the group that sets up and maintains these data systems must not be an island, silo, or a vertical. People from this group should be on project teams with analysts. They know what data exists, where it is, when it gets updated, and how it should be accessed. Without access to your data gurus, you’re paying analysts to reverse-engineer work you’ve already paid for once!
To deliver quantitative results, an analytics group requires DBAs, developers, and cross-functional specialists who can embed in project groups with your analysis team. Depending on the size of your organization, you may have only three or four people in IT, but it is key that one of them be available to your analysts. The good news is that, as a project matures, the IT/data-guru role shrinks rapidly as analysts learn the relevant systems and how to set up pipelines to the data they need.
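The kind of pipeline an analyst eventually takes over might look like the following sketch: a small, parameterized extract from the warehouse that they can re-run themselves, rather than a one-off request to IT. As before, the table and column names are hypothetical.

```python
import sqlite3

def extract_sales_by_sku(conn, min_revenue):
    """Pull per-SKU revenue at or above a threshold, highest first."""
    rows = conn.execute(
        """SELECT sku, SUM(revenue) AS total
           FROM erp_sales
           GROUP BY sku
           HAVING SUM(revenue) >= ?
           ORDER BY total DESC""",
        (min_revenue,),
    ).fetchall()
    return [{"sku": sku, "total": total} for sku, total in rows]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE erp_sales (order_id TEXT, sku TEXT, revenue REAL);
    INSERT INTO erp_sales VALUES
        ('O-1', 'A-100', 500.0), ('O-2', 'B-200', 300.0), ('O-3', 'A-100', 250.0);
""")
extract = extract_sales_by_sku(conn, 400.0)
print(extract)  # [{'sku': 'A-100', 'total': 750.0}]
```

Once an analyst can write and schedule extracts like this, the data guru's involvement naturally drops to occasional consultation.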
This may be different from how project groups are set up in your organization today, but take a look at what this approach yields:
- Analysts rapidly acquiring and disseminating expertise on your data systems, increasing their own effectiveness within the organization with no explicit training
- An IT function that understands precisely how its data is being utilized
  - IT no longer needs to work from assumptions, rapidly aging scoping documents, or secondhand knowledge of your systems that individuals brought with them from other organizations; they have it straight from the horse's mouth
- Data transparency to analysts and utilization transparency to IT mitigate dependency on individuals
  - This way, you avoid having your star player poached and the rest of your team crippled as a result
- Working together creates cohesion among your functions, facilitating an ownership mentality that the traditional here's-your-ticket service model cannot match
- You can't replicate a behemoth company's IT structure, but by bringing a few dedicated individuals into your group, you'll see a massive increase in effectiveness
Not a bad return for a simple variation on tasks you need to do anyway.
As you've noticed, this analytics deal is a serious investment. It requires flexible, dynamic people who bring a lot to the table and are still hungry to learn. You need to be willing to facilitate a new approach to work that is not the norm, and you'll need to accept a learning curve on both the IT and analysis side of the equation.
There's an easily underestimated amount of discipline involved in setting up your group for success: some tasks may seem trivial and easy to farm out of your group or company, but outsourced, black-box solutions always grow stale, and they inevitably need to be changed or upgraded when you scale up or reorganize. At that point, the outsourced solution has to be taken apart and re-engineered by people who already have 10 hours of work to do each day.
Handling these foundational tasks internally today is an insurance policy against potentially huge costs in the future.
Copyright © 2018