Question: how many of you at community hospitals have a few million dollars lying around for analytics?
To quote Ben Stein, “Anyone… Anyone…”
Didn’t think so.
Another question: how many of you need to answer basic questions regarding re-admissions, quality metrics, population health, etc.?
So, how do we get these questions answered in the face of budget constraints?
I would like to propose one method.
The basic problem
Most of the questions that come my way are fairly straightforward. The complexity arises largely because systems don’t talk to each other. Imagine a world where every piece of data was in ONE SYSTEM! How easy would analytics be then! The reality is that we are faced with many systems and sources of data in healthcare and we don’t have an easy way of getting at the data.
Enter the Data Lake
One tool that we use to address data gathering is a Data Lake. I am often asked how a Data Lake and a Data Warehouse differ. A Data Lake is simply a repository where data is stored in its native format. Data Warehouses are defined data structures that model a business area or need. The better, more complete the model, the better the warehouse.
Data Warehouses are at their best when they have an objective, when the business area they deal with is fairly well defined. That isn’t the case in modern healthcare. The questions come hot and heavy from all areas, and the data does likewise.
Data Warehouses are “Pay Now” systems: you pay upfront to define and optimize the structure, then normalize and load the data.
Data Lakes are “Pay Later”: it’s a comparatively straightforward matter to take what you’re given and put it somewhere. You pay when you need to use the data: you pull what you need, relate it somehow, normalize and optimize it, and visualize it.
The idea with the Lake is that you can “mix and match” the data as needed to address the question at hand.
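To make the “pay later” idea concrete, here is a minimal sketch in Python. The file names, field names, and data are all hypothetical stand-ins: two sources land in the lake in their native formats (a CSV extract and a JSON feed), and nothing is modeled until a question arrives — at which point we parse, relate, and filter on the fly.

```python
import csv
import io
import json

# Hypothetical lake contents, kept in their native formats.
# In practice these would be files in the lake; here they are in-memory stand-ins.
ADMISSIONS_CSV = """patient_id,admit_date,condition
101,2024-01-03,CHF
102,2024-01-05,COPD
101,2024-01-20,CHF
"""

LABS_JSON = '[{"patient_id": "101", "test": "BNP", "value": 900}]'

# "Pay now" (warehouse) would have modeled and keyed these up front.
# "Pay later" (lake): parse only when the question arrives.
admissions = list(csv.DictReader(io.StringIO(ADMISSIONS_CSV)))
labs = json.loads(LABS_JSON)

# Mix and match on demand: relate labs to patients admitted for CHF.
chf_patients = {row["patient_id"] for row in admissions if row["condition"] == "CHF"}
chf_labs = [lab for lab in labs if lab["patient_id"] in chf_patients]

print(len(chf_patients), len(chf_labs))  # prints: 1 1
```

The cost shows up exactly where the “pay later” framing predicts: the join key (`patient_id`) and the normalization rules had to be worked out at query time, not load time.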
The reasons we use a Data Lake:
1) It stores a large amount of data from a wide variety of sources.
2) The data has no discernible or immediately identifiable keys to match on.
3) We lack a platoon of DBAs, Data Modelers, and Business Analysts to define a warehouse.
4) We don’t know what business areas this data will need to address.
Truth be told, I would much prefer a Data Warehouse, but when the data is so varied, the questions are so fluid, and the resources so constrained, the “pay later” approach seems to make the most sense.
Borrowing from Agile
So now that we have the data, how do we best get the answers we seek?
In Scrum (Agile) methodology, you build small working versions of a product at predefined intervals, eventually assembling a complete product. This idea seems perfect for our analytics dilemma!
Why? Agile is concerned with doing things in little pieces… complete pieces, but piece-parts nonetheless. It works in teams of “7 plus or minus 2,” and all functions are performed by one team.
So here’s the plan…. Feel free to try it the next time you need a question answered and find yourself without an analytics team!
1) Select a “Scrum Master”. This will be the “point person” who coordinates the process.
2) Clearly define the question. For example, “How many re-admissions of condition X are there?”
3) Have a meeting with business and data people close to the area in question. During this meeting:
a. Define a list of required data points.
b. Select the first three or four data points to gather.
c. Define a time period for the team to come up with these data points. During this time the team will gather the data. Traditionally this “sprint” is defined as two weeks, but it can be any reasonable period the team is comfortable with.
4) Have a daily 15-minute call to uncover any obstacles. Any issues should be addressed immediately to keep the process moving.
5) At the end of the time period, meet and assign more data gathering if required.
6) Repeat steps 3-5 until enough data is captured to address the question.
7) When enough data is captured, have a final “sprint” to visualize the data into a presentable finding.
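To show what the final “sprint” might produce, here is a minimal sketch of answering the example question from step 2, “How many re-admissions of condition X are there?” The record layout, field names, and 30-day window are assumptions for illustration, not part of the method above.

```python
from datetime import date

# Hypothetical output of the data-gathering sprints: one record per admission.
admissions = [
    {"patient_id": 101, "condition": "CHF", "admit": date(2024, 1, 3)},
    {"patient_id": 101, "condition": "CHF", "admit": date(2024, 1, 20)},
    {"patient_id": 102, "condition": "CHF", "admit": date(2024, 3, 1)},
]

def count_readmissions(records, condition, window_days=30):
    """Count admissions for `condition` occurring within `window_days`
    of that patient's previous admission for the same condition."""
    by_patient = {}
    for rec in sorted(records, key=lambda r: r["admit"]):
        if rec["condition"] != condition:
            continue
        by_patient.setdefault(rec["patient_id"], []).append(rec["admit"])
    count = 0
    for admits in by_patient.values():
        # Compare each admission with the one before it.
        for prev, nxt in zip(admits, admits[1:]):
            if (nxt - prev).days <= window_days:
                count += 1
    return count

print(count_readmissions(admissions, "CHF"))  # prints: 1
```

Patient 101 is readmitted 17 days after the first CHF admission, so it counts; patient 102 has only one admission, so it does not. Each sprint can extend the record layout (adding the data points chosen in step 3b) without reworking this final step.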
Questions that are very targeted work best with this method: questions that start with “What’s the number of…” or “How many….” Larger questions that can be broken down into smaller data-gathering tasks are good candidates as well.
As more data is created and demands on organizations increase, so will the need for flexible approaches to analytics. I hope this has given you food for thought about the application of agile techniques to analytics.