
Case Study: Feeding America Takes on Project to Standardize Data and Improve Data Quality

By Jennifer Zaino  /  August 17, 2017

Feeding America is a domestic hunger-relief charity with a nationwide network of close to 200 member food banks that work together to provide food to more than 46 million people through 60,000 food pantries and meal programs. It secures donations from national food and grocery manufacturers, retailers, shippers, packers, growers, government agencies, and other organizations. Its staff works with its partners to match excess food to the food banks that need it in all 50 states, the District of Columbia, and Puerto Rico, including coordinating logistics and supply chain services to the warehouses that handle distribution to feeding programs.

An organization with such an enormous reach across partners, programs, and recipients (some one in seven Americans are served by it) understandably must deal with significant data processing and Data Quality issues. “It’s pretty obvious this is a complicated chain of operations and data and information flowing through at every level,” says Theresa DelVecchio Dys, Director of Social Policy Research and Analysis at Feeding America. A current focal point is delivering a data technology component for its clients: collecting their data on an ongoing basis and in a consistent way.

Along with garnering feedback and input from clients, the aim is to develop a standardized approach across food banks and their agencies to register the individuals they serve and to keep that information updated. That way, these organizations can help make sure individuals are properly served in the communities where they live, and that food resources are allocated appropriately to meet their needs.

“We want to figure out how to get 60,000 programs to use similar technology to collect similar types of data from clients and [create processes] for how that flows back to food banks and then back to the national office,” DelVecchio Dys explains.

It’s important to have that data on a regular basis, far faster than research studies can provide it. But today the data coming in from pantries and meal programs to food banks arrives in a variety of formats, from paper to Microsoft Excel spreadsheets to web-based technology. That inconsistency creates a drag on all the data processing and Data Quality work that follows to get food where it’s needed.
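The format problem described here can be pictured with a small sketch. This is purely illustrative Python with invented field names, not Feeding America's actual schema: a handful of aliases map records that arrive in different shapes (say, a spreadsheet export versus a web form) onto one shared layout.

```python
# Illustrative only: collapse differently named source fields onto a
# common schema. The aliases and field names below are invented.

def normalize_record(raw: dict) -> dict:
    """Map a raw intake record onto a shared set of field names."""
    aliases = {
        "household_size": ["household_size", "hh_size", "Household Size"],
        "zip_code": ["zip_code", "zip", "ZIP"],
        "visit_date": ["visit_date", "date", "Date of Visit"],
    }
    out = {}
    for field, names in aliases.items():
        for name in names:
            if name in raw:
                out[field] = raw[name]
                break
        else:
            out[field] = None  # record missing data explicitly
    return out

# Records from two different sources end up with the same shape:
a = normalize_record({"hh_size": 4, "ZIP": "60601", "date": "2017-08-01"})
b = normalize_record({"Household Size": 2, "zip": "10002",
                      "Date of Visit": "2017-08-02"})
```

Once every source is normalized this way, downstream reporting no longer has to know which pantry's format a record came from.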

Starting the Data Quality Journey

Data Blueprint is the Data Management consulting firm that Feeding America is working with to move this effort forward. “At Feeding America, they are in the process of improving the business and thinking about data first,” says Micah Dalton, COO at Data Blueprint. “There are tons of challenges in moving the mindset of an organization that wants that and is trying to get there.”

For example, many of the pantries, soup kitchens, and other meal programs that directly serve clients are run by older adult volunteers as part of faith-based programs, and they’re often not particularly familiar with technology. Those sites that leverage database solutions like Microsoft Access, for instance, are considered among the more advanced, but there are plenty of instances where volunteers find it complicated even to work through Excel, he explains.

Manual intake processes that clients go through before getting food the first time, he notes, do their part to create inefficiencies. When that information is recorded on paper, getting it into electronic shape for reporting to food bank partners involves aggregating information from multiple stacks of paper, essentially duplicating efforts and leaving lots of room for error, too.

“There’s a need for Data Quality right in the process,” Dalton says. There’s also a huge opportunity when no formalized processes exist to have the right technology framework step in:

“Address that problem and standardize the way agencies can recognize clients similarly so that they can have an understanding of the people they are serving, what their needs are, and how they change.”

Why Quality Data Counts

DelVecchio Dys expands on the point and the relationship between Data Quality and winning the fight to end hunger. “We need quality data to tell us what the issues are and where the problems are, and that’s why we think this program and project is important,” she says. Donors that can supply funding want that kind of data. So do the legislators that Feeding America must lobby:

“On the Hill, legislators want to know which constituents in their district are food insecure: their race, ethnicity, gender, age, and the number of kids that are experiencing hunger. We can’t do that without having this quality data,” she says.

Research studies that can provide some of that information take too long to do on a regular basis, but having it roll up from the sites where the services are deployed means “we can better communicate our message to legislators, donors, and the general public,” she says. It’s a massive transformation to go from leveraging research-focused data about food and hunger issues from a year ago, to what is happening now across the country from a hunger standpoint, Dalton adds, “And how to leverage the network of food agencies to understand the here and now and show that impact on a real-time basis.”

Additionally, such data can be leveraged to support intervention programs for specific populations, like children. “We need data to design those interventions and to see that they are doing what we intend them to do,” she says. Feeding America can’t do that without quality data on an ongoing basis, she adds.

Work in Progress

The transformational nature of the program, Dalton says, requires involving a broad set of stakeholders within the framework of the effort to implement a client data tracking solution. That has meant engaging with member food banks and pantries to understand their current state and processes, what their data looks like, and what technology they currently use.

One deliverable, he says, is a conceptual data model and business glossary that consistently define an understanding of the client. That needs to happen both for the individual’s core elements (for example, what is meant by “household member”) and for tangential elements that Feeding America might want to collect in the future with reliability and consistency. “Programs vary dramatically and are extremely diverse,” says DelVecchio Dys. “So when we think of developing a framework to support our members it must fit all their needs.” They’re getting feedback and sharing it among parties all the time as an iterative process.
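One way to picture what a conceptual data model plus a business glossary buys you is to encode the shared definitions directly, so every agency means the same thing by "client" and "household member." The sketch below is an invented illustration, not the model the project is actually building; the glossary definitions live in the docstrings.

```python
# Invented illustration of a shared client model; the real conceptual
# model would come out of the stakeholder process the article describes.
from dataclasses import dataclass, field

@dataclass
class HouseholdMember:
    """Glossary (illustrative): a person who shares meals and resources
    with the registered client."""
    age: int
    relationship: str  # e.g., "child", "spouse"

@dataclass
class Client:
    """Glossary (illustrative): the individual registered at intake on
    behalf of a household."""
    client_id: str
    zip_code: str
    household: list = field(default_factory=list)

    @property
    def household_size(self) -> int:
        # By this definition, the client counts as a household member too.
        return 1 + len(self.household)
```

The point of pinning down definitions like `household_size` in one place is that every program reports the same number the same way, instead of each site deciding whether the client counts.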

Having an extended and equally standardized business glossary should reap big benefits. “That helps Feeding America move from a reactive understanding of what clients do to proactive data to build a profile,” Dalton says. Such a profile can encompass things like the frequency of client visits; specific times of the month, or even days, when demand is higher, to facilitate planning for volunteers at pantries; and individual food preferences. For example, if the data can show which types of food are often chosen by individuals who visit certain client-choice-model pantries, which operate like small-scale grocery stores, that can give Feeding America’s national office insight into seeking more of those items at discount or donation to send to those particular sites, DelVecchio Dys says. “We can have a line of sight to incorporate that,” Dalton adds.
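The kind of proactive profile Dalton describes, such as spotting which days demand peaks, becomes a simple roll-up once visit records are standardized. A minimal sketch with invented dates:

```python
# Illustrative roll-up: count standardized visit records by weekday to
# spot peak-demand days for volunteer planning. Dates are invented.
from collections import Counter
from datetime import date

def weekday_profile(visit_dates):
    """Count visits by weekday name."""
    return Counter(d.strftime("%A") for d in visit_dates)

visits = [date(2017, 8, 1), date(2017, 8, 8), date(2017, 8, 2)]
profile = weekday_profile(visits)  # e.g., Tuesdays busier than Wednesdays
```

The same pattern extends to times of month or food-preference tallies; the hard part the article describes is not the roll-up but getting 60,000 programs to record comparable inputs.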

A pilot is currently operating with five food banks, with plans to run through October before developing a final framework based on consolidated feedback by January 2018. Each food bank is attempting to get 20 to 50 of their feeding program partners to leverage the same operational processes to collect the same data points.

The pilot is being used to understand the depth of knowledge each organization has, how quickly they can adapt to moving off paper to automated environments, and what data they are capable of collecting with some level of standardization that can feed the final framework. Data Blueprint is providing recommendations for updates and modifications to technologies so that they will be consistent with the final framework.

It’s complicated not only by the difficulty of getting volunteers to adapt to new technology solutions but also by the fact that different pantries may prefer different types of devices (tablets vs. laptops, for instance), and some pantries may not even have Internet connections. That means the system may have to incorporate offline collection capabilities with online delivery from another point. “A key piece of data for food banks to use is a technology buying guide as a jumpstart to the procurement process for partners so they can figure out what will best suit their needs,” Dalton says. Additionally, processes will have to be put in place so that volunteers can ensure clients don’t feel threatened by someone entering their information into a computer.
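An offline-capture-then-sync flow of the kind described above can be sketched in a few lines. Everything here is illustrative; the `send` callable is a stand-in for whatever upload mechanism a pantry's system would actually use.

```python
# Illustrative offline-first collection: records are queued locally and
# flushed to a central endpoint once a connection is available.

class OfflineQueue:
    def __init__(self, send):
        self.send = send      # callable that uploads one record
        self.pending = []     # records captured while offline

    def capture(self, record):
        """Record intake data locally, with or without a connection."""
        self.pending.append(record)

    def sync(self):
        """Try to upload everything; keep whatever fails for next time.
        Returns the number of records still pending."""
        still_pending = []
        for record in self.pending:
            try:
                self.send(record)
            except ConnectionError:
                still_pending.append(record)
        self.pending = still_pending
        return len(still_pending)
```

A design like this lets a pantry with no Internet connection collect data all day and deliver it later from another point, which is the capability the article says the system may need.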

According to Dalton, one thing that has come to light through the pilot is that it’s important to provide enough detail to offer a data framework that agencies can follow, but not so much that you’ve pinned them into a place that doesn’t work for their environment. “We’re getting information from the pilots to try to right-size this throughout,” he says.

Photo Credit: Discha-AS/Shutterstock.com

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
