To keep up with evolving technology and customer behavior, it’s essential for companies to work with partners that share a similar long-term approach to addressing current and future challenges. Read below to learn about the core beliefs guiding Tealium’s product development, which help ensure new products and features deliver significant value.

This is Part 5 of a 7-part Series. Read other parts: Part 1 | Part 2 | Part 3 | Part 4 | Part 6 | Part 7

Matt Parisi, Interviewer: (cont’d from Interview Part 4) Core Belief #4, “The amount of data available regarding customers has expanded beyond the ability of human beings to effectively monitor and manage.”

This is gaining some steam in the market. What are some ways Tealium is working to leverage AI? What has come out, and what are we working on for the future?

Mike Anderson, Tealium Co-founder and CTO: The volume of data brands manage has exploded over the last 10 years, and the ability to collect that data has exploded along with it. Customers these days have not just one device, but several. From a volume perspective, we handle a lot of transactions: more than Google searches, YouTube videos, Tweets, and Amazon purchases combined. We handle more transactions in a day than McDonald’s serves individual french fries in a day.

[Image: Tealium’s Event Specifications feature, used to maintain data quality and readiness as data is collected]

Marketers have been saying for years that they need more data about their customers, and they’ve gotten their wish, tenfold. Now we’re in a situation where there’s so much data coming in that it’s a challenge: people are very active on websites and across multiple devices, they’re going into stores, making purchases in stores, and so on.

People research a product online but go into the store to make the purchase, and brands want to understand this. A brand now has the opportunity to get feedback from the customer on each specific experience or touchpoint, to re-inspect its own operations, and to engage with the customer to help them accessorize whatever they purchased. If you, as a brand, know that a particular customer experience is going to happen, you can provide a better, mutually beneficial experience right on the spot. For example, a customer purchases a couch online and is coming in after 3:00 to pick it up; I know what that customer was interested in from their online behavior, and I have that history at my fingertips to tailor the in-store experience.

It’s a combination: we have a lot of data and data opportunities out there, but at the same time we have these really fleeting moments in which to put them to use. That’s the challenging combo. The challenge for marketers and CX professionals is that there’s a lot of data to sift through to figure out what’s interesting and what to ignore, and they also have to be able to recall it and act on it at the right time. This is where we see AI and ML playing a big part in the future of how we use data.


MP: From a product standpoint, how important is it to have tools that ensure the data coming in is high quality and in the right format for downstream use, right at the point of collection?

MA: If data comes in that needs to be cleaned up, enriched with lookup values, processed, and validated after collection, all of that takes time. However, if I put those processes right into the data flow, I don’t slow the data down, and I have a greater window in which to use that data to generate results. So it’s very important.
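To make the idea concrete, here is a minimal sketch of in-stream processing, assuming a simple composable pipeline. The step names and the pipeline helper are illustrative inventions, not Tealium’s API.

```typescript
// Minimal sketch of inline (in-stream) event processing, illustrative only.
// Each step runs as the event flows through, so no batch cleanup is needed later.

type EventRecord = Record<string, unknown>;
type Step = (event: EventRecord) => EventRecord;

// Clean-up step: normalize an email field if present.
const normalizeEmail: Step = (e) => ({
  ...e,
  email: typeof e.email === "string" ? e.email.trim().toLowerCase() : e.email,
});

// Lookup step: insert a derived value from a reference table.
const CURRENCY_BY_COUNTRY: Record<string, string> = { US: "USD", DE: "EUR" };
const addCurrency: Step = (e) => ({
  ...e,
  currency: CURRENCY_BY_COUNTRY[String(e.country)] ?? "USD",
});

// Compose the steps into a single in-stream transform.
const pipeline = (steps: Step[]): Step => (event) =>
  steps.reduce((acc, step) => step(acc), event);

const processInStream = pipeline([normalizeEmail, addCurrency]);

console.log(processInStream({ email: " Ana@Example.COM ", country: "DE" }));
// -> { email: "ana@example.com", country: "DE", currency: "EUR" }
```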

We have a huge initiative with our customers, and have for as long as we’ve been doing this, around data readiness. We make sure the data is good at creation rather than creating bad data and cleaning it up later down the line, which is what batch processes generally do: garbage in, garbage out. Instead, we analyze data in the stream. When data is ready to go in the stream, it’s easier and faster to analyze, and it takes less work to “fix.”

This comes back to the Universal Data Object (UDO). How you collect data impacts how you can use it. Take a step back and define a UDO that speaks to your business, not to the specific vendors you want to use. Put that in place and it becomes the foundation of your data. Then, when we bring that data in, we know we have good, clean data coming in. The idea is that as we collect the data, we put rules and specifications in place to make sure it is what I expect to be collecting and streaming, and what I expect to feed into ML (machine learning) or automation.
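For illustration, a business-oriented UDO could be typed like the sketch below. The utag_data name follows Tealium’s JavaScript data layer convention, but every field here is an example, not a prescribed schema.

```typescript
// Illustrative sketch of a Universal Data Object (UDO) defined in business terms
// (what happened, to whom, with which products) rather than vendor-specific fields.
// Field names are examples only, not a prescribed Tealium schema.

interface UniversalDataObject {
  page_name: string;           // where the event happened
  event_name: string;          // business event, e.g. "product_purchase"
  customer_id?: string;        // who it happened to, if known
  product_id: string[];        // parallel arrays describing products in the event
  product_quantity: number[];  // quantities as numbers, not strings
  order_id?: string;           // present only on purchase events
  order_total?: number;
}

// Example instance populated at collection time.
const utag_data: UniversalDataObject = {
  page_name: "checkout:confirmation",
  event_name: "product_purchase",
  customer_id: "c-102938",
  product_id: ["sku-481"],
  product_quantity: [1],
  order_id: "ord-7741",
  order_total: 899,
};
console.log(utag_data.event_name);
```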

I want to ensure I’m not pushing in order IDs that are undefined, or ‘product quantity’ strings that are empty when they should contain numbers. That negatively impacts use of the data. The data coming in needs to be clean and has to pass a growing list of validation routines. Beyond that, it’s about making the data accessible (i.e., leveraging Tealium ML capabilities, sending data off to your own data science team, or a combination of the two) so the results can come back into the actionability part of the Tealium ecosystem. Data collection is directly tied to data activation.
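A validation routine along those lines might look like the sketch below. In Tealium, checks like these are configured through features such as Event Specifications rather than hand-coded, so treat this purely as an illustration of the idea.

```typescript
// Sketch of the validation described above: flag undefined order IDs and
// empty or non-numeric product quantities. Illustrative only.

type EventRecord = Record<string, unknown>;

function validateEvent(event: EventRecord): string[] {
  const errors: string[] = [];

  // Purchase events must carry a non-empty order ID.
  if (event.event_name === "product_purchase") {
    if (typeof event.order_id !== "string" || event.order_id.length === 0) {
      errors.push("order_id is missing or empty on a purchase event");
    }
  }

  // Quantities must be real numbers, not empty strings.
  const quantities = event.product_quantity;
  if (Array.isArray(quantities)) {
    for (const q of quantities) {
      if (typeof q !== "number" || Number.isNaN(q)) {
        errors.push(`product_quantity contains a non-numeric value: ${JSON.stringify(q)}`);
      }
    }
  }

  return errors; // an empty array means the event passes this routine
}

console.log(
  validateEvent({ event_name: "product_purchase", order_id: "", product_quantity: [""] })
);
// -> two errors: missing order_id, non-numeric quantity
```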


So, we can say we’ve learned X about this data and X about this customer, and make predictions about their behavior based on the patterns they’re exhibiting in real time. Take funnel abandonment and marketing re-acquisition, which are the foundation of email marketing and B2B marketing. The idea is that someone showed interest, lost it, and left, so I use email, social, and ad tech to try to get them to come back and re-engage with me. But what if, based on their behavior as they started going through the engagement funnel, we could predict they would abandon? We face this all the time. Put ML on top of it and determine, with significant confidence, that this person is likely to abandon, then take advantage of that opportunity in the moment.
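As a toy illustration only (this is not Tealium’s ML), an in-session abandonment score could combine behavioral signals into a probability and trigger an action once it crosses a confidence threshold. Every signal name and weight below is invented; in practice the weights would come from a trained model.

```typescript
// Toy sketch of in-session abandonment scoring. Weights are invented;
// a real deployment would learn them from historical funnel data.

interface SessionSignals {
  minutesIdle: number;            // time since the last interaction
  cartValue: number;              // value of items currently in the cart
  checkoutStepsCompleted: number; // progress through the funnel
  priorAbandonments: number;      // historical behavior for this customer
}

// Logistic function turns a weighted sum of signals into a 0..1 score.
function abandonmentScore(s: SessionSignals): number {
  const z =
    -2.0 +
    0.4 * s.minutesIdle +
    0.6 * s.priorAbandonments -
    0.5 * s.checkoutStepsCompleted +
    0.001 * s.cartValue;
  return 1 / (1 + Math.exp(-z));
}

// Act in the moment when confidence is high enough.
const score = abandonmentScore({
  minutesIdle: 6,
  cartValue: 899,
  checkoutStepsCompleted: 1,
  priorAbandonments: 2,
});
if (score > 0.7) {
  console.log(`Predicted abandonment (p=${score.toFixed(2)}): trigger a re-engagement offer now`);
}
```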

For people who want their business to run on data, these things are now becoming a reality thanks to the computational power we have.

End of Interview Part 5. Read other parts here: Part 1 | Part 2 | Part 3 | Part 4 | Part 6 | Part 7

This was Part 5 of a 7-part interview with Tealium CTO Mike Anderson about how Tealium’s core beliefs guide product development. Please check back for future installments of this interview.

Post Author

Matt Parisi
Matt is Director of Product Marketing at Tealium.
