Episodes
Friday Aug 07, 2020
The quality of the data collected is often an afterthought until there is a problem, and by then it's too late.
On this week’s episode of 33 Tangents, Jim and Jason are joined by Eric Richter, a strategic consultant and data integrity expert who also likes to go by the title of “Chief Data Exorcist”.
Eric focuses on the continual improvement of data integrity processes and test strategies. He is the Founder and Chief Data Exorcist of Data Quake, and before that he was the Director of Data Quality Solutions for Keystone Solutions.
Items that we mentioned:
- ObservePoint
- DataTrue
- Charles Proxy
- Fiddler
- dataslayer
- Adobe Experience Cloud Debugger
- ObservePoint Tag Debugger
THANK YOU
We know your time is limited, so it means a lot to us that you would spend some of your time with us. If you have found this episode to be valuable, we would appreciate if you would share it using one of the social media buttons below.
And if we are getting you hooked, don’t forget to subscribe, like, and recommend on your favorite podcast platform.
The 33 Tangents video simulcast is now available on YouTube
Subscribe on Apple Podcasts: https://podcasts.apple.com/us/podcast/33-tangents/id1384329330
Listen on TuneIn: https://tunein.com/podcasts/Technology-Podcasts/33-Tangents-p1129251/
WHERE TO FIND US
Website: www.33sticks.com
Email: Podcast@33sticks.com
Twitter: https://twitter.com/33Sticks
Facebook: https://www.facebook.com/33sticks/
YouTube: https://www.youtube.com/channel/UC8KUpp_LygXotCrKgR9ZoBg
Friday Mar 15, 2019
With March Madness right around the corner, you’re going to see many people spending a great deal of time putting their tournament brackets together. Why does it feel like some of these people spend more time and effort on that than on their testing program? We’ve come across many who don’t take their testing program seriously and just slap stuff together. There’s no planning, there’s no deep analysis, and the results show.
What do you do to convince people that, to get the most out of a testing program, you can’t just fly by the seat of your pants and see what happens? Aren’t testing programs supposed to let you move rapidly so you can learn?
Friday Jan 18, 2019
Tag Management Systems have greatly improved the flexibility and responsiveness of implementations in recent years. There are many positives to them, but this episode is not about that. Governance is often seen as an extra step, or as not critical, so many companies simply jump in and start coding without any design or plan.
What are the problems that can arise if you don’t manage and maintain your TMS properly? What are the components of a proper TMS governance plan and maintenance strategy? In this week’s episode, Jenn, Jason, and Jim focus on the need to have proper governance in place in order to prevent issues and to get the most out of your TMS implementation.
WHAT YOU’LL HEAR…
🔊3:00 What are the issues that can arise if you don't have TMS governance in place?
🔊10:50 Jason talks about why he enjoys Zen and the Art of Motorcycle Maintenance, and maintenance in general, so much
🔊14:30 Adam Savage and the art of knolling
🔊16:30 Avoid the trap of just jumping in and starting the build
🔊19:20 Recommendations for starting a tag cleanup activity
🔊26:30 The importance of having strong analytics leadership
🔊33:50 Injecting analytics governance into your project workflow
Friday Dec 21, 2018
We’ve gotten this question, in one form or another, many times: I’m looking at system A and system B and they don’t match; how do we fix that?

To validate that a system is collecting data correctly, you need to compare it to another system. For example, you would compare your analytics implementation to your backend order management system to ensure that your conversion tracking is accurate. However, these two systems will never match exactly, and if you’re not careful, your validation can spiral out of control. If the wrong audience is involved, they won’t trust any of the data, so this process has to be managed very carefully.

What are some reasons systems will never match exactly? If they don’t match exactly, what is the point of doing this validation? How do you manage expectations, both yours and others’? How do you prevent the effort from spiraling out of control?

This week Jenn, Jason, and Jim discuss their experience with reconciliation efforts, how they can spawn more problems than they solve, and how to manage them properly.
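The kind of reconciliation discussed above can be sketched in a few lines: rather than chasing an exact match, teams typically agree on an acceptable variance and only investigate days that exceed it. The 2% tolerance and the sample order counts below are hypothetical, purely for illustration; they are not figures from the episode.

```python
# Hypothetical sketch: compare daily order counts from an analytics tool
# against a backend order-management system, flagging only the days whose
# discrepancy exceeds an agreed tolerance. All numbers are made up.

TOLERANCE = 0.02  # accept up to 2% variance (an assumed, not prescribed, target)

analytics_orders = {"2018-12-17": 981, "2018-12-18": 1042, "2018-12-19": 875}
backend_orders = {"2018-12-17": 1000, "2018-12-18": 1051, "2018-12-19": 940}

def reconcile(analytics, backend, tolerance=TOLERANCE):
    """Return {day: variance} for days where analytics deviates from the
    backend (treated as the source of truth) by more than the tolerance."""
    flagged = {}
    for day, truth in backend.items():
        measured = analytics.get(day, 0)
        variance = abs(truth - measured) / truth
        if variance > tolerance:
            flagged[day] = round(variance, 4)
    return flagged

print(reconcile(analytics_orders, backend_orders))  # only Dec 19 exceeds 2%
```

The point of the sketch is the framing: the two systems are expected to disagree, so the deliverable is a short list of out-of-tolerance days to investigate, not a demand that the numbers match.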
Friday Nov 16, 2018
This week Jim was in Boston and paid a visit to The Christian Science Monitor, one of 33 Sticks’ clients. As part of the visit, he and Jason were able to get time with Todd Schauman to record a podcast episode. Todd is the Director of Marketing and Analytics for The Christian Science Monitor and has a great deal of experience working in the publishing space. On a previous episode, Jim and Jenn talked about how there are many tried-and-true best practices when it comes to ecommerce analytics, but not for publishing. We ask Todd why he thinks that is. With the cost of implementing, maintaining, and running an analytics practice, how does Todd make the case to his management? What are some unique options available to non-profit publishing organizations?
Friday Oct 26, 2018
Adobe recently announced the end-of-life plans and sunset date for DTM. With migrations from DTM to Launch estimated to take up to nine months, organizations that are not already talking about migrating could be behind the 8-ball. The time between now and DTM’s sunset date will fly by. What should organizations do to get things started? What are the most important things to consider?
This week Jason, Jenn, and Jim are joined by special guest Corey Spencer. Corey is Director of Product Management at Adobe and was gracious enough to sit down with us to talk about the move from DTM to Launch and all of the benefits that brings to an organization.
Friday Oct 05, 2018
Once a testing and optimization program is up and running, what are the key components of making it successful? What needs to be done to ensure that it’s advancing and becoming more sophisticated? Last week, Jason, Hila, and Jim spoke about the important steps to take when starting a testing and optimization program. This week they continue the conversation, but focus on the common pitfalls that can cause a testing program to become stagnant or even fail.
Friday Sep 28, 2018
33 Tangents - Episode #22 - Starting a Testing and Optimization Program
This week, Jim and Jason are joined by Hila Dahan, 33 Sticks co-founder and testing and optimization extraordinaire. We’ve heard time and time again from clients and prospects alike that they want to get a testing program started, or that they’ve attempted one in the past and failed to get it off the ground. They’re so excited about the prospect of what a program can do that they focus on the sexy things they’ll do once it’s up and running, rather than on the foundational pieces that are critical to put in place first to make all of that possible. What are the key first steps in setting up a testing program? What are common missteps and misconceptions in the early phases?