Most people outside of Marketing Operations don't realize how much effort goes into troubleshooting, especially when the issue is vague. Even specific issues can take hours of investigation because the technology ecosystem and the people processes around it can be quite complex. Our IT friends will understand our pain.
The audit process can be thought of as a series of troubleshooting exercises grouped together. It is simpler when you have specific issues to investigate: take a deep look at that specific area of the MarTech framework, but also take a cursory look at the adjacent areas. Other times "something just doesn't smell right" and it takes some detective work to track down what to take apart and examine. You might be tempted to ignore these vague observations, but if they come from someone you trust who has a long history with these processes, they may be seeing something that isn't easy to articulate or explain. It may be the tip of an iceberg, so take time to understand what they are thinking.
If there is a question about the system, then there is something that needs correcting. Whether you are fixing a misunderstanding or an actual bug, both matter for maintaining trust that things are working as expected. If it's a misunderstanding on the issue reporter's part, update the training material or documentation (wiki, case resolution suggestions, etc.), and in the short term consider office hours or lunch-and-learns to educate your users. An actual bug should always be prioritized for a fix, but some issues fall into a grey area where different teams have conflicting logic or requests. When these come up, pull in the teams (and often their common superior) to sort out the logic and process.
In the Marketing Optimized MarTech framework, we have already established a way to group marketing functionality and processes. We use that same framework for grouping auditing questions. Generally, they boil down to questions about:
flow rate
quality
trend (historical comparison)
Sometimes you may want to compare your stage conversion percentages to industry averages and "best in class" benchmarks to gauge how you are doing comparatively and whether something looks completely out of alignment. There could be something that needs attention, or it could be that your funnel stage definitions don't closely match those behind the external metrics, which is why your own historical data is the most accurate source of insight.
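To make the flow rate and trend checks above concrete, here is a minimal sketch in Python. The stage names, counts, historical rates, and the 20% tolerance are all hypothetical placeholders rather than values from any particular benchmark; swap in your own funnel definitions and reporting data.

```python
# Minimal sketch: compute stage-to-stage conversion (flow rate) and flag
# transitions that drift from your own historical baseline (trend).
# All stage names, counts, rates, and the tolerance below are hypothetical.

current_counts = {
    "Inquiry": 5000,
    "MQL": 1200,
    "SQL": 480,
    "Opportunity": 150,
    "Closed Won": 45,
}
historical_rates = {
    "Inquiry->MQL": 0.26,
    "MQL->SQL": 0.42,
    "SQL->Opportunity": 0.33,
    "Opportunity->Closed Won": 0.45,
}

def conversion_rates(counts):
    """Stage-to-stage conversion percentages from raw stage counts."""
    stages = list(counts)
    return {
        f"{a}->{b}": counts[b] / counts[a]
        for a, b in zip(stages, stages[1:])
        if counts[a]  # skip empty stages to avoid dividing by zero
    }

def flag_divergence(current, baseline, tolerance=0.20):
    """Return transitions whose current rate differs from the historical
    baseline by more than the relative tolerance."""
    flags = {}
    for transition, rate in current.items():
        base = baseline.get(transition)
        if base and abs(rate - base) / base > tolerance:
            flags[transition] = (rate, base)
    return flags

rates = conversion_rates(current_counts)
for transition, (now, then) in flag_divergence(rates, historical_rates).items():
    print(f"{transition}: {now:.1%} now vs {then:.1%} historically; worth a closer look")
```

In practice the counts would come from your MAP or CRM reporting, and the tolerance is a judgment call. The point is simply to turn "something doesn't smell right" into a specific transition worth investigating.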
Here are some suggested questions to look at based on the area of concern. If you don't have a specific area identified, I suggest working backward from stage 8 to stage 1: the closer the issue is to the deal close event, the more obvious the impact of a fix or change will be. This should get you started, and if you discover an issue, the next set of questions is why there is a failure (flow rate, quality, or trend) in that area. A rough sketch of this backward triage loop follows.
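If it helps to picture the backward walk, here is a hedged sketch, again in Python. The eight generic stage numbers and the three check functions are placeholders for whatever flow rate, quality, and trend reports you actually run against your MAP and CRM.

```python
# Sketch of the "work backward from stage 8 to stage 1" triage loop.
# The stage list and the check functions are placeholders; in practice each
# check would query your MAP/CRM reporting for that stage.

def check_flow_rate(stage):
    """Placeholder: is volume moving into this stage at the expected rate?"""
    return True

def check_quality(stage):
    """Placeholder: do records in this stage meet your data/fit quality bars?"""
    return True

def check_trend(stage):
    """Placeholder: does this stage compare well to its own history?"""
    return stage != 6  # pretend stage 6 shows a historical regression, for illustration

def audit(stages=range(8, 0, -1)):
    """Walk stages from deal close back to first touch and report the first
    failing check; that area becomes the focus of deeper troubleshooting."""
    for stage in stages:
        for name, check in (("flow rate", check_flow_rate),
                            ("quality", check_quality),
                            ("trend", check_trend)):
            if not check(stage):
                print(f"Stage {stage}: {name} check failed; dig in here first")
                return stage
    print("No obvious failures; revisit the vaguer reports")
    return None

audit()
```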
(Share issues and questions you find useful to look into in the comments section.)