Strategically using MarTech audits // Brandi Starr & Mike Geller — Tegrita

We continue our conversation about using MarTech tools and strategies to put more money in your business's pockets. Joining us again today are Brandi Starr, COO, and Mike Geller, President and CTO, of Tegrita, a full-service MarTech consulting firm that enables digital marketing strategy with technology. In Part 2 of our conversation, we discuss strategically using MarTech audits.

Show Notes

Quotes

  •  “We can talk about how there’s a blending of marketing, sales, and customer success, but in reality, if you can’t get the flow of data right between all of those teams, it doesn’t really matter if we’re trying to be unified. That’s where your understanding of the MarTech stack becomes really important. You can’t fix it if you don’t understand what you already have.” - Ben

  •  “There are three steps. Number one is cataloging the technologies, next is defining the use cases, and finally gathering team sentiment and scoring each of the technologies.” - Brandi

  •  “Cataloging seems self-explanatory, but you would be surprised at how challenging this process can be. The first part in cataloging the technology is making a list of everything that you own. If there’s a contract for it, you put it on the list. In cataloging, you need to identify who owns the technology: which team or, ideally, the actual person accountable.” - Brandi

  •  “You want to outline all of the keywords in terms of types and functionality. For example, a marketing automation platform can send emails, handle landing pages, and do forms, so each of those would be keywords in terms of the functionality, because later we will want to understand the overlap. It’s really about pulling in any details about what the technology is and what it can do.” - Brandi

  •  “If there are tools used by individuals because they liked the tools, but there is another tool that is sanctioned by the organization for the same purpose, I think of both as serving the same specific use cases. When we do an audit, we try to catch both the official and unofficial tools that people are using. That may be a little harder to uncover because it is less known, and you have to talk to multiple people in order to get a full picture of the technology.” - Mike

  •  “We can’t really catalog everything, so if there’s a line to be drawn, it is at the customer. Does this tool impact the customer experience, either directly or indirectly? If it doesn’t, we don’t really care about it from a MarTech standpoint. Does it touch the customer or not? That is the litmus test that we use.” - Mike

  •  “In looking at the use cases, we want to focus on what is the job to be done by each technology. What purpose is it solving within the process? We want to make sure that as we are evaluating technologies, we are evaluating them against the job to be done.” - Brandi

  •  “Focusing on the objective and deriving use cases from that would be the way to go. It sounds obvious, but when doing a technology assessment we tend to gravitate toward features, and that leads to this false thinking of trying to maximize features in order to align them to the use case, as if the more of the platform I use, the better the outcome. That is simply not true.” - Mike

  •  “In doing an audit, I always recommend actually having conversations with those stakeholders who are involved in the technology. Included in that would be the systems administrators of each of the technologies as well as power users: the people who live and breathe the technologies and have good insight into how effective it is for them.” - Brandi

  •  “In some cases, you can leverage a survey to gather team sentiment and get different data points, but even if you are doing some type of survey, I do recommend actually having conversations and having everyone rank their sentiment in terms of how they feel about that technology. There are some things that do the job really well, but everyone hates the platform, so that’s not effective.” - Brandi

  •  “I use the alphanumeric scoring methodology. Using the actual catalog and defined use cases, that is where I assign an A, B, C, D. You can get really granular and build out a true point system, especially if you are a larger organization with a larger team; smaller companies with fewer technologies can do it more anecdotally.” - Brandi

Up Next: