Every implementation we deliver is fully documented, down to every change applied to each requirement, using our own platform, TagPipes, which seamlessly handles most of the hard work.
Most of the new work we get from a new client starts as an audit, which almost always leads to a major overhaul of the existing implementation. It never fails: these setups are usually what I call fire-and-forget. Absolutely no documentation, no planning, and no careful attention to the ins and outs of the client's business. It's a functional implementation only in the narrowest sense: it works, data is flowing in, everything else got scrapped, bye!
There are many aspects to how we resolve these issues, but I want to focus this article on the stages each requirement goes through in our process, which ensures complete coverage across all involved parties.
Some of these stages may overlap or be skipped based on your needs.
NEW
All requirements start at New. At this stage, just the basics are added: a title, some description, and notes. We spend most of our time on this screen when we are with the client, gathering high-level details about requirements and sometimes reviewing design comps.
ANALYSIS
At this stage, our analysts review the requirement and determine the best course of action to properly track what the business needs. This includes adding variables, scenarios, and more. It is one of the most crucial steps in our process, if not the most crucial, because everything that follows can be derived, automatically or manually, from the information we collect here.
This is also when we enter technical specs as needed, especially if we are using a data layer, which we always recommend unless the client prefers otherwise.
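To give a sense of what the documented spec can look like, here is a minimal sketch of a data layer contract in TypeScript. The object shape, property names, and example values are assumptions for illustration only; the real contract is whatever the analysts capture for the client's business.

```typescript
// Illustrative data layer contract for a single requirement.
// The names and fields below are hypothetical placeholders, not the
// client's actual spec, which the analysts document per requirement.
interface PageData {
  pageName: string;     // e.g. "product:detail"
  siteSection: string;  // top-level section used for reporting
  language: string;     // e.g. "en-US"
}

interface TransactionData {
  orderId: string;
  revenue: number;      // order total in the site currency
  currency: string;     // e.g. "USD"
}

// The object the development team will be asked to populate on each page.
interface DataLayer {
  page: PageData;
  transaction?: TransactionData; // only present on order confirmation pages
}
```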
DEVELOPMENT
If we are doing a data layer (recommended), the requirement is then passed on to the development team, which for us is usually the client's own team, since they are best positioned to complete this step quickly; it gets complicated if we have to get in and examine their codebase ourselves.
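As a rough illustration of the hand-off, the snippet below shows a page populating the hypothetical data layer contract from the sketch above before the tag manager loads. Everything here is assumed for illustration; the client's developers would follow their own conventions.

```typescript
// Hypothetical example of the development team's side of the work:
// populating a global data layer object before the tag manager loads.
export {}; // make this file a module so the global declaration is valid

declare global {
  interface Window {
    digitalData?: {
      page: { pageName: string; siteSection: string; language: string };
      transaction?: { orderId: string; revenue: number; currency: string };
    };
  }
}

window.digitalData = {
  page: {
    pageName: "product:detail",
    siteSection: "products",
    language: "en-US",
  },
  // transaction is only populated on order confirmation pages
};
```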
This is usually the only stage where our hands are tied; once it passes to the next stage, we can work independently to complete the project without any delays or dependencies.
DEVELOPMENT QA
By this stage, the development team has completed their tasks, and the requirement is passed back to us for our QA team to test and validate their work.
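One way to picture this QA pass is a small check of the populated data layer against the documented spec. This is only a sketch, written against the hypothetical digitalData object above; it is not our actual QA tooling.

```typescript
// Minimal sketch of a development QA check: confirm the data layer
// exists and carries the required page fields before we sign off.
function validateDataLayer(dl: unknown): string[] {
  const errors: string[] = [];
  if (typeof dl !== "object" || dl === null) {
    return ["data layer object is missing"];
  }
  const page = (dl as { page?: Record<string, unknown> }).page;
  if (!page) {
    return ["page object is missing"];
  }
  for (const field of ["pageName", "siteSection", "language"]) {
    const value = page[field];
    if (typeof value !== "string" || value === "") {
      errors.push(`page.${field} is missing or empty`);
    }
  }
  return errors;
}

// Usage: run in the browser console or a test harness on each template.
const problems = validateDataLayer((window as { digitalData?: unknown }).digitalData);
console.log(problems.length === 0 ? "Data layer looks good" : problems);
```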
IMPLEMENTATION
This is when we do the actual implementation: making server calls to the analytics platform of choice, adding marketing/media pixels, and more.
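To make "server calls" concrete, here is a sketch of a page view hit sent to a placeholder collection endpoint. The URL, parameter names, and payload are hypothetical; the real call depends entirely on the analytics platform the client uses.

```typescript
// Hypothetical page view hit to a generic analytics collection endpoint.
// The endpoint URL and parameter names are placeholders, not a real API.
async function sendPageView(pageName: string, siteSection: string): Promise<void> {
  const endpoint = "https://collect.example.com/hit"; // placeholder endpoint
  const params = new URLSearchParams({
    t: "pageview",
    page: pageName,
    section: siteSection,
    ts: Date.now().toString(),
  });
  // Fire-and-forget style request; real tags often use navigator.sendBeacon
  // or an image request instead of fetch.
  await fetch(`${endpoint}?${params.toString()}`, { method: "POST", keepalive: true });
}

// Example call, e.g. from inside a tag manager rule:
void sendPageView("product:detail", "products");
```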
IMPLEMENTATION QA
Our QA team then performs another round of testing, with a heavy focus on the outbound data flowing into the analytics platform of choice.
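Part of this check can be automated by capturing the outbound requests and asserting on their query parameters. The sketch below assumes the placeholder endpoint from the previous example; in practice the URLs come from the browser's network log or a debugging proxy.

```typescript
// Sketch of an implementation QA assertion: given a captured request URL,
// verify the hit carries the parameters we expect to see in the platform.
// The host and parameter names match the placeholder example above.
function checkHit(requestUrl: string): string[] {
  const errors: string[] = [];
  const url = new URL(requestUrl);
  if (url.hostname !== "collect.example.com") {
    errors.push(`unexpected host: ${url.hostname}`);
  }
  for (const param of ["t", "page", "section"]) {
    if (!url.searchParams.get(param)) {
      errors.push(`missing parameter: ${param}`);
    }
  }
  return errors;
}

// Usage with a URL copied from the network log:
console.log(
  checkHit("https://collect.example.com/hit?t=pageview&page=product%3Adetail&section=products")
);
```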
USER ACCEPTANCE TESTING (UAT)
The client starts seeing results. At this stage they can log into the platform and start playing with the data.
PRODUCTION
We perform the production deployment once it is approved by the client. This is usually done through a tag manager such as Adobe DTM/Launch, GTM, Tealium, or Ensighten.
PRODUCTION QA
Once again, our QA team performs a regression pass on the project, this time in the production environment as seen by actual customers.
COMPLETED
Everything is done and passes all tests. At this point our analysts get excited: they can now log in and provide strategic business intelligence that is truly based on the data we've collected.