
The Tag Auditing Competitive Landscape: What You Need to Know

By Lucas Long | Tag Management

There’s no lack of website performance monitoring tools available these days. They run the gamut from simple to complex, and allow brands to do everything from measuring site speed to performing tag management.

And the market is growing significantly. Gartner reports that the global business intelligence and analytics market reached $18.3 billion in 2017 – a 7.3 percent increase from 2016. And with roughly six million developers worldwide working on big data and advanced analytics, you can bet that these tools will only become more robust and sophisticated in the future.

When it comes to tag auditing tools, you’ll find they vary widely in terms of their capabilities and applications. But when you boil it all down, there are two primary ways that these tools collect information and measure performance: using real or simulated data.

Each has its pros and cons. Here’s a look at what you need to know.

Simulated Data (Synthetic Scanning)

This first type of performance monitoring tool simulates a connection to a website and crawls all its pages. Also known as synthetic scanning, it involves the creation of behavioral scripts, which are a series of behaviors and actions that are expected to occur when a customer is using a site.
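A behavioral script for a synthetic scan can be sketched in a few lines. The crawler below is a minimal illustration only: the in-memory `site` dict stands in for real HTTP fetches, and the tag extraction uses a simple pattern match rather than a full HTML parser.

```python
import re
from collections import deque

# A minimal sketch of a synthetic scan: crawl a site and record which
# tags (external script sources) load on each page. The in-memory
# `site` dict stands in for real HTTP fetches in an actual crawler.
site = {
    "/": '<script src="//analytics.example/ga.js"></script><a href="/pricing">Pricing</a>',
    "/pricing": '<script src="//ads.example/pixel.js"></script>',
}

def crawl(site, start="/"):
    seen, queue, tags = set(), deque([start]), {}
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        html = site[page]
        # Record every external script (a "tag") found on this page.
        tags[page] = re.findall(r'<script src="([^"]+)"', html)
        # Follow internal links, mimicking a scripted user session.
        queue.extend(re.findall(r'<a href="(/[^"]*)"', html))
    return tags

print(crawl(site))
```

A production crawler would add real HTTP requests, JavaScript execution, and scheduling, but the core loop — visit a page, record the tags that load, follow links — is the same.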

The primary benefit is that it can connect to any URL remotely, eliminating the need to install anything on a site.

Another advantage is that it allows IT professionals to pinpoint performance issues (and correct them) before actual customers do.

And because this form of monitoring doesn’t require real website traffic, it can be done pre-launch. This makes it advantageous for companies that are looking to test their sites before deployment.

The main con is that the data doesn’t come from real live users, so you may encounter gaps between the data your simulations generate and what actually occurs in a real-world environment.

It can also be an arduous process because you have to configure scans for every type of scenario. This is often problematic because there can be hundreds – if not thousands – of different user types and characteristics, including:

  • Technical profile
  • Device
  • Browser
  • Cookies
  • Plugins

Also, crawlers typically can’t test protected pages or post-conversion “thank you” pages.
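The combinatorial explosion behind that list is easy to demonstrate. The trait values below are made up for illustration, but the arithmetic is the point: every combination of characteristics is a separate scenario you would have to configure and scan.

```python
from itertools import product

# A rough illustration of why scripted scans get unwieldy: every
# combination of user characteristics is a separate scenario.
# The trait values here are hypothetical examples.
devices  = ["desktop", "tablet", "phone"]
browsers = ["chrome", "firefox", "safari", "edge"]
cookies  = ["first visit", "returning"]
plugins  = ["ad blocker", "none"]

scenarios = list(product(devices, browsers, cookies, plugins))
print(len(scenarios))  # 3 * 4 * 2 * 2 = 48 scenarios from just four traits
```

Add a few more traits with a few more values each and the scenario count quickly climbs into the hundreds or thousands the article mentions.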

Real User Monitoring

This second type of tool runs on a live website and monitors the behavior of tags while real users are on the site. It’s unscripted, which means you’re generating real data in live situations.

As a result, you know exactly what real users are doing, which eliminates any question marks. Whenever you’re alerted to an issue, it’s one that your actual users are experiencing.

This simplifies data generation because you don’t have to create a simulation or deal with the often onerous task of configuring scans.

An added plus is that you’re also able to collect a larger volume of data than you typically could with a simulated scan. The more data you have, the more patterns emerge and the quicker you can resolve critical issues. In turn, you can create a more robust and comprehensive data set to answer pressing questions.

The main drawback is that you need to deploy a tag onto a website in order to collect data.

Another drawback is that you need to have actual web traffic in order for it to work. It’s not viable for pre-launch testing, so users may encounter problems before you have a chance to remedy them.

Real user monitoring may also be limited if your site has a low volume of traffic.

There are three primary use cases for these performance monitoring tools.

1. Tag Performance

  • How are tags affecting the performance of your website?
  • What is the volume of requests being sent by those tags?
  • How much bandwidth are those requests using?
  • How can you optimize your tagging architecture to balance website performance and data collection?

These are just a few of the questions that can be answered through tag performance monitoring. Tag Inspector is one of the more effective platforms for doing this, and it supports both real user monitoring and synthetic scanning – whichever best suits your company’s needs.
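The first two questions above — request volume and bandwidth per tag — boil down to a simple aggregation over observed tag requests. The request log below is hypothetical; in practice the data would come from real user monitoring or a scan, but the roll-up looks like this:

```python
from collections import defaultdict

# Hypothetical request log: (tag domain, payload size in bytes).
# Real data would come from live monitoring or a synthetic scan.
requests = [
    ("analytics.example", 412),
    ("ads.example", 1833),
    ("analytics.example", 398),
    ("ads.example", 1790),
    ("ads.example", 1912),
]

volume, bandwidth = defaultdict(int), defaultdict(int)
for domain, size in requests:
    volume[domain] += 1        # how many requests each tag sends
    bandwidth[domain] += size  # how many bytes those requests use

# Report the heaviest tags first - candidates for optimization.
for domain in sorted(bandwidth, key=bandwidth.get, reverse=True):
    print(f"{domain}: {volume[domain]} requests, {bandwidth[domain]} bytes")
```

Sorting by bandwidth surfaces the tags that cost the most page performance, which is where architecture optimization usually starts.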

Synthetic scanning involves:

  • A website crawler simulating user sessions
  • Scheduled or on-demand scans
  • Scheduled scans to provide ongoing tag behavior monitoring

Real user monitoring involves:

  • Monitoring live tag behavior
  • Real-time data validation, monitoring and alerting
  • Validating checkout funnels and conversion “thank you” pages

Although both product options are available, it’s important to note that monitoring tags in a live environment has a distinct advantage: you’re observing data exactly as real users generate it.

With real user monitoring, you can reduce the discrepancy between your analytics platform and your backend systems from 15 percent to 5 percent, while a simulation may only help you reduce it from 100 percent to 10 percent.

Tag Inspector is so effective because its focus is on tag behavior and how to optimize tags. Other tools may perform similar functions, but tag behavior isn’t necessarily their point of emphasis.

Tag Inspector is also able to transform large quantities of raw data into a format that makes performance issues easy to digest and understand. As a result, even the most complex data can be converted into practical, applicable information, which paves the way for more efficient tag optimization.

2. Data Collection Validation

This revolves around three key areas:

  • Analytics setup
  • Analytics architecture
  • Alerting you when anything breaks

A popular tool for data collection validation is ObservePoint. It works by auditing your website and looking for any inaccuracies in the data collection process. Some factors that could contribute to inaccurate data include leakage and corruption.

The end goal is to ensure that your data collection process is functioning as intended so that your company can generate higher quality data and ultimately fine-tune your decision making.
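At its core, this kind of validation means checking each collected hit against the parameters your analytics setup is supposed to send, and alerting when anything is missing or malformed. The field names below are hypothetical examples, not any vendor's actual schema:

```python
# A sketch of data collection validation: check each collected
# analytics hit against the parameters your setup should send,
# and flag anything that breaks. Field names are hypothetical.
REQUIRED = {"page", "session_id", "event"}

hits = [
    {"page": "/home", "session_id": "a1", "event": "pageview"},
    {"page": "/checkout", "event": "pageview"},  # session_id missing (leakage)
]

def validate(hits):
    problems = []
    for i, hit in enumerate(hits):
        missing = REQUIRED - hit.keys()
        if missing:
            problems.append((i, sorted(missing)))
    return problems

print(validate(hits))
```

A real validation tool layers scheduling, alert routing, and value-level checks (format, allowed ranges) on top, but the pass/fail comparison against an expected schema is the heart of it.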

While it does have an aesthetically pleasing UI, you are required to configure the simulations, which can be an onerous task. And because it relies primarily on simulated data, you’re also limited in the number of user characteristics you can analyze.

Another tool is New Relic, which places an emphasis on dynamic infrastructure monitoring. The logic is that developers are constantly releasing updates and new technologies are continually being adopted. In turn, this can create issues with data collection and result in kinks in your digital infrastructure.

New Relic’s main purpose is to provide real-time “health metrics” for your systems so that you’re always aware of what’s going on. It utilizes tag-driven alerting to notify you whenever something breaks so you can take steps to quickly resolve it. Therefore, your data collection remains intact, and you’re far less likely to make poor decisions based on inaccurate data.

3. Privacy Governance

The final way to utilize tag performance monitoring tools is for privacy governance. Data privacy is a huge concern in today’s world. In 2016 alone, U.S. companies and government agencies fell victim to 1,093 data breaches – a 40 percent increase from 2015 and an all-time high.

Laws like the European Union’s General Data Protection Regulation (GDPR) are creating strict guidelines on how organizations handle and monitor their customers’ personal data. Failing to comply can result in stiff penalties.

Some common questions that can be answered by utilizing tag performance monitoring tools for privacy governance include:

  • What tags are on your website?
  • How are they getting there?
  • On what pages do they load?
  • Does the behavior follow your tag governance policy?
  • Are there any unauthorized tags loading?
  • If so, where?
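Answering the last three questions amounts to comparing the tags observed on each page against an allowlist from your tag governance policy. The sketch below uses made-up domains and page paths to show the shape of that check:

```python
# A sketch of a tag governance check: compare the tags observed on
# each page against an allowlist from your governance policy, and
# surface unauthorized tags plus where they load. Domains are made up.
POLICY = {"analytics.example", "tagmanager.example"}

observed = {
    "/home":     ["analytics.example"],
    "/checkout": ["analytics.example", "retargeting.example"],
}

unauthorized = {
    page: [t for t in tags if t not in POLICY]
    for page, tags in observed.items()
    if any(t not in POLICY for t in tags)
}
print(unauthorized)  # {'/checkout': ['retargeting.example']}
```

In practice the `observed` map would be populated by a crawl or by live monitoring, and a rogue tag on a checkout page — exactly the case above — is the kind of finding privacy regulations make urgent.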

Ghostery is one of the primary tools used for privacy governance. Although it’s mainly geared toward consumers who are looking to enhance their browsing, protect their data and gain control over the ads they’re exposed to, it has an application for brands as well.

For instance, it helps companies ensure that they’re compliant with disclosure regulations such as the GDPR, as well as relevant industry standards.

Its primary flaw is that it mainly supports simulated users and offers limited functionality for live users. As a result, you’re not always able to get a clear picture of all the nuances.

Tag Inspector, on the other hand, collects live data via a tag deployed on a client’s site.

Conclusion

Business intelligence and analytics tools have evolved dramatically. They’re now capable of analyzing nearly every facet of operations and generating incredibly detailed data.

Tag performance monitoring tools are particularly sophisticated. They collect data and measure performance through either simulated scans or monitoring the behavior of tags from real users.

In turn, they provide comprehensive reporting on tag performance, data collection validation and privacy governance, all of which are critical for optimizing your website and overall digital infrastructure and remaining compliant with key data regulations.

Tag Inspector is capable of addressing all three areas and allows you to monitor live tag behavior, which is a tremendous asset. Other tools like ObservePoint, New Relic and Ghostery also have their strengths but are a bit more limited in their abilities.

Which aspects of tag management is your organization most concerned with? Share your thoughts in the comments below: