Why Websites Today are Like Boxes of Chocolates – You Never Know What You’re Gonna Get

By Adam Maruyama

This summer, the Polyfill[.]io attack put the risks of a web code supply chain compromise into stark relief: a corrupted JavaScript library redirected users to adult and gambling websites from legitimate domains, including those belonging to Warner Brothers, Hulu, Mercedes-Benz, and others. And while the consequences of this attack – the display of objectionable web content – were relatively mild, it is easy to envision a more sophisticated attacker using a similar supply chain compromise to direct select users to far more dangerous content, such as a zero-day browser exploit.

But while rigor and recommendations around software bills of materials (SBOMs) have been developed in the aftermath of the 2020 SolarWinds software compromise and the 2021 Log4Shell vulnerability, most enterprises remain blind to the composition of the websites that their employees and contractors visit on company systems. Proxies and secure web gateways (SWGs) may tell administrators whether individual users accessed compromised resources, but they provide little to no holistic insight into which websites users were originally trying to access when those compromised resources were served to them.

As a result, conducting a retrospective analysis of web code supply chain attacks is more complex than it needs to be, and – critically – the kind of proactive, holistic analysis associated with SBOM risk evaluation is nearly impossible. At the same time, web browsers and the web apps they run are gaining ever more functionality and require ever more hooks into system-level processes. The combination of these two factors – an Internet full of sites of uncertain composition and a web browser that grants them access to system-level processes – calls into question web browsers' and web apps' compatibility with the "applications and workloads" pillar of zero trust.

Taking a page from the SBOMs required in the aftermath of SolarWinds and Log4Shell, Garrison’s Trust Qualified Browsing (TQB) platform includes the capability to dynamically generate web bills of materials (WBOMs) for sites accessed by an organization’s users. By leveraging intelligence taken from the browsing sessions of individuals within an organization, WBOMs can provide specifics of what resources are required to display a given website, including variables like which CDNs are called based on a user’s location or IP address.
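To make the idea concrete, here is a minimal sketch of deriving a WBOM from observed browsing traffic. This is not Garrison's implementation; the data source and all names are hypothetical, assuming each observation is a (page URL, resource URL) pair as might be logged by a proxy or browser telemetry:

```python
from collections import defaultdict
from urllib.parse import urlparse

def build_wbom(observations):
    """Group observed resource loads by the page that triggered them.

    observations: iterable of (page_url, resource_url) pairs
    (hypothetical telemetry feed). Returns a mapping of
    {page_url: sorted list of resource domains}.
    """
    wbom = defaultdict(set)
    for page_url, resource_url in observations:
        # Record only the resource's domain; a richer WBOM could also
        # keep full URLs, content hashes, and the user's geolocation.
        wbom[page_url].add(urlparse(resource_url).netloc)
    return {page: sorted(domains) for page, domains in wbom.items()}

observations = [
    ("https://example.com/", "https://cdn.polyfill.io/v3/polyfill.min.js"),
    ("https://example.com/", "https://fonts.example-cdn.net/font.woff2"),
]
print(build_wbom(observations))
# {'https://example.com/': ['cdn.polyfill.io', 'fonts.example-cdn.net']}
```

Because the observations come from real users' sessions, the resulting WBOM naturally captures location- or IP-dependent variations, such as which CDN edge a given user was actually served.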

In the event of a widespread compromise like the Polyfill[.]io example, these organizationally specific insights allow cybersecurity teams not only to identify that compromised content was accessed, but also to trace that content back to the website that called for it. This level of granular detail is especially important if the compromised content was hosted on a CDN or other dependent service rather than the domain associated with the website. These insights could allow cybersecurity teams to better assess whether a compromise was the result of organization-specific targeting or simply part of a broader attack campaign.
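In a retrospective like the Polyfill[.]io case, the useful query runs in the other direction: given a known-compromised resource domain, which first-party sites pulled it in? A hedged sketch, assuming a WBOM mapping of pages to resource domains like the hypothetical one above:

```python
def pages_calling(wbom, compromised_domain):
    """Return the pages whose WBOM includes the compromised resource domain."""
    return sorted(page for page, domains in wbom.items()
                  if compromised_domain in domains)

# Illustrative WBOM; all domains are placeholders.
wbom = {
    "https://news.example.com/": ["cdn.polyfill.io", "static.example.org"],
    "https://shop.example.com/": ["static.example.org"],
}
print(pages_calling(wbom, "cdn.polyfill.io"))
# ['https://news.example.com/']
```

This inverted lookup is what lets a team say not just "a user fetched the compromised script" but "they fetched it because they visited this specific site", even when the script's domain never appears in the site's own URL.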

Far from simply providing more accurate and organization-specific incident response data, however, WBOMs allow proxy teams to take a more granular view of managing risk in their systems. For example, the dependencies illuminated by WBOMs could enable proactive identification of malicious content associated with a “watering hole” attack, where a legitimate site is compromised by adversaries looking for a trusted way into a target’s systems. Using a WBOM to identify a callout to an anomalous domain or a resource hosted in a hostile geography could provide a valuable indicator of such an attack.
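The watering-hole scenario above can be framed as a diff against a baseline: any resource domain that appears in today's WBOM for a page but not in its historical baseline is worth a closer look. A minimal sketch, with the baseline and current snapshots purely illustrative:

```python
def flag_anomalies(baseline, current):
    """Report resource domains newly observed per page vs. a baseline WBOM.

    baseline, current: mappings of {page_url: list of resource domains}.
    Returns {page_url: sorted list of domains absent from the baseline}.
    """
    alerts = {}
    for page, domains in current.items():
        # Pages absent from the baseline are treated as all-new; a real
        # system might instead queue them for baseline creation.
        new = sorted(set(domains) - set(baseline.get(page, ())))
        if new:
            alerts[page] = new
    return alerts

baseline = {"https://example.com/": ["cdn.trusted.net"]}
current = {"https://example.com/": ["cdn.trusted.net", "evil-cdn.example"]}
print(flag_anomalies(baseline, current))
# {'https://example.com/': ['evil-cdn.example']}
```

In practice the flagged domains would then be enriched with threat intelligence, such as registration age or hosting geography, to decide which anomalies merit blocking versus monitoring.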

If you’re interested in generating your own WBOMs or learning how Garrison TQB can provide a centralized solution for hardware-enforced secure web access, contact us today!