Trustwave Blog

Think Pink

Written by Tanya Secker | Jul 10, 2024

Some people say, "I already conduct red team exercises, so why would I need something different that is nothing more than a watered-down red team?"

I'm here to change this line of thought.

The concept of white box assessments is not new in offensive security, and mixing colors is not new to red teaming or the industry either. The combination of red and blue gives you purple. Combine white with red, and yes, you get pink.

I have started to define a pink team as an exercise that ideally starts with some level of initial access granted to the client's environment, with or without insider knowledge: a combination of a red team exercise and a white box assessment.

With this in mind, we can look at a pink team as an assumed breach or insider threat.

Essentially, the team goes into the exercise with an "in" to the environment. Unlike a traditional full-blown red team event, where we have to find a way into the target environment, in a pink engagement the client grants a level of access as a condition of the exercise.


Why Be Tickled Pink?

The fact is, red team engagements can start to lose their effectiveness if they are not refined as the organization's defenses mature. We tend to find three types of organizations that opt for a pink exercise:

  1. Those that have run and successfully defended against red and purple team engagements.
  2. Those that want to jump-start an attack simulation process.
  3. Those that are looking for cost efficiencies.

Let's take a look at number 1.

A pink team is a natural next step in an organization's full testing simulation journey, basically a way for a client to get more out of the testing process with more refined scenarios.

With a traditional red team exercise, we can run into a dead end once we have performed multiple assessments and then worked closely with a client's blue team (via a purple team test) to detect attacks and stop them before any level of compromise is achieved. Generally, we and the client find that recon/OSINT is no longer turning up much value, and phishing campaigns, with or without spears, are getting canned before they hit inboxes or are being reported through the correct channels.

Such results basically show that their external security and processes are in good shape to deflect an inbound attack, but that does not mean they are safe from incidents that start from within or even via a trusted third party.

Reason number two covers clients who have yet to run fully fledged red team exercises but first want to obtain some quick insights into their preparedness from an internal perspective. A pink team allows us to quickly spin up some scenarios with a smaller set of objectives and, therefore, less time-consuming planning and research.

Finally, the third reason to choose a pink team is that its setup and process are faster and require fewer resources, so it tends to be less expensive while still delivering extremely valuable information.


Setting Up a Pink Team Scenario

The first step is to sit down with the client to decide on the starting point and objectives: basically, what they want us to cover and the level of initial access they will grant.
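As a purely illustrative aid, not a Trustwave template, the agreed starting point and objectives could be captured in something as simple as the following Python sketch; every field name and example value here is a hypothetical assumption.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class PinkTeamScope:
    """Hypothetical record of what the client grants and what the exercise must prove."""
    initial_access: str            # the "in" granted as a condition of the exercise
    insider_knowledge: List[str]   # context or documentation shared up front
    objectives: List[str]          # what the client wants the exercise to cover
    out_of_scope: List[str] = field(default_factory=list)


# Example: an assumed-breach scenario with limited insider knowledge (illustrative only).
scope = PinkTeamScope(
    initial_access="low-privilege domain account on a standard corporate laptop",
    insider_knowledge=["network diagram", "list of deployed endpoint security products"],
    objectives=["reach the finance file share", "exfiltrate a marked test file undetected"],
    out_of_scope=["production payment systems", "physical intrusion"],
)
print(scope)
```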

Customer maturity is normally the key trigger for affording an assessment a greater level of access and information so that it provides more value.

For example, a typical application assessment will be performed with a credential pair for each role, a full dataset within the application, including file upload examples, and a recursive backend file and directory listing.

Consultants are often even provided with demos to showcase the workflows and logic that underpin the expected usage. By contrast, an empty shell of an application, with very limited knowledge of how it works and no authenticated access provided, would not yield much.
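For the recursive backend file and directory listing mentioned above, a minimal sketch of how a client might generate that artifact is shown below; the deployment path is a hypothetical example, and a real engagement would use whatever tooling the client prefers.

```python
import os


def recursive_listing(root: str) -> list:
    """Walk a backend deployment directory and return every directory and file path."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        paths.append(dirpath + os.sep)
        paths.extend(os.path.join(dirpath, name) for name in filenames)
    return sorted(paths)


# Hypothetical web root; a client would point this at their own application deployment.
for path in recursive_listing("/var/www/app"):
    print(path)
```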

However, having all this information and access from the get-go means the tester can focus on finding a way to circumvent access controls, bypass authentication, or abuse the application's logic without wasting time inputting data or working out how it is all stitched together, greatly speeding up the process.

In any time-boxed test, it is a lot easier to find an authentication bypass if you already have access, or logic flaws if you walk in with a clear understanding of the expected behavior.

Companies improve their information disclosure policies, perimeter protections, actions, and responses through technology, people, offensive security assessments, and attack simulations such as red teams. With maturity, the client should then look to conduct a pink team scenario to cover those situations where a threat actor has already gained some sort of foothold and/or obtained critical information to leverage an attack.

Whilst this does not negate the need for the client to conduct a full-blown assessment, it is a good time to focus some assessments on insider threats via a think-pink approach.

Think of a disgruntled employee who knows the deployed security controls, has a high level of access to critical systems, and has in-depth knowledge of the security policies to boot. Or even just a resident threat actor or an employee with some limited access to internal systems.

With these scenarios in mind, we can provide more refined, cost-effective approaches with specific objectives to meet the organization's current needs.

We can cull most of the typical initial stages of a red team and concentrate on the later ones, starting by mimicking that insider on the internal network who may already be furnished with a way to exfiltrate data and go undetected, or who is better placed to find one.

Moreover, a pink team can provide a more realistic attack scenario because it aligns more closely with how criminals actually operate; they are not bound by some of the inherent limitations the industry operates within. After all, they are not confined by time, scope, or general ethics, which will always give them the upper hand in gaining a degree of access from zero.

For mature organizations, gone are the days of constant but sometimes needless use of the more black-box approach to attack simulations, with its roadblocks and its inability to always feed back the intelligence and improvements required to stay ahead of the game.