Evading CrowdStrike Falcon Using Entropy

By Red Siege | April 11, 2023

from Mike Saunders, Principal Consultant


You’re encrypting your shellcode so you don’t get caught, and that might get you caught.


I’ve encountered CrowdStrike Falcon Protect on engagements many times. CrowdStrike uses many different methods to inspect processes and block payloads. Recently, I ran into Falcon measuring a binary’s entropy to decide whether or not to allow it to run. In this blog, we’ll discuss how that detection works and how to get around it.


AV/EDR products may use many different methods to analyze a binary and decide whether to trust it. These include dynamic analysis (observing the target while it executes) and static analysis (examining properties of the target at rest). Static analysis may include inspecting the import table to determine the functions and DLLs used by the target, scanning for strings and byte sequences embedded in the target, computing the MD5 or SHA-256 hash of the file, and measuring the target’s entropy. These static checks are used to determine whether a target is trustworthy enough to execute.

This blog concerns Falcon’s static analysis phase. The payload I created also required obfuscation and evasion techniques in order to run properly. I will not be discussing those techniques here.


Entropy is a measure of the amount of randomness in a set of data. When talking about entropy and information security, we’re usually talking about Shannon entropy. Shannon entropy can be described as a measurement of the amount of randomness in a given subject (executable, image, text, etc.). While not a perfect method, a quick way to measure the amount of entropy in something is to see how compressible it is (See also: Kolmogorov Complexity). A file that contains primarily random data is said to have higher entropy and is consequently less compressible. A file that contains non-random data, such as written language, or a digital photograph, is said to have lower entropy and can be compressed more easily.

Consider these two examples. I took a copy of the U.S. Constitution and compressed it with gzip. As you can see, gzip reduced the file’s size by 67.3%.

% gzip -v -c constitution.txt > /dev/null
constitution.txt:     67.3%

I then ran the same test on a block of random bytes. The random stream compressed by only 0.2%. From these two examples we can see that non-random data, such as written language, is highly compressible, meaning its entropy is low, while random data is barely compressible at all.

% dd if=/dev/urandom bs=1 count=26374| gzip -v -c > /dev/null
26374+0 records in
26374+0 records out
26374 bytes transferred in 0.043034 secs (612863 bytes/sec)

One common source of entropy in a payload is the use of encryption to obfuscate the embedded shellcode payload. Consider a typical shellcode launcher used to launch a Cobalt Strike beacon or a Meterpreter shell. A launcher that doesn’t use encryption to hide the shellcode will have less entropy. Without encryption, however, the shellcode is also easily recognizable to most AV and EDRs. When the shellcode is encrypted, the amount of entropy in the launcher increases. An AV or EDR that considers entropy in its decision process may flag the payload as potentially malicious and deny execution because the entropy is higher, signaling that the target may be trying to avoid detection.

Entropy calculation with ent

You can also check the entropy of a file using the ent command. Here is the (trimmed) output of ent run against the text of the U.S. Constitution.

% ent constitution.txt
Entropy = 4.465227 bits per byte.

Optimum compression would reduce the size
of this 45119 byte file by 44 percent.

Notice the Entropy line showing ~4.5 bits per byte. Perfect randomness, the highest possible entropy, would be 8 bits per byte, so a value of 4.5 is quite low. Note that ent’s “optimum compression” estimate (44%) is smaller than what gzip actually achieved (67.3%): ent only models byte frequencies, while gzip also exploits repeated sequences, but both agree the file is highly compressible. Compare that to the entropy calculation of /dev/urandom.

% dd if=/dev/urandom bs=1 count=26374 | ent
26374+0 records in
26374+0 records out
26374 bytes transferred in 0.075122 secs (351082 bytes/sec)
Entropy = 7.992614 bits per byte.

Optimum compression would reduce the size
of this 26374 byte file by 0 percent.

As we can see, the entropy is much higher (~7.993) and significantly closer to the perfect 8 out of 8. The optimal compression is zero, once again very similar to what we saw with gzip.

The ent command is not installed by default. I’ll use gzip for the remainder of this post since the compression ratio output from gzip is a good proxy for entropy calculation.

Evading detection by decreasing entropy

If an EDR can decide that our payload is potentially malicious by measuring the amount of entropy in the file, how can we decrease the entropy? By adding non-random data! This example shows how I accomplished this in a C shellcode launcher to bypass CrowdStrike’s entropy analysis engine.

In my loader, the shellcode was stored in a variable. I created a second variable containing strings from the English language. The following code demonstrates how the words were incorporated into the payload. Note: this is a contrived example. The number of words required to effectively lower the entropy will be far greater and may vary between AV/EDR vendors.

const char *words[10] = { "aaron", "abandoned", "abdomen", "aberdeen", "abilities", "ability", "aboriginal", "about", "above", "abraham" };

I compiled the strings into my payload, but this step isn’t strictly necessary. I could have used cat dictionary.txt >> payload.exe to append the contents of a dictionary file to the payload. One person told me they’ve managed to lower entropy and avoid detection by cat-ing a picture of a cat onto the end of their payloads.

While AV and EDR may look at the entropy in different sections of a PE, such as the resource section, measuring the overall entropy in a program can give us a good understanding of the impact. My initial loader did not contain any extra words to decrease entropy. You can see here that the file was compressed by only 37.8%.

[Screenshot: gzip output showing the baseline executable compresses by only 37.8%]

After creating a new payload and including ~7,500 unique English words, I could compress the file by 98.9%. The file could be compressed more effectively, meaning the entropy in the file was lower.

[Screenshot: gzip output showing the executable with embedded dictionary words compresses by 98.9%]

As a final example, here is the same program containing a simple “pop calc” payload built from encrypted Metasploit shellcode. In this case, the payload ran and the entropy analysis did not deem it unsafe, because the calc shellcode is quite small compared to a stageless beacon payload.

[Screenshot: gzip output for the executable with Metasploit shellcode and appended dictionary words]


Decreasing entropy in a payload can help bypass AV/EDR detections. This strategy applies to more than just compiled payloads. AV & EDR often measure entropy in function and variable names in scripts and macros. Using random variable names such as eafuiaegdt or wukrncmpzd may increase the likelihood of being detected compared to using names such as SparklingSpritzer or BoomingElephant. Using non-random variable and function names has helped me get MSBuild payloads past many different AV and EDR products.

I also want to add that this isn’t a dunk on CrowdStrike. I both love to see clients using it, because it is effective, and dread running into it, because it is effective! The techniques described in this blog likely work against many AV/EDR products and just happened to work against CrowdStrike. Falcon has continued to challenge me and has forced me to keep learning in order to stay effective.

Stay tuned for an upcoming blog on a different method to decrease entropy and evade detection!


Many thanks to @seahop, @ironedEl, and @AsaurusRex in the BloodHoundGang Slack for helping me troubleshoot this issue and pointing me in the right direction.


About Principal Security Consultant Mike Saunders

Mike Saunders has over 25 years of experience in IT and security and has worked in the ISP, financial, insurance, and agribusiness industries. He has held a variety of roles in his career, including system and network administration, development, and security architect. Mike has been performing penetration tests for nearly a decade. He is an experienced speaker who has presented at conferences such as DerbyCon, Circle City Con, regional BSides (including Minneapolis, Kansas City, and Winnipeg), the SANS Enterprise Summit, the NDSU Cyber Security Conference, and SANS and Red Siege webcasts. He has participated multiple times as a member of the NCCDC Red Team.

