Running Ansible remediation automatically


This document explains how to automatically run Ansible remediation scripts delivered by the Vulcan platform through an AWS S3 bucket, on a recurring schedule.

Combined with Vulcan playbooks, this provides zero-touch remediation for Ansible-enabled hosts.


Prerequisites

Choose a host on your network that will run the remediation scripts. It needs:

  • Network access to the remediation targets

  • Network access to the AWS S3 bucket

On the host that will run the script, verify the following:

  • Python version 3.6 or later is installed

  • The following Python packages are installed:

  • pip3 install boto3==1.9.126

  • pip3 install ansible==2.7.10

Then set up the script:

  • Create a new directory to place the script in

  • Copy the script at the end of this article into a file in that directory
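Before scheduling anything, it can help to confirm on the remediation host that the interpreter and packages above are in place. A minimal stdlib-only sketch (the helper name is ours, not part of the Vulcan script):

```python
import importlib
import sys

def check_prerequisites(packages=("boto3", "ansible")):
    """Return a dict mapping package name -> installed version,
    'unknown' if the package exposes no __version__, or None if missing."""
    if sys.version_info < (3, 6):
        raise RuntimeError("Python 3.6 or later is required")
    versions = {}
    for name in packages:
        try:
            module = importlib.import_module(name)
            versions[name] = getattr(module, "__version__", "unknown")
        except ImportError:
            versions[name] = None
    return versions

print(check_prerequisites())
```

Any package reported as None still needs to be installed with pip3 as shown above.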

Running the script

To run the script manually, use the following command:

python3 NEW_DIR/  --bucket-name=BUCKET_NAME --api-key=API_KEY --secret-key=SECRET_KEY


  • 'NEW_DIR' is the name of the new directory you created, in which you placed the script ''.

  • 'BUCKET_NAME' is the name of the S3 bucket that files are downloaded from and uploaded to, accessed using the following credentials:

  • 'API_KEY' is the AWS API key

  • 'SECRET_KEY' is the AWS secret key
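To see how the three flags map to values, here is the same argument parsing the script performs, as a self-contained sketch you can use to validate a command line before it goes into cron:

```python
from argparse import ArgumentParser

# Mirrors the remediation script's argument handling: three required flags,
# each stored under a predictable attribute name.
parser = ArgumentParser()
parser.add_argument("--bucket-name", required=True, dest="bucket_name")
parser.add_argument("--api-key", required=True, dest="api_key")
parser.add_argument("--secret-key", required=True, dest="secret_key")

args = parser.parse_args(["--bucket-name=BUCKET_NAME",
                          "--api-key=API_KEY",
                          "--secret-key=SECRET_KEY"])
print(args.bucket_name, args.api_key, args.secret_key)  # BUCKET_NAME API_KEY SECRET_KEY
```

If any of the three flags is missing, argparse exits with an error before the script touches S3.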

Scheduling with cron

To add the task as a cron job, run:

crontab -e

and insert the following line (in vi: press "i" to enter insert mode, paste the line, press "ESC", then type ":wq!" to save and exit):

30 2 * * * /usr/local/bin/python3 NEW_DIR/  --bucket-name=BUCKET_NAME --api-key=API_KEY --secret-key=SECRET_KEY

This example runs the task every day at 02:30.
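For reference, the first two cron fields in "30 2 * * *" are minute and hour; the remaining three (day of month, month, day of week) are wildcards. A small hypothetical helper (not part of the Vulcan script) that computes when such an entry next fires:

```python
from datetime import datetime, timedelta

def next_run(now, hour=2, minute=30):
    """Next time a '30 2 * * *' crontab entry fires after `now`."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's 02:30 already passed, so the entry fires tomorrow.
        candidate += timedelta(days=1)
    return candidate

print(next_run(datetime(2024, 1, 1, 3, 0)))  # 2024-01-02 02:30:00
```

Adjust the first two cron fields to change the daily run time.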


  • A log for each run is generated and uploaded to the S3 bucket.

  • On the first run, a folder named after the bucket is created locally; the .yml Ansible playbooks are downloaded into it (alongside the host configuration files and the log files).

  • To re-run a playbook, simply remove its files from the above-mentioned folder (or remove the whole folder). The script will download and re-run the Ansible playbook; this does not affect anything on the S3 bucket.

The script:

import datetime
import os
import sys
import time
from argparse import ArgumentParser
from io import BytesIO, StringIO

import boto3
from ansible.cli.playbook import PlaybookCLI

parser = ArgumentParser()
parser.add_argument("--bucket-name", type=str, required=True, dest="bucket_name",
                    help="Please provide an S3 Bucket name")
parser.add_argument("--api-key", type=str, required=True, dest="api_key",
                    help="Please provide an AWS S3 Api-key")
parser.add_argument("--secret-key", type=str, required=True, dest="secret_key",
                    help="Please provide an AWS S3 Secret-key")


def download_bucket(resource, bucket_name, dest_dir):
    # Create a map of the new files we need to run
    yml_to_host_path_map = {}
    # Download each file into the destination directory
    my_bucket = resource.Bucket(bucket_name)
    for s3_object in my_bucket.objects.all():
        _, filename = os.path.split(s3_object.key)
        file_path = os.path.join(dest_dir, filename)
        # Download only new files from the bucket
        if not os.path.exists(file_path):
            my_bucket.download_file(filename, file_path)
            if not (filename.endswith('.yml') or filename.endswith('.cfg')):
                continue
            vul_id = str(filename).split('_')[1].replace('.cfg', '').replace('.yml', '')
            if vul_id not in yml_to_host_path_map:
                yml_to_host_path_map[vul_id] = {}
            if filename.endswith('.yml'):
                yml_to_host_path_map[vul_id]['yml'] = file_path
            else:
                yml_to_host_path_map[vul_id]['host'] = file_path
    return yml_to_host_path_map, my_bucket


if __name__ == '__main__':
    # Parse args
    args = parser.parse_args()
    # Start log file
    ts = time.time()
    st = datetime.datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')
    log_file_name = f"Vulcan Log file {st}"
    log_text = f"{log_file_name}\n"
    old_stdout = sys.stdout
    sys.stdout = my_stdout = StringIO()
    # Create AWS S3 credentials for downloading the bucket content
    _bucket_name, api_key, secret_key = args.bucket_name, args.api_key, args.secret_key
    resource_res = boto3.resource('s3', aws_access_key_id=api_key, aws_secret_access_key=secret_key)
    # Create a local folder to store the playbooks
    current_directory = os.getcwd()
    final_directory = os.path.join(current_directory, f'vulcan_ansible_playbook_{_bucket_name}')
    if not os.path.exists(final_directory):
        os.makedirs(final_directory)
        log_text += f"Created dir: {final_directory}\n"
    else:
        log_text += f"Dir already existed: {final_directory}\n"
    # Run all yml files
    success_plays, bucket_obj = 0, None
    try:
        log_text += f"Starting to download files from bucket {_bucket_name}\n"
        new_yml_host_map, bucket_obj = download_bucket(resource=resource_res, bucket_name=_bucket_name,
                                                       dest_dir=final_directory)
        log_text += f"Found {len(new_yml_host_map)} new playbooks to run\n"
        for vulnerability_id, yml_host_files_dict in new_yml_host_map.items():
            try:
                log_text += f"Running playbook vul_id {vulnerability_id} on yml {yml_host_files_dict['yml']}," \
                            f" on hosts {yml_host_files_dict['host']}\n"
                cli_args = ['playbook', yml_host_files_dict['yml']] + ['-i', yml_host_files_dict['host']]
                cli = PlaybookCLI(cli_args)
                cli.parse()
                cli.run()
                success_plays += 1
            except Exception as error:
                log_text += f"Got an error: {'{}: {}'.format(type(error), error)}\n\n"
    except Exception as error:
        log_text += f"Got an error: {'{}: {}'.format(type(error), error)}\n\n"
    log_text += f"Done with success on {success_plays} playbooks\n\n\n"
    # Restore stdout and upload the log file to the bucket
    sys.stdout = old_stdout
    if bucket_obj:
        # Append the captured Python stdout to the log
        log_text += f"************Python Log**************\n\n{my_stdout.getvalue()}\n"
        binary_text_to_insert = str.encode(log_text)
        bucket_obj.upload_fileobj(BytesIO(binary_text_to_insert), f"{log_file_name}.log")
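As noted above, re-running a playbook is driven by the local cache folder: the script skips any file that already exists there. A small hypothetical helper (not part of the Vulcan script) that forces a full re-run on the next scheduled invocation, assuming the folder-naming scheme of the script above:

```python
import os
import shutil

def reset_playbook_cache(bucket_name, base_dir=None):
    """Delete the local cache folder so the next scheduled run re-downloads
    and re-runs every playbook from the bucket. Assumes the
    'vulcan_ansible_playbook_<bucket>' naming used by the script above."""
    base_dir = base_dir or os.getcwd()
    cache_dir = os.path.join(base_dir, f'vulcan_ansible_playbook_{bucket_name}')
    if os.path.exists(cache_dir):
        shutil.rmtree(cache_dir)
    return cache_dir
```

Removing the folder only affects the local host; the playbooks and logs already uploaded to the S3 bucket are untouched.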
