AWS Content Sync

Context

We needed a way to distribute source content (packages / patches / images / drivers) globally, across 70 locations, in a timely, efficient manner. WAN bandwidth was limited, especially in the AP region.

Introduction

Some background before we dive in: our WAN bandwidth at most sites was limited and largely consumed by Oracle traffic. Any network congestion had the potential to disrupt production elements of our workforce. The advantage here was the ability to use each site's local ISP connection, which offered significantly better upload and download speeds than anything our WAN infrastructure could provide.

Requirements

If you have been following the post prior to this one (Site Builder), you will be familiar with the server preparations that were made. That work is essentially a forerunner / prerequisite to this process. The end goal was to tie the two together, but time was not my friend.

Summary

 
AWS Sync.png
 

Process

  • Security

    • Accessing 70 different servers at 70 different sites across five domains required a secure approach. We used AES encryption through PowerShell to build this out, which allowed us to use one service account per domain across multiple servers (a sketch follows this list).

    • Key files were secured via domain ACL.

    • The service account referenced in the Site Builder post was also limited to the source directory on the site distribution server.

    • AWS S3 credentials were referenced at runtime in environment variables.

    • Proxy requirements (where applicable) were likewise handled via environment variables (both are sketched after the screenshots below).
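For reference, here is a minimal sketch of the AES key-file pattern described above, targeting Windows PowerShell 5.1. The paths and account name are hypothetical placeholders; the key file itself is what we locked down via the domain ACL.

```powershell
# --- One-time setup (run by an admin) ---
# Generate a 256-bit AES key and use it to encrypt the service account
# password. The key file is then secured via the domain ACL.
$Key = New-Object byte[] 32
[Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($Key)
[IO.File]::WriteAllBytes('\\domain\secure$\svc-sync.key', $Key)

Read-Host -AsSecureString -Prompt 'Service account password' |
    ConvertFrom-SecureString -Key $Key |
    Set-Content -Path '\\domain\secure$\svc-sync.cred'

# --- Runtime (each sync job) ---
# Rebuild the credential from the key file + encrypted password file.
$Key    = [IO.File]::ReadAllBytes('\\domain\secure$\svc-sync.key')
$Secure = Get-Content -Path '\\domain\secure$\svc-sync.cred' |
          ConvertTo-SecureString -Key $Key
$Cred   = New-Object System.Management.Automation.PSCredential ('DOMAIN\svc-sync', $Secure)
```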

ServiceAccountSecurity.jpg
2020-06-08_12-03-49.jpg
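And a sketch of the runtime environment variables. The values are hypothetical placeholders, but these variable names are the ones the AWS CLI reads automatically.

```powershell
# S3 credentials, scoped to the current process; the AWS CLI picks
# these up at runtime without any config files on disk.
$env:AWS_ACCESS_KEY_ID     = '<access-key-id>'
$env:AWS_SECRET_ACCESS_KEY = '<secret-access-key>'
$env:AWS_DEFAULT_REGION    = 'us-east-1'

# Proxy settings, only where the site requires them.
$env:HTTPS_PROXY = 'http://proxy.example.com:8080'
```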
  • Logging

    • Sync logging accounted for every file transferred.

    • Logging covered all scenarios and ran with every execution.

    • Logging shares were set up based on user access.

    • Logs were purged after 30 days (the purge and alert logic is sketched after this list).

    • Any errors encountered triggered an email alert to the team.
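A minimal sketch of the purge and alert behavior, assuming a hypothetical log share, log layout, and SMTP relay:

```powershell
# Purge logs older than 30 days from the logging share.
$LogShare = '\\core01\SyncLogs$'
Get-ChildItem -Path $LogShare -Filter '*.log' -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Force

# Email the team if the most recent run logged any errors.
$Errors = Select-String -Path (Join-Path $LogShare 'latest.log') -Pattern 'ERROR'
if ($Errors) {
    Send-MailMessage -To 'team@example.com' -From 'awssync@example.com' `
        -Subject 'AWS Content Sync - errors detected' `
        -Body ($Errors.Line -join "`n") -SmtpServer 'smtp.example.com'
}
```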

ServerShare.jpg
joblogging.jpg
  • Sync

    • The content sync began from the Core to AWS, then from AWS to the target site/server. Each content type (package / patch / image / driver) followed this process at different times of the day for different regions.

    • Each site task was executed as a job and therefore ran remotely on each target server. As the sync process ran, all job data was sent back to the core server and recorded in a log file (as described in the Logging section above). A sketch of both legs follows.
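A simplified sketch of the two legs using the AWS CLI's s3 sync (which is why a Python.exe process is what the kill switch targets later). The bucket name, paths, and server list are hypothetical; $Cred is the credential rebuilt in the security sketch.

```powershell
# Leg 1: Core -> AWS. Push the current content up to S3.
aws s3 sync 'D:\Content\Packages' 's3://contoso-content/packages' --delete

# Leg 2: AWS -> site. The pull runs as a remote job on each target server,
# so job output streams back to the core for logging. (Each site server
# sets its own AWS/proxy environment variables, per the Security section.)
$Sites = Get-Content '.\SiteServers.txt'
$Jobs  = Invoke-Command -ComputerName $Sites -Credential $Cred -AsJob -ScriptBlock {
    aws s3 sync 's3://contoso-content/packages' 'D:\Content\Packages' --delete
}
Receive-Job -Job $Jobs -Wait |
    Out-File -FilePath '\\core01\SyncLogs$\packages.log' -Append
```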

Sync.jpg
  • Scheduling

    • Scheduling was automated through Task Scheduler on the Core server and tuned to the ebb and flow of content in the environment. All content was synced daily, with the exception of images (a task-registration sketch follows this list).

    • You have heard of “Follow the Sun”; this was more like “Follow the Dark” (off hours).
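Registering one of these off-hours tasks with the ScheduledTasks cmdlets looks roughly like this; the task name, script path, and 1:00 AM trigger are hypothetical examples:

```powershell
# Register a daily off-hours sync task on the core server.
$Action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File D:\Scripts\Start-ContentSync.ps1 -Region AP'
$Trigger = New-ScheduledTaskTrigger -Daily -At '1:00AM'
Register-ScheduledTask -TaskName 'AWS Content Sync - AP Packages' `
    -Action $Action -Trigger $Trigger `
    -User 'DOMAIN\svc-sync' -Password '<password>'   # pulled from the encrypted store in practice
```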

Ivanti AWS Sync Schedule-3.png
  • Verification

    • We started having some sync issues at AP sites, so I decided to include a verification step at the end of the sync and alert on any discrepancy. This step takes a measurement on the core and compares it with the destination (sketched below).
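A sketch of that comparison. I have assumed file count plus total bytes as the measurement (cheap enough to run across 70 sites), and the server name is a hypothetical placeholder:

```powershell
# Measure a content tree: file count plus total bytes.
function Get-ContentMeasure ($Path) {
    $Files = Get-ChildItem -Path $Path -Recurse -File
    [pscustomobject]@{
        Count = $Files.Count
        Bytes = ($Files | Measure-Object -Property Length -Sum).Sum
    }
}

# Compare the core source against a site destination and alert on mismatch.
$Source = Get-ContentMeasure 'D:\Content\Packages'
$Dest   = Invoke-Command -ComputerName 'site-dp01' -Credential $Cred `
    -ScriptBlock ${function:Get-ContentMeasure} -ArgumentList 'D:\Content\Packages'

if ($Source.Count -ne $Dest.Count -or $Source.Bytes -ne $Dest.Bytes) {
    Send-MailMessage -To 'team@example.com' -From 'awssync@example.com' `
        -Subject 'Sync discrepancy: site-dp01' -SmtpServer 'smtp.example.com' `
        -Body "Core: $($Source.Count) files / $($Source.Bytes) bytes`nSite: $($Dest.Count) files / $($Dest.Bytes) bytes"
}
```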

verification.png
  • Kill Switch

    • With any process like this, there will come a time when you need to stop the distribution. Even though we are not traversing the WAN circuit, someone is bound to be affected. If you find yourself in that situation, it's a very simple process: kill the Python.exe process (the AWS client), as sketched below.
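A sketch of the kill switch, reusing the hypothetical server list and credential from the sync sketch:

```powershell
# Stop in-flight transfers by killing the python.exe process hosting the
# AWS CLI on every site server.
$Sites = Get-Content '.\SiteServers.txt'
Invoke-Command -ComputerName $Sites -Credential $Cred -ScriptBlock {
    Get-Process -Name 'python' -ErrorAction SilentlyContinue | Stop-Process -Force
}
```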

KillSwitch-script.png
KillSwitch.png
Next

Site Builder