The Problem
Our large telecom client was in the middle of a multi-year project migrating millions of customers to a new system. Despite having a data migration plan in place, for several weeks each month employees were waking up in the middle of the night to monitor processes, check for new files, run complex custom queries, transform files, and distribute results. Every task that could be easily automated already had been; what remained spanned multiple backend systems, applied complicated business rules that frequently changed, and ran on published schedules that themselves often changed.
The Solution
The Eigen X team built a solution that automated the remaining processes, monitored files against established patterns during peak processing times, and moved dozens of gigabytes of data in a single evening, all fully automated. The team authored C# code to automate legacy mainframe green-screen processes, built a custom C++ ODBC driver to pull data from a legacy system, and wrote a Java backend to manage schedules, queue jobs, and monitor directories and sites. Custom PowerShell and Perl scripts validated incoming and outgoing files against established business rules. Business users were given tools to control the overall process, including scheduling, email notifications, and specific file types.
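The file-validation step described above can be sketched in miniature. The client's actual rules were proprietary and the production scripts were PowerShell and Perl, so the example below is a hypothetical stand-in written in Java (the backend language mentioned above): it checks incoming file names against an assumed naming rule and accepts or rejects each one before further processing.

```java
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch of a business-rule validator for incoming files.
// The naming rule below (CUST_<yyyymmdd>_<3-digit batch>.dat) is an
// assumption for illustration, not the client's actual rule.
public class FileValidator {
    private static final Pattern NAME_RULE =
            Pattern.compile("CUST_\\d{8}_\\d{3}\\.dat");

    // Returns true only when the file name matches the assumed rule exactly.
    static boolean isValid(String fileName) {
        return NAME_RULE.matcher(fileName).matches();
    }

    public static void main(String[] args) {
        List<String> incoming = List.of(
                "CUST_20150301_001.dat",  // matches the rule
                "cust_20150301_001.dat",  // wrong case
                "CUST_2015_001.dat");     // truncated date
        for (String name : incoming) {
            System.out.println(name + " -> " + (isValid(name) ? "ACCEPT" : "REJECT"));
        }
    }
}
```

In a production pipeline a check like this would run before any file is queued for transformation or distribution, so malformed files are quarantined rather than propagated downstream.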
The Result
The custom ETL solution created by the Eigen X team saved key employees hundreds of hours of labor during overnight batch processing, improved accuracy by minimizing the chance of human keystroke errors, minimized expensive downtime in systems used by hundreds of people, and improved data quality by putting additional safeguards in place to catch potential errors before results were distributed outside the organization.