George Amartey Adjaidoo                

301-452-2385

gadjaidoo@gmail.com

https://www.linkedin.com/in/george-a-7861428b

 

 

 

 

Professional Summary:

·         Over 15 years of experience in database analysis, design, and development with SQL Server 2014/2012/2008/2005/2000 in production and support environments

·         Expert in SAS

·         Expert in shell scripting on Linux and Unix systems

·         Expert in PowerShell

·         Expert in Automation Solutions

·         Experience in High Availability / Disaster Recovery (HADR) solutions: Failover Clustering, AlwaysOn Availability Groups, Replication, Log Shipping, and Mirroring.

·         Experienced in upgrade/migration from MS SQL Server 2008 to SQL Server 2008 R2 and from SQL Server 2012 to SQL Server 2014.

·         Experience working with application teams to implement databases per the specifications of LOB applications.

·         Experienced in installation, configuration, maintenance, and optimization of SQL Server HADR solutions.

·         Experienced in backup and recovery, query optimization, and security.

·         Experienced in monitoring the health and performance of SQL Servers as well as home-grown applications using SQL Server native tools and several third-party solutions.

·         Adept at T-SQL: stored procedures, tables, indexes, views, functions, and triggers.

·         Expert in creating DTS packages and utilizing SSIS functionality

·         Strong knowledge of Relational Database Management Systems (RDBMS) concepts

·         Experienced in designing complex reports including sub reports and formulas with complex logic using SQL Server Reporting Services (SSRS)

·         Worked on activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses using SQL Server SSIS, including Change Data Capture, Slowly Changing Dimensions, etc.

·         Experienced in data import/export and data migration between homogeneous systems using Data Transformation Services (DTS) packages, the bulk copy utility (BCP), and BULK INSERT.

·         Experienced in Data Warehouse/Data Mart methodologies; Kimball vs. Inmon

·         Experienced in XML data exchange; XBRL

·         Experienced in loading XML into SQL Server databases using SSIS

·         Experienced in XML messaging

·         Experienced in shredding and writing XML using XQuery and XPath

·         Experienced in SQL Server Service Broker

·         Experienced in advanced T-SQL querying and programming: string manipulation and dataset manipulation such as dynamic PIVOT, UNPIVOT, CROSS APPLY, CTEs, cursors, WHILE loops, etc.

·         Experienced in advanced analytic functions: windowing functions in SQL Server (LEAD, LAG, PERCENTILE_CONT, ROW_NUMBER)

·         Expert in using DTA / Extended Events / PerfMon / SQL Profiler / scripts and other third-party tools to resolve bottlenecks and deadlocks
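
The windowing functions named above (LEAD, LAG, ROW_NUMBER) can be illustrated with a minimal sketch; the example below uses Python's sqlite3 with made-up sales data, and the LAG/ROW_NUMBER syntax shown matches T-SQL's OVER clause:

```python
import sqlite3

# Hypothetical sample data: monthly sales per region (not real client data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", 1, 100.0), ("East", 2, 150.0), ("East", 3, 120.0),
     ("West", 1, 90.0), ("West", 2, 110.0)],
)

# LAG fetches the prior month's amount within each region;
# ROW_NUMBER orders rows within each partition.
rows = conn.execute("""
    SELECT region, month, amount,
           LAG(amount)  OVER (PARTITION BY region ORDER BY month) AS prev_amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY month) AS rn
    FROM sales
    ORDER BY region, month
""").fetchall()
for r in rows:
    print(r)
```

The first row of each partition has a NULL prev_amount, which is the typical starting point for month-over-month delta calculations.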

Technical Skills:

Databases:                      SQL Server 2008/2012/2014

Business Intelligence:   SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS)

Reporting & OLAP:       SQL Server Reporting Services (SSRS), SSIS, Crystal Reports 9.0/8.5, OLAP, Erwin, Tableau, Power BI

IT Processes:                 Software Development Life Cycle (SDLC), Agile, Waterfall, Iterative

Methodologies:           SAFe, Agile, Waterfall, Scrum, SDLC

Management Tools:   Tortoise SVN, Visual SourceSafe (VSS), Team Foundation Server (TFS)

Software Applications: SQL Server Management Studio (SSMS), Microsoft Visual Studio 2008, Toad for SQL Server, MS Office, C++, C#, Python, Java, Bash (Unix shell), XBRL (eXtensible Business Reporting Language), XML, SAS 9.4, Arelle, EDX - Querying with Transact-SQL, Xcelerate Client Tool, Cygwin, Tortoise SVN, Talend, Tableau Desktop, Altova, Salesforce, Windows PowerShell

Hadoop Ecosystem:    HDFS, Map Reduce, Hive, Pig, HBase, Sqoop, Oozie, Elasticsearch.

Operating Systems:    Windows 2000/NT/XP/7, Windows Server 2003/2008

 

 

Education:

·         B.Sc. Accounting - University of Maryland, College Park, Maryland, August 2009

·         Associate of Arts Degree in Information Systems/Accounting, Montgomery College, Rockville, MD, 2002

 

Certifications:

·         Microsoft Certified Professional-Database

·         EDX (MOOC) - Querying with Transact-SQL Certified

·         Databricks Certified

 

 

Professional Experience:

Johns Hopkins (TIC), Maryland              September 2023 – January 2024

The Technology Innovation Center (TIC) is uniquely positioned within Johns Hopkins to help design, build, deploy and analyze novel enterprise, departmental, and clinical applications that strive to improve workflow, outcomes, and patient care.

 

Sr. Software Engineer – Precision Medicine

•     Designed ETL architecture from multiple sources, including the enterprise EMR (Epic), researcher-curated data, and other IT systems, to Databricks/ADF

•     Developed and modified data inputs, file/database structures, data transformations, algorithms, and data outputs using appropriate computer languages/tools to provide technical solutions for highly complex application development tasks

 

Morgan Stanley, Jersey City                    September 2022- August 2023

Morgan Stanley is an American multinational investment bank and financial services company headquartered at 1585 Broadway in Midtown Manhattan, New York City

 

Vice President SQL/.Net Dev

·         On-prem SQL Server to Snowflake cloud migration using Azure Data Factory, Terraform, and Snowflake Scripting.

·         Designed and developed Microsoft SQL database systems including stored procedures, views, functions, and indexes.

·         Ensured performance, security, and availability of databases.

·         Led the development of common database procedures: upgrade, backup/recovery, and migration.

·         Collaborated with application developers to balance application and database needs; implemented best practices for performance tuning.

·         Debugged complex and inefficient SQL code.

·         Worked with QA and other resources to address issues as they arose.

·         Provided leadership, direction and strategy to build, manage and scale a diverse software team.

·         Promoted best practices for software development, Dev-Ops, Agile adoption and practice.

·         Leader of onshore/offshore development teams.

·         Provided guidance, oversight and peer review to junior database developers.

·         Snowflake administration; created roles and users.

·         Developed Snowflake SQL scripts to automate batch CSV file ingestion into Snowflake.
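
As an illustration of the batch CSV ingestion pattern described above, a minimal Python sketch that generates Snowflake PUT / COPY INTO statements for a directory of files; the stage and table names are hypothetical placeholders, not the production objects:

```python
import tempfile
from pathlib import Path

def build_ingest_statements(csv_dir, stage="@csv_stage", table="RAW_CSV"):
    """Build Snowflake PUT / COPY INTO statements for every CSV in a
    directory. Stage and table names here are illustrative only."""
    stmts = [f"PUT file://{p.as_posix()} {stage} AUTO_COMPRESS=TRUE;"
             for p in sorted(Path(csv_dir).glob("*.csv"))]
    # One COPY picks up all staged files in a single bulk load.
    stmts.append(
        f"COPY INTO {table} FROM {stage} "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);"
    )
    return stmts

# Demo with a throwaway directory holding two small CSV files.
tmp = tempfile.mkdtemp()
for name in ("policies.csv", "trades.csv"):
    Path(tmp, name).write_text("id,val\n1,2\n")
stmts = build_ingest_statements(tmp)
for s in stmts:
    print(s)
```

In practice the generated statements would be executed through the Snowflake connector or SnowSQL as part of the batch job.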

 

 

 

Johns Hopkins (TIC), Maryland              June 2021- September 2022

The Technology Innovation Center (TIC) is uniquely positioned within Johns Hopkins to help design, build, deploy and analyze novel enterprise, departmental, and clinical applications that strive to improve workflow, outcomes, and patient care.

 

Sr. Software Engineer – Precision Medicine

•     Designed ETL architecture from multiple sources, including the enterprise EMR (Epic), researcher-curated data, and other IT systems, to Databricks/ADF

•     Developed and modified data inputs, file/database structures, data transformations, algorithms, and data outputs using appropriate computer languages/tools to provide technical solutions for highly complex application development tasks

·        Provide experienced leadership for strategic planning in designing and developing comprehensive innovative integrated solutions.

•    Developed SQL, PowerShell, Python scripts to automate & orchestrate ETL processes.

·         Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, and Spark; ingested data to Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.

·         Wrote Terraform scripts for Snowflake infrastructure cloud migration from Splunk, SQL Server, and Postgres databases.

·         Snowflake administration and development.

·         Developed PowerShell scripts to automate CSV transformation for Snowflake.

·         Developed Azure Data Factory pipelines to a Snowflake destination from a variety of sources: SAP, Splunk, SQL Server, and CSV files.

·         Experience migrating/converting data from legacy applications to a new target application in Snowflake environment.
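
The CSV transformation step mentioned above (cleaning extracts before loading them to a warehouse) can be sketched as follows; this is a generic, illustrative example with made-up column names, not the actual scripts:

```python
import csv
import io

def normalize_csv(text):
    """Normalize a CSV extract before loading: strip whitespace from
    cells, drop fully empty rows, and snake_case the header row."""
    reader = csv.reader(io.StringIO(text))
    rows = [[cell.strip() for cell in row] for row in reader]
    rows = [r for r in rows if any(r)]  # drop blank rows
    header = [h.lower().replace(" ", "_") for h in rows[0]]
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(header)
    writer.writerows(rows[1:])
    return out.getvalue()

# Hypothetical messy extract with stray spaces and a blank line.
raw = "Patient ID, Lab Value \n 101 , 4.2 \n\n102,5.0\n"
clean = normalize_csv(raw)
print(clean)
```

The same normalization could equally be written in PowerShell; the logic (trim, drop blanks, standardize headers) is what matters for a reliable bulk load.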

 

 

 

Unisys, Blue Bell, Pennsylvania             March 2020- January 2021

Unisys Corporation is an American global information technology company based in Blue Bell, Pennsylvania, that provides IT services, software, and technology.

Data Integration Engineer (Contract Role)

•             Design ETL architecture based on requirements from business analysts and report development teams.

•             Perform exploratory data analysis to understand gaps and issues in data sources.

•             Build ETL integrations to bring data into data warehouses (ServiceNow)

•             Cleanse and normalize data with master data management techniques

•             Developed SQL views, stored procs as a data source to feed POWER BI Dashboards.

•             Developed Power BI reports and dashboards from varied data sources including complex views and stored procs created in SQL Server.

•             Transform raw data into fact and dimension tables

•             Apply security best practices into data warehouse architecture

•             Developed SQL, PowerShell, Bash scripts to automate & orchestrate ETL processes to feed Power BI reports.

•             Documented the data integrations, transformations and schemas for steady state team

•             Assist with troubleshooting of ETL processes in Production

·         Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, and Spark; ingested data to Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.

·         Reverse engineered and developed C# scripts used with SSIS transformations in a .NET environment.
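
The fact/dimension transformation above follows standard star-schema practice: repeated attribute values move into a surrogate-keyed dimension table, and facts reference them by key. A minimal sketch (field names are hypothetical):

```python
def build_star(records, dim_key):
    """Split raw records into a dimension table (surrogate-keyed) and a
    fact table that references it by surrogate key."""
    dim, facts = {}, []
    for rec in records:
        val = rec[dim_key]
        sk = dim.setdefault(val, len(dim) + 1)  # assign surrogate key once
        fact = {k: v for k, v in rec.items() if k != dim_key}
        fact[dim_key + "_sk"] = sk
        facts.append(fact)
    dim_table = [{"sk": sk, dim_key: val} for val, sk in dim.items()]
    return dim_table, facts

# Hypothetical raw feed rows.
raw = [
    {"category": "hardware", "amount": 10},
    {"category": "software", "amount": 20},
    {"category": "hardware", "amount": 5},
]
dim_table, fact_table = build_star(raw, "category")
print(dim_table)
print(fact_table)
```

In a real warehouse this split is done in SSIS or SQL, but the key idea is the same: the dimension holds each distinct value once, and facts carry only the surrogate key.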

 

Ernst & Young, LLP Baltimore, MD                                                       January 2020- March 2020

EY provides advisory, assurance, tax and transaction services that help solve our client’s toughest challenges and build a better working world for all.

Manager - Data Modeler | Global Tax Platform Tax Services

·         Translate Business Requirements to Conceptual and Logical Data Models based on Global Tax Platform initiative

·         Recognize the need for a specific relational design – OLTP, OLAP, Data Warehouse – from a set of requirements and/or from a Logical Data Model

·         Participate in mapping of Logical to Physical Data Models and vice versa

·         Evaluate other Modeler’s logical and/or physical data models for adherence to modeling

·         Manage multiple assignments simultaneously, whether working independently or in teams

·         Identify, recommend and implement emerging Modeling and IT trends, tools and solutions

·         Review junior team members’ data models

·         Managed Azure DevOps repos; reviewed and approved pull requests and branches.

·         Involved in PI Planning using the Scaled Agile Framework (SAFe)

·         Migration of on-prem SQL Server to Azure SQL (cloud migration)

 

 

 

 

United Health Group, Eden Prairie, MN                                                             June 2019- December 2019

UnitedHealth Group is an American multinational health care and insurance company.

 

SQL Database Developer/ETL Developer (Contract Role)

·         Responsible for the end-to-end ETL process of SupLife renewals data to SQL Server by designing and developing SSIS packages

·         Support translation of business requirements and rules to technical system design.

·         Responsible for developing complex dynamic stored procedures in SQL Server on the back end to support PDF report generation

·         Developed an Automation Job Solution of backing up scripts and data from SQL Server to share drives using PowerShell

·         Design ETL process and Created jobs using UNIX shell scripts (BASH), Python, CMD line, Talend and SSIS

·         Developed dashboards and ad-hoc reports using MS Power BI and SSRS for senior management team for analysis.

·         Implementation of text file transformations using SED, AWK, GREP and other complex Shell Commands

·         Develop T-SQL Code to fulfill Business and Technical requirements for new Initiatives

·         Develop complex stored procedures and functions to aid the overall workflow efficiency and data integrity.

·         Develop SQL Code to support Tableau Report Development

·         Transfer Data from SQL Server to Hadoop using SQOOP.

·         Design, develop, and execute packages to migrate client data to Hadoop

·         Create ad-hoc reports to users in Power BI by connecting various data sources.

·         Used excel sheet, flat files, CSV files, SQL Server views to generate Tableau ad-hoc reports.

·         Develop DTS, SSIS, BCP to import & export data from flat files, Excel to MS SQL Server database.

·         Involved in all aspects of front-end design. (Power BI and Custom java-based applications)

·         Code reviews and performance tuning.

·         Build and deploy database scripts and packages from Development → Testing → Staging → Production.
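
The SED/AWK/GREP-style text transformations mentioned above boil down to a filter-rewrite-project pipeline, which can be sketched in Python with illustrative patterns (the patterns and sample lines are hypothetical, not the production files):

```python
import re

def transform_lines(lines):
    """Mirror a typical grep | sed | awk pipeline over a text extract:
    keep matching lines, normalize delimiters, project selected fields."""
    out = []
    for line in lines:
        if not re.search(r"\bACTIVE\b", line):    # grep: keep matching lines
            continue
        line = re.sub(r"\s+", ",", line.strip())  # sed: squeeze whitespace to commas
        fields = line.split(",")
        out.append(f"{fields[0]}|{fields[2]}")    # awk: print $1 and $3
    return out

# Hypothetical fixed-width extract: id, status, amount.
sample = ["1001  ACTIVE  250.00", "1002  LAPSED  100.00", "1003  ACTIVE  75.50"]
result = transform_lines(sample)
print(result)
```

On a Linux host the equivalent one-liner would be `grep ACTIVE file | awk '{print $1 "|" $3}'`; the Python version is just easier to unit-test and schedule.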

 

Wells Fargo Technology, Charlotte, NC                                                                      July 2019 - November 2019

(Compliance and Reporting Data Analytics)

Wells Fargo Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure footprint; provides information security; and enables continuous banking access through in-store, online, ATM, and other channels to Wells Fargo’s more than 70 million global customers.

SQL DBA / BI Developer (Contract Role)

·         Responsible for the end-to-end ETL process of Compliance and Reporting Data Analytics data to SQL Server by designing and developing SSIS packages using the .NET Framework (C#)

·         Responsible for developing complex analytical views in SQL Server on the back end to support Security Posture Insight (SPI), a front-end application that provides information on security controls and measures how secure an application or its supporting infrastructure is, focusing on applications with the highest overall security exposure or weakest security posture.

·         Worked extensively in Power BI: created tabular cubes, built data models for Power BI reports, and validated report data quality.

·         Implementation of text file transformations using SED, AWK, GREP and other complex Shell Commands

·         Responsible for installing, configuring, and maintaining SQL Server per supported LOB applications.

·         Responsible for providing SQL Server problem resolution and 24x7 production support and server outages.

·         Responsible for disaster recovery and high availability solutions and large-scale database migrations from SQL server 2005 to SQL server 2008R2 and 2012.

·         Created and migrated partially contained databases within AlwaysOn Availability Groups.

·         Created automation of day to day operations, critical SQL server alerts and repetitive tasks.

·         Monitored the health and performance of SQL, server instances and home-grown applications using Extended Events, SQL server Profiler, PerfMon and other third-party tools.

·         Tested newly built servers for performance impact with DB load simulators.

·         Setup and maintained Log Shipping between various environments.

·         Development/Test/Production QA which included documentation of processes, creation of test scenarios and execution of tests, via T-SQL scripts, to ensure UI standards met the required specifications.

·         Created necessary T-SQL scripts, stored procedures, tables, views, functions, triggers and indexes.

·         Operated and oversaw database and log backups and restoration, planning backup strategies, and scheduling.

·         Backed up master & system databases and restoring them when necessary.

·         Optimized the Indexes, adding/deleting them when necessary for better performance.

·         Verified backups and error logs on the servers and troubleshoot any failures or alert problems, and open trouble tickets if necessary to work with technicians from other departments.

·         Responsible for Installing, configuring and maintaining SQL failover clustering for production environments.

·         Worked with installation, configuration and creation of SSRS reports.

·         Decided whether clustering, log shipping, mirroring, replication, etc. are the right fit to solve a business problem.

·         Developed complex stored procedures and functions to aid overall workflow efficiency and data integrity.

 

 

U.S. Securities and Exchange Commission, Washington DC                                      June 2017- June 2019

The mission of the U.S. Securities and Exchange Commission is to protect investors, maintain fair, orderly, and efficient markets, and facilitate capital formation.

Senior Consultant for XML/XBRL, SQL Data DBA/Database Developer/ETL Developer (Contract Role)

·         Responsible for supporting the complete SDLC for IDAP.  IDAP is a web application for production support of XBRL filing information, Audit Analytics (“AA”) and other data sets.

·         Responsible for the Data extract from WRDS by developing SAS programs

·         Responsible for the end-to-end ETL process of SAS data to SQL Server by developing SAS programs and shell scripts to facilitate automation by initiating the process from a Linux command line.

·         Support translation of business requirements and rules to technical system design.

·         Responsible for Developing Complex Analytical Views in SQL Server on the back-end to Support Java Based APIs on the front-end

·         Maintain the Enterprise Data Warehouse and Data Marts for Office of Structured Data (OSD)

·         Design ETL process and Created jobs using UNIX shell scripts (BASH), Python, CMD line, Talend and SSIS

·         Implementation of text file transformations using SED, AWK, GREP and other complex Shell Commands

·         Analyze complex legacy code to improve and or make corrections for consistency and accuracy.

·         Build complex formulas by code to support business financial reporting

·         Develop and execute complex python and shell scripts, install and update real-time and continuous updates to Arelle and Arelle plugins, ongoing updates for Cygwin

·         Develop DTS, SSIS, BCP to import & export data from flat files, Excel to MS SQL Server database.

·         Analyze and perform data mapping which involves identifying source data field, identifying target entities and their lookup table ID’s, and translation rules.

·         Publish Workbooks by creating user filters so that only appropriate teams can view it.

·          Embedded Tableau views in to SharePoint and analyzed the source data and handled efficiently by modifying the data types.

·         Used excel sheet, flat files, CSV files, SQL Server views to generate Tableau and Power BI ad-hoc reports.

·         Generated tableau dashboards with combination charts for clear understanding.

·         Involved in all aspects of front-end design.

·         Responsible for deploying, scheduling jobs, alerting, and maintaining SSIS packages, and updating and maintaining Visual SourceSafe.

·         Build and deploy database scripts and packages from Development → Testing → Staging → Production.

·         Configured and maintained transactional and Snapshot replication for 3 environments.

·         Worked with SQL Disaster Recovery processes and documented/revised those processes.

·          Worked with capacity planning, database sizing and monitoring.

·         Created databases, tables, indexes, stored procedures and triggers, alerts and maintained user permissions.

·         Involved in creating DTS Packages to import and export operations in and out of SQL Servers. 

·          Applied monthly patches and hot fixes. Worked with application team to test the successful upgrades.

·         Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.

·         Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures.

·         Wrote Java code to capture global map variables and use them in automated Talend jobs.
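
The XML shredding described above (XQuery/XPath shred of filing data into relational tables) follows a simple pattern: parse the document, walk the repeating elements, and emit one flat row per element. An illustrative Python sketch with made-up element names and sample CIKs:

```python
import xml.etree.ElementTree as ET

def shred_filings(xml_text):
    """Shred an XML document into flat tuples ready for bulk insert,
    analogous to an XQuery/OPENXML shred. Element names are hypothetical."""
    root = ET.fromstring(xml_text)
    rows = []
    for filing in root.findall("filing"):
        rows.append((
            filing.get("cik"),            # attribute -> column
            filing.findtext("form"),      # child element -> column
            filing.findtext("period"),
        ))
    return rows

# Hypothetical filing feed.
doc = """<filings>
  <filing cik="0000320193"><form>10-K</form><period>2018-09-29</period></filing>
  <filing cik="0000789019"><form>10-Q</form><period>2018-12-31</period></filing>
</filings>"""
rows = shred_filings(doc)
print(rows)
```

Inside SQL Server the same shred would be expressed with `nodes()`/`value()` XQuery methods; the row-per-repeating-element shape is identical.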

 

US Department of the Treasury

Office of the Comptroller of the Currency, Washington DC                                           July 2016- May 2017

The OCC charters, regulates, and supervises all national banks and federal savings associations as well as federal branches and agencies of foreign banks. The OCC is an independent bureau of the U.S. Department of the Treasury.

SQL Database Developer/ETL Developer/Tableau Developer (Contract Role)

·         Support translation of business requirements and rules to technical system design.

·         Maintain the Enterprise Data Warehouse and Data Marts.

·         Create new packages and Support existing production SSIS Packages

·         Develop T-SQL Code to fulfill Business and Technical requirements for new and existing systems.

·         Develop complex stored procedures and functions to aid the overall workflow efficiency and data integrity.

·         Analyze complex legacy code to improve and or make corrections for consistency and accuracy.

·         Build complex formulas by code to support business financial reporting

·         Develop DTS, SSIS, BCP to import & export data from flat files, Excel to MS SQL Server database.

·         Analyze and perform data mapping which involves identifying source data field, identifying target entities and their lookup table ID’s, and translation rules.

·         Support the development of a data mart within a data warehouse to support BI reporting.

·         Develop and deploy reports using SQL Server Reporting Services

·         Involved in complete SSIS life cycle in creating SSIS packages, building, deploying and executing the packages to move the data from the Data Warehouse to Data mart.

·         Generated Dashboards with Quick filters, Parameters and sets to handle views more efficiently with Tableau 9.3

·         Publish Workbooks by creating user filters so that only appropriate teams can view it.

·          Embedded Tableau views in to SharePoint and analyzed the source data and handled efficiently by modifying the data types.

·         Used excel sheet, flat files, CSV files, SQL Server views to generate Tableau ad-hoc reports.

·         Generated Power BI dashboards with combination charts for clear understanding.

·         Build and deploy database scripts and packages from Development → Testing → Staging → Production.

 

US Department of Energy. Washington DC                                                                               Jan 2016- July 2016

The United States Department of Energy DOE is a Cabinet-level department of the United States Government concerned with the United States ‘policies regarding energy and safety in handling nuclear material and whose mission is to advance energy technology and promote related innovation in the United States.

SQL Database Architect /Developer (Contract Role)

·         Support translation of business requirements and rules to technical system design.

·         Support data migration and integration from Legacy Systems to new proprietary system via ETL/SSIS

·         Develop DTS, SSIS, BCP to import & export data from flat files, Excel to MS SQL Server database.

·         Integrated data into Salesforce applications.

·         Analyze and perform data mapping which involves identifying source data field, identifying target entities and their lookup table ID’s, and translation rules.

·         Develop a master inventory of all data fields to be captured in the new system, identifying data field type information, related system pages and requirements for data integration across business functions

·         Lead development of a logical and physical database model

·         Support the development of a data mart within a data warehouse to support BI reporting.

·         Develop and deploy reports using SQL Server Reporting Services

·         Develop DTS, SSIS, BCP to import & export data from flat files, excel to MS SQL Server database.

·         Involved in complete SSIS life cycle in creating SSIS packages, building, deploying and executing the packages to move the data to the EPIC system.

                            

THE INTERSOCIETAL ACCREDITATION COMMISSION (IAC), Ellicott City, MD              Oct 2014- Jan 2016

The Intersocietal Accreditation Commission (IAC) provides accreditation programs for vascular testing, echocardiography, nuclear/PET, MRI, diagnostic CT, dental CT, carotid stenting and vein treatment and management. The IAC programs for accreditation are dedicated to ensuring quality patient care and promoting health care and all support one common mission: Improving health care through accreditation®.

SQL Database Developer

·         Designing and building robust database structures that facilitate web application development and Business Intelligence reporting.

·         Work in collaboration with development leads and Product Development Manager to release regular and well-planned and executed upgrades to IAC systems using the AGILE/SAFe methodology

·         Design relational database structures in SQL Server that track data to the sixth normal form

·         Design custom database objects and methods for sharing data between disparate IAC databases and third-party providers by the use of XML, XSLT and SQL Server Service Broker

·         Troubleshoot and resolve database related issues.

·         Provide back-end support for online application and review system in SQL Server 2008 Standard and SQL Server 2012 Enterprise, making extensive use of advanced T-SQL and XPath statements

·         Created database applications that import XML data and shred it into the respective tables.

·         Continuous improvement to overall database performance; cleanup of stored procedures and functions.

·         Experienced and proficient in the design, construction and implementation of ETL

·         Created complex DTS Packages for Data Migration and Transformation from Oracle/Access/Excel Sheets using SSIS and Visual C#

·         Wrote robust code for new and existing systems using C# and SQL in a .NET environment.

 

Edgar Online, an RR Donnelley Company. Rockville, MD                                              October 2009 - Oct 2014

EDGAR® Online is a leader in the business information industry specializing in the extraction, packaging and distribution of public company information contained in SEC filings.

Senior XBRL Financial Developer and SQL Developer

·         Developed and deployed reports using SQL Server Reporting Services

·         Create and manage database objects such as stored procedures, functions, cursors, views and tables

·         Performed backup and restoration of databases

·         Developed SSIS packages to extract, transform and load data from various sources into databases

·         Identified and troubleshot database issues such as data types, data sizes, duplicates

·         Create a customized taxonomy document using a client's recent filing with the SEC as well as the US GAAP Taxonomy.

·         Ensure on time delivery of final approved XBRL (eXtensible Business Reporting Language) financial statements for customer.

·         Apply knowledge of current SEC requirements for evolving XBRL compliance requirements and address client questions related to their specific compliance requirements.

·         Managed the workflow process from Development to Quality Control to Final Review and Delivery to Client to subsequent Filing.

·         Worked in collaboration with Front End Developers to correct and trouble shoot any rendering issues.

·         Managed workflow processes with the off-shore team for the delivery of final XBRL packages.

·         Key individual in the training of off-shore teams in India and Sri Lanka.

·         Developed XBRL reports from HTML.

·         Mapped and tagged line items of financial statements, namely the Balance Sheet, Cash Flow Statement, Income Statement, and Statement of Shareholders’ Equity, using the latest U.S. GAAP taxonomy for creation of XBRL

·         Performance of Data Quality activities (Final Review) for translating financial statements to XBRL format.

·         Performed XBRL Taxonomy validations, negative value analysis and rendering prior to SEC filing.

·         Ensured that the XBRL instance document, Label linkbase, Reference linkbase, Calculation linkbase, Definition linkbase, and Presentation linkbase were all present prior to SEC filing.

 

Freddie Mac, McLean, VA                                                                                              May 2008- August 2008

Freddie Mac's mission is to provide liquidity, stability and affordability to the housing market.

Operational Risk Management Reporting Engineer (Intern)

·         Performed key controls testing in compliance with Sarbanes-Oxley (404, 302) that resulted in successful compliance and dramatically reduced operational risk in the Multi-Family business segment.

·         Documented aspects of the control environment for the various business segments (Multi Family, Single Family), providing senior management and the Board with an independent assessment of operational risk.

 

Lehman Brothers Bank, Gaithersburg, MD                                                         October 2003- October 2006

Mortgage Data Analyst

·         Analyzed financial information including operating statements and net worth statements to determine appropriate loan size, terms, and pricing

·         Reviewed various reports including appraisals, environmental documentation, and engineering reports to determine compliance with investor requirements.

·         Performed data extraction and analysis; utilized SQL Server applications/tools to query portfolio data and extract data as needed for reporting analysis/development or for customer delivery.

·         Analyzed data for reporting development as well as trend identification, event impact analysis, process measurement/improvement, and observation/summarization for senior management attention.

·         Conducted research and analysis and examined risk elements of portfolio or public data to present trends.