Professional profile: Software Engineer

Name: S. G.
Age: 31
Phone: Withheld
E-mail: Withheld
CV attachment: Withheld
CV category: Business Intelligence / Data Scientist / DWH
Preferred workplace: Wipro




Summary

Software Engineer

Experience

WORK EXPERIENCE
10/08/2015–05/02/2019 – Software Engineer
Wipro Technologies, Bengaluru (India)
· 3.5 years of professional experience across the SDLC (Software Development Life Cycle), covering
analysis of user requirements, design, implementation, debugging, testing, and deployment.
· Working knowledge of ETL tools, Teradata, UNIX, Oracle, Mainframes, Control M scheduler.
· Completed corporate training on HTML, JavaScript, and Core Java
· Strong technical, verbal, and written communication skills
· Experience delivering ETL and process training to new team members
· Independently perform complex troubleshooting, root-cause analysis and solution development.
· Able to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities and
flexibility in work schedules.
· Motivated team player with strong analytical and problem-solving skills, quick to grasp new material.

EDUCATION AND TRAINING
01/08/2011–01/06/2015 Bachelor of Engineering in Computer Science
JSS Academy of Technical Education, Bengaluru (India)
Related document(s): DegreeCertificate.pdf

10/08/2015–10/11/2015 Corporate Core Java training
Wipro Technologies, Bengaluru (India) 

Job-related skills 

Databases: Oracle, Teradata
ETL software: Informatica PowerCenter, DataStage, basics of Ab Initio
Languages: PL/SQL, UNIX shell scripting, PHP
Schedulers: Control-M, ESP
Other tools: SAP BO, Snowflake, AWS S3, WinSCP, PuTTY, Mainframes

ADDITIONAL INFORMATION
Projects

Project #3
Duration: Feb 2018 – Feb 2019
Project Name: Teradata Lift ’N Shift
Technologies involved: Ab Initio, Teradata, AWS, PHP, and Control-M
Team Size: 15
Client: Capital One, USA
Project Description: The Teradata Lift 'N Shift project aimed to migrate Capital One retail data from
a Teradata database to a Snowflake database hosted in the AWS cloud environment. Business logic is
developed with the Ab Initio tool to load data into the Teradata database.
The modification is to produce a file that is moved to AWS S3, a storage area in the cloud. From S3,
the data is processed and loaded into the Snowflake database, which is better suited to the cloud
environment.
Roles & Responsibilities
· Analyzed business requirements and implemented rules in Ab Initio and AWS.
· Identified nonpublic and public data and applied encryption to nonpublic data before moving it to
the cloud environment.
· Performed unit and integration testing of the code.
· Fixed code and data bugs in the existing application.
· Maintained and revised process documentation as required.
· Created the Control-M draft for implementation in all environments.
· Modified the web application in PHP to work with the new cloud database.
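The nonpublic-data handling described above can be sketched roughly as follows. This is an illustrative Python sketch only: the project itself used Ab Initio components and proper encryption, while this toy version uses one-way SHA-256 tokens, and the column names, salt, and row schema are all hypothetical.

```python
import hashlib

# Columns flagged as nonpublic (hypothetical names for illustration)
NONPUBLIC_COLUMNS = {"ssn", "account_number"}

def mask_nonpublic(row: dict, salt: str = "demo-salt") -> dict:
    """Return a copy of the row with nonpublic fields replaced by a
    one-way SHA-256 token, so only public data leaves in clear text."""
    masked = {}
    for col, value in row.items():
        if col in NONPUBLIC_COLUMNS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            masked[col] = digest[:16]  # shortened token, not reversible
        else:
            masked[col] = value  # public data passes through unchanged
    return masked

row = {"customer_id": 42, "ssn": "123-45-6789", "balance": 100.0}
out = mask_nonpublic(row)
```

The key design point is that classification happens per column before anything is staged for the cloud, so nonpublic values never leave the on-premise environment in readable form.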

Project #2
Duration: Nov 2016 – Jan 2018
Project Name: RMS Data Migration
Technologies involved: Informatica PowerCenter 10.2, WinSCP, PuTTY, Toad, Mainframe, Oracle
Team Size: 9
Client: US Bank, USA
Project Description: Recovery Management had to recover money from retail, consumer banking, and
mortgage customers, but the related information was scattered and not available in one common place.
The aim of the project was to keep all the related data in one place with the business logic applied.
We performed two types of data load: full and incremental. A full load was a one-time load from the
old systems to the new system, whereas an incremental load runs daily, weekly, monthly, or yearly as
required. Informatica PowerCenter was used as the ETL tool; we also generated audit reports with it.
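The full-load / incremental-load distinction above can be sketched in a few lines. This is an illustrative Python sketch, not the actual Informatica implementation; the row schema and the `updated` watermark column are assumptions.

```python
from datetime import date

# Source rows with a last-modified date (hypothetical schema)
source = [
    {"id": 1, "amount": 10, "updated": date(2017, 1, 5)},
    {"id": 2, "amount": 20, "updated": date(2017, 3, 1)},
    {"id": 3, "amount": 30, "updated": date(2017, 3, 2)},
]

def full_load(rows):
    """One-time load: take every row from the old system."""
    return list(rows)

def incremental_load(rows, watermark):
    """Periodic load: only rows changed after the last successful run."""
    return [r for r in rows if r["updated"] > watermark]

target = full_load(source)                           # initial migration
delta = incremental_load(source, date(2017, 2, 28))  # e.g. a monthly run
```

The watermark (last successful load date) is what makes the incremental load cheap: each run touches only rows modified since the previous run instead of re-reading the whole source.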

Roles & Responsibilities
· Analyzed and understood the business requirements
· Wrote queries with complex conditions for the SQL components used in the mappings
· Created complex mappings using unconnected and connected Lookup, Aggregator, and Router
transformations to populate target tables efficiently
· Created user-defined functions and reused them across mappings
· Prepared parameter files to keep the mappings dynamic
· Wrote shell scripts/UNIX commands to move/merge files, update the database, send emails, etc.,
which could be run pre- or post-session
· Developed audit trail reports using Informatica and shell scripting
· Prepared test case documents and performed unit testing
· Developed the Technical Approach Document, a client deliverable containing all the technical details
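A parameter file of the kind mentioned above keeps environment-specific values out of the mapping itself. The sketch below is a rough Python analogue only: Informatica uses its own parameter-file format, and the session name, keys, and values here are hypothetical.

```python
import configparser

# A tiny stand-in for an Informatica-style parameter file
PARAM_FILE = """
[Session.s_load_recovery]
$$SRC_SCHEMA = RMS_STAGE
$$TGT_TABLE = RECOVERY_FACT
$$LOAD_TYPE = INCREMENTAL
"""

def load_params(text, section):
    """Parse one session's parameters into a plain dict."""
    cp = configparser.ConfigParser()
    cp.optionxform = str  # preserve the $$NAME casing of the keys
    cp.read_string(text)
    return dict(cp[section])

params = load_params(PARAM_FILE, "Session.s_load_recovery")
# The mapping/SQL stays generic; the schema comes from the file
sql = f'SELECT * FROM {params["$$SRC_SCHEMA"]}.RECOVERY_SRC'
```

Because the mapping only references parameter names, the same mapping can run against development, test, and production by swapping the parameter file rather than editing the mapping.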

Project #1
Duration: Aug 2015 – Oct 2016
Project Name: New York Pipeline Safety Compliance
Technologies involved: DataStage, SAP BO, PuTTY, WinSCP
Team Size: 6
Client: National Grid, USA
Project Description: The NY Pipeline Safety Compliance project provided a reporting solution on the
Business Objects platform that allows National Grid to comply with NYS pipeline safety regulatory
requirements, and gives NYS regulators and compliance users access to pipeline safety audit and
compliance data. The solution extracts and translates data from a variety of disparate sources and
stores it in a data mart that supports the reporting solution.
The reporting solution enables National Grid compliance analysts to regularly review pipeline safety
compliance data and supports timely, efficient reporting of that data to NYS regulators during annual
audit periods.
The ETL solution and the NYS data mart act as the back end, while SAP BO is the front end.
Roles & Responsibilities
· Developed DataStage jobs
· Performed unit and integration testing of the jobs
· Fixed code and data bugs in the application
· Maintained and revised process documentation as required
· Provided inputs for the ESP scheduler
· Generated and tested reports in SAP BO
· Coordinated onsite/offshore work for the implementation

ExperienceLetter.pdf 

RELIEVING CUM SERVICE CERTIFICATE
Dear Satya Mounika G,
With reference to your letter of resignation, we hereby accept your resignation
from the services of the company.
Your service record is as follows:
You are relieved after the working hours on 05.02.2019 as per the terms of your
appointment.
Your accounts, if any, will be settled by our Accounts Department.
Name : Ms. Satya Mounika G
Designation : Software Engineer
Date of Joining : 10.08.2015
Date of Leaving : 05.02.2019
Reasons for Leaving : Resignation
We wish you all the best in your future endeavors.
Yours sincerely,
for Wipro Limited,
