Professional profile: Software Developer, Scala, Python 3+
Name: K. R. | Age: 36
Mobile/Phone: Confidential
CV attachment: Confidential | CV category: Developer / Web dev. / Mobile dev.
Preferred location: Europe
PROFESSIONAL SUMMARY:
• 9+ years of extensive IT experience in development, support, and implementation of software applications, including 4+ years of experience with Big Data tools and utilities.
• ETL experience across diversified Data Warehousing domains; worked on various Data Warehousing projects using Informatica PowerCenter 8.x/9.x/10 with sources and targets spanning relational tables (Oracle, SQL Server, Teradata) and flat files.
• Good understanding of Hadoop Distributed File System (HDFS) and its commands.
• Understanding of the MapReduce and Spark frameworks.
• Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
• Experience in designing ETL frameworks using Sqoop and Hive.
• Hands-on experience with major components of the big data Hadoop ecosystem, such as HDFS, Hive, Pig, Sqoop, MongoDB (NoSQL), Flume, Spark, and Oozie, on the Cloudera distribution.
• Experience in programming with PySpark/Python and Spark (Scala).
• Experience with AWS services (S3, EMR, Lambda, Step Functions, CloudWatch).
• Completed POCs for Kafka and Docker.
• Experience in UNIX Commands and UNIX Shell Scripting.
• Experience in PL/SQL Procedures, Functions, Triggers and Packages.
• Experience with the scheduling tool Appworx (Automic Workload Automation Suite).
• Experience in Oracle Forms & Reports (D2K) and COBOL programming with HTML and XML interactions.
• Experience working on Tableau.
• Agile and Waterfall Methodology.
• Excellent problem-solving skills and a strong technical background; a result-oriented team player with excellent communication and interpersonal skills.
• Strong Communication, Presentation and Analytical Skills. Able to learn new technologies and Business Terminology quickly.
• Application support after deployment, as well as for existing products and services.
• Experience working as a team lead (L3): coordinating with business stakeholders for requirement gathering and design, and allocating tasks to resources.
TECHNICAL SKILLS:
Big Data Ecosystems: Hadoop HDFS, Hive, Pig, Sqoop, Cloudera, MongoDB, Spark, Flume, Oozie, YARN
Languages: PL/SQL, SQL, Unix Shell script, Python 3+, Scala
Database: Oracle, Sql-Server, Teradata, MongoDB
Scheduler: Appworx
Tools: Informatica, SQL Developer, Toad, Teradata SQL Assistant, Eclipse, PyCharm, Jupyter Notebook
Other Tools: Git (Bitbucket), TortoiseSVN, Jira, WinSCP, PuTTY, PVCS (Version Control Manager)
OS: Windows 7/8/10, CentOS (Cloudera), Ubuntu
Cloud: AWS (S3, Lambda, Step Functions, EMR)
EXPERIENCE DETAILS:
Lead Software Engineer, IMPETUS, Gurgaon | April 2019 – Present
Working for the Marsh (insurance) client on a Big Data project.
Role:
• Building a data ingestion pipeline to integrate data into the data lake.
• Loading data and meta files into S3 via Informatica, then processing them on an EMR cluster into the main Hive layer warehouse.
• Creating a catalog for auditing and meta-file capture, and ingesting it into MongoDB via PySpark jobs.
• Creating Lambda and Step Functions for end-to-end delivery of new functionality.
• Building with Maven for release to higher environments for deployment (Bitbucket), with post-deployment validation.
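To illustrate the audit-catalog step above, here is a minimal Python sketch of deriving a catalog document from a parsed meta file, of the kind that might then be ingested into MongoDB via a PySpark job. All field names and the S3 key are hypothetical illustrations, not the project's actual schema.

```python
import json
from datetime import datetime, timezone

def build_catalog_record(meta: dict, s3_key: str) -> dict:
    """Derive an audit-catalog document from a parsed meta file.

    NOTE: field names ("row_count", "business_date", ...) are
    hypothetical examples, not the real project schema.
    """
    row_count = int(meta.get("row_count", 0))
    return {
        "source_file": s3_key,
        "row_count": row_count,
        "business_date": meta.get("business_date"),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "status": "LOADED" if row_count > 0 else "EMPTY",
    }

# Meta-file content as it might arrive alongside a data file in S3.
meta = json.loads('{"row_count": "1250", "business_date": "2021-03-31"}')
record = build_catalog_record(meta, "s3://datalake/raw/policies/policies.csv")
```

In the real pipeline this dictionary would be written out with a MongoDB client (or the Spark MongoDB connector) rather than kept in memory.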
ETL Developer, SRS Business Solution, Pune | Feb 2018 – March 2019
Working for the Symantec Corporation client on a Data Warehouse project. The project covers enhancement, customization, and implementation of the business information flow for their different products into the EDW.
Role:
• Design and implementation of ETL processes using Informatica.
• Loading data into Teradata via Informatica and BTEQ.
• Reading data from ODS Oracle or flat files.
• Conducting code reviews and implementing best practices in the project.
• Releasing to higher environments for deployment.
Project Lead/Senior Software Engineer, L&T Infotech, Pune | Feb 2015 – Feb 2018
Client: Genuine Parts Company, a U.S. automotive parts service organization engaged in the distribution of automotive replacement parts, industrial replacement parts, office products, and electrical/electronic materials.
Role:
• As part of the data ingestion team, creating Sqoop commands and jobs to ingest data from various sources such as Oracle and SQL Server into HDFS, then importing it into Hive.
• Automating with shell scripting and using Oozie for workflows.
• Analyzing and transforming stored data by writing Hive and Pig jobs based on business requirements.
• Transforming semi-structured data (XML & JSON) into structured form using Hive.
• Bringing different log files into HDFS using Flume, then transforming and processing them via Spark (Python/Scala).
• Interacting with the client daily to understand their requirements.
• Participated in multiple big data POCs to evaluate different architectures, tools, and vendor products.
• Designed and developed mappings and workflows using Informatica with varied transformation logic, extracting data from various sources involving relational tables (Oracle, SQL Server, Teradata) and flat files.
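The semi-structured-to-structured transformation above was done in Hive; as a language-neutral illustration of the idea, the `flatten` helper below (hypothetical, not project code) turns nested JSON into flat, dotted key/value pairs that map naturally onto the columns of a structured table.

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested JSON into dotted column names."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))  # descend into nested objects
        else:
            flat[name] = value  # leaf value becomes a column
    return flat

# A semi-structured order record, as it might land in HDFS.
row = flatten({"order": {"id": 1, "customer": {"name": "Ann"}}, "qty": 2})
# → {"order.id": 1, "order.customer.name": "Ann", "qty": 2}
```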
Application Engineer, Atos Worldline, Mumbai | June 2012 – Feb 2015
Atos Worldline specializes in electronic payment services (issuing, acquiring, terminals, card and non-card payment solutions and processing), eCS (eServices for customers, citizens and communities), as well as services for financial markets.
Worked on an e-payment services project for Axis, IDBI, JNK, Syndicate, SBI, and YES Bank, covering debit/credit card transaction details in Oracle D2K.
Role:
• Developed forms, reports, Oracle PL/SQL stored procedures, functions, packages, and SQL scripts to support the functionality of various modules.
• Developed text reports using the UTL_FILE utility.
• Extracted source information to the DB layer using Informatica ETL and SQL*Loader.
• Maintained existing functionality as requirements changed, and prepared technical approach notes used in development.
Assistant System Engineer, TCS, Mumbai | Aug 2010 – Jun 2012
PROJECT NAME: State Bank of India – Core Banking Project.
Tata Consultancy Services (TCS) was awarded the prestigious project of supplying the Core Banking Application System, B@ncs-24, to State Bank of India. This comprehensive solution delivers a fully functional centralized core banking system for SBI. Currently over 13,000 branches of SBI and its associates are on Core, making it the largest centralized banking solution in the world.
Role:
• Developed COBOL programs to ingest input data from B@ncs-24 via HTML and XML interactions.
• Developed new functionality as per requirements and maintained existing functionality as requirements changed.
• Extracted ad-hoc reports using PL/SQL.
• Prepared test cases and test data, executed them, and verified the results; deployed to higher environments.
• Involved in documenting user requirements, technical specifications, and reporting manuals.
EDUCATION:
Examination | Institute | Aggregate (%) | Year of Passing
---|---|---|---
B.Tech (Computer Science and Engineering) | College of Engg. and Technology, Bhubaneswar (Orissa) | 80.1 | 2006 – July 2010
HSC | M.G.M Higher Secondary School, Bokaro (Jharkhand) | 83.4 | March 2005
SSC | D.A.V Public School, Bandhabahal (Orissa) | 79.8 | March 2003
EXTRAS:
• Received school certificates for high academic scores.
• Won prizes and certificates for finishing in the top 3 at cultural festivals.
• Won prizes and certificates in drawing and quiz competitions.
• Interests: fitness, outings, movies, music, dance, food.
PERSONAL DETAILS:
DOB: 19 September, 1988
Gender: Male
Marital Status: Single
Languages Known: English, Hindi