
Sr ETL Developer Resume


Houston, TX

SUMMARY

  • Over 13 years of experience in Data Warehouse life cycle implementations (analysis, design, testing, and production), Decision Support Systems, MDM implementations, client/server applications, Big Data, Machine Learning, and predictive models.
  • Expertise in business domains such as Sales, IS Retail, Insurance, Finance, Health, Pharma & Telecom
  • Experienced in Relational Databases (RDBMS) and data modeling; designed and built data warehouses and data marts using Star Schema and Snowflake Schema for faster, more effective querying. Implemented the Ralph Kimball methodology in designing DWH and Enterprise DWH systems. Used Erwin & Microsoft Visio extensively as data modeling tools.
  • Expert in implementing MDM (Master Data Management) solutions using various methodologies, along with IDQ (Data Quality), Informatica Cloud Customer 360 (CC360), Metadata Manager, Informatica DVO, PowerExchange for SAP, and PowerExchange for Lotus Notes
  • Have worked on Big Data technologies such as Hadoop, Hive, Pig, and Cassandra, and used Informatica to load data into a clustered Hadoop environment
  • Have worked on SAP BODS (Data Services) for data migration/conversion and for building other interfaces.
  • Expert in other ETL tools such as Talend and Pentaho Data Integrator.
  • Expert in providing solutions for ETL design, data cleansing, data conversion, and batch management.
  • Expertise in Data Warehouse environments, extracting, transforming, and loading data using Informatica PowerCenter 10.1/9.6.1/9.5.1/9.1.0/8.6.1/8.5/8.1.1/7.1.3/6.2/6.1/5.1.2 (Designer, Workflow Manager, Workflow Monitor), PowerConnect, PowerExchange, Data Quality, Data Explorer, Informatica MDM, and Teradata V2R5.
  • Expert in implementing integration solutions for ERP systems such as SAP R/3 & SAP BI using Informatica PowerConnect/PowerExchange as source and target systems. Used BAPI, IDocs, DMI & BCI extensively to upload/download data to/from SAP systems. Also expert in the LSMW tool for posting data into SAP R/3; worked on SD & MM.
  • Expert in integrating SCM products such as DP, IO, MP, and Servigistics
  • Expert in building interfaces using Business Objects Data Services.
  • Developed integration layers between ERP systems: SAP to i2, i2 to SAP, PeopleSoft to SAP & SAP to PeopleSoft.
  • Implemented OBIEE/Siebel Analytics/ETL/data warehouse projects
  • Worked on databases such as Oracle, SQL Server 2005, Teradata, DB2 UDB, Sybase, and MySQL, and on file formats such as flat files, XML/XSD, JSON/SOAP web services, and Excel.
  • Expert in mapping performance optimization in Informatica.
  • Experienced in PowerCenter Change Data Capture (CDC), capturing database changes and forwarding them to PowerCenter.
  • Worked extensively on slowly changing dimensions, Type I & Type II
  • Good experience with MLoad, TLoad, FLoad & FastExport scripts in Teradata.
  • Automated and scheduled Informatica jobs using UNIX shell scripting (crontab), AutoSys, Control-M, Tivoli, and Perl.
  • Proficient with Perl modules: DBI (Oracle & MySQL), Date, HTML, LWP, CGI, and object-oriented Perl
  • Experience in writing stored procedures, functions (PL/SQL), and triggers in Oracle.
  • Worked on developing reports using Crystal Reports and Oracle Suite 10g.
  • Highly motivated team player with excellent presentation and interpersonal skills, always willing to work in challenging and cross-platform environments.
  • An organized self-starter requiring minimal supervision and possessing keen analytical, leadership, and communication skills
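The Type II slowly-changing-dimension handling mentioned above is normally implemented inside Informatica mappings; as a rough, language-neutral illustration of the Type II logic only, here is a minimal Python sketch (field names such as cust_id, eff_date, and is_current are invented for the example):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked_cols, today=None):
    """Type II SCD logic: when a tracked attribute changes, expire the
    current row (set end date, clear current flag) and insert a new version."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        old = current.get(row[key])
        if old is None:
            # New key: insert the first version
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif any(old[c] != row[c] for c in tracked_cols):
            # Tracked attribute changed: expire old version, add new one
            old["end_date"], old["is_current"] = today, False
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
    return dimension
```

In a real mapping this corresponds to a Dynamic Lookup on the dimension key plus an Update Strategy that routes rows to update (expire) or insert targets.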

TECHNICAL SKILLS

ETL Tools: Informatica PowerMart/PowerCenter 10.1/9.5/9.1/8.6/8.5.1/8.1.1/7.1.4/6.x/5.1, PowerConnect, Informatica PowerExchange, Informatica Data Quality (IDQ), Informatica MDM (Master Data Management), Informatica Cloud Customer 360, Business Objects Data Services 4.x, Informatica BDM, PowerExchange B2B, Informatica Cloud, ICERT cloud technologies, etc.

Databases: Oracle 7.x/8.x/9i/10g/11g, MS SQL Server, Sybase, DB2, Teradata V14/13/12

Database Utilities/Tools: SQL*Plus, SQL*Loader, Toad, SQL Navigator, Rapid SQL, PL/SQL Developer, SQL Advantage, Teradata V2R6

OLAP/DSS Tools: COGNOS (Impromptu 6.1, Transformer, and PowerPlay 6.1), Business Objects 6.5.1 (Supervisor, Designer, Business Objects)

Languages/ Packages: C, C++, Java, Visual Basic, VB-Script, DHTML, HTML, Perl, SQL, PL/SQL, Shell Scripting (Csh, Ksh)

Operating Systems: UNIX, Windows XP /2000/NT /98 / 95

Data Modeling Tools: ERWIN 6.0

Big Data: Hadoop, Pig, Hive, Cassandra, Spark, etc.

PROFESSIONAL EXPERIENCE

Confidential

Sr ETL Developer

Responsibilities:

  • Worked with upstream teams to identify the list of fields that were logically or structurally impacted or required conversion
  • Created new ETL interfaces to load data from Salesforce to non-Salesforce systems and vice versa
  • Extracted documents (PDF, XML, email, etc.) from AWS using Python
  • Supported the business during critical times by providing data for ad-hoc requests
  • Worked closely with data modelers to create standards and guidelines for data modeling and ETL processes
  • Reviewed and approved functional specifications/technical designs for all impacted and new ETLs, and created UAT/SIT documents for them after the required changes were complete.
  • Built reports using Tableau and updated extracts on an ongoing basis; also developed Python scripts to automate the extracts
  • Performed data lineage, business glossary, data profiling, data analysis, scorecarding, etc. in Data Quality
  • Developed business services around the Google API for address lookup in Informatica Cloud
  • Developed and published web services using Informatica ICERT (REST and XML)
  • Created Salesforce lookup/merge transformations to look up source data and delete duplicates.
  • Communicated the impact analysis to downstream systems so the changes could be handled on the reporting end
  • Developed Informatica code, reviewed code written by team members, and approved it per the standard guidelines document
  • Fine-tuned ETL mappings with bottlenecks to improve performance.
  • Held daily meetings with team members to resolve open issues/tickets and show-stoppers to meet project plan deadlines.
  • Worked with business teams and various architects to understand the data model and the business processes
  • Designed the Informatica master data CC360 processes, CC360 model objects, Beans and Master Beans, etc.
  • Implemented CC360 product functionality to support MDM application design within SFDC
  • Led offshore teams in designing and developing/configuring CC360 functionality
  • Applied experience with ETL, MDM, data integration, and data modeling to provide best practices for master data and CC360 system design
  • Documented the design of complex Master Data Management solutions, including data modeling, source-to-target mappings, MDM architecture, etc.
  • Elicited technical and design-related requirements
  • Performed extensive data profiling and data analysis
  • Developed Python scripts for predictive data models and processing on an AWS server
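The Salesforce lookup/merge step above (match source records against existing ones and drop duplicates) can be sketched in plain Python; the key field "id" and the record shapes are illustrative, not the actual object model:

```python
def merge_dedupe(source_rows, lookup_rows, key):
    """Mimic a lookup/merge step: match source rows against an existing
    lookup set on a key and drop rows already present (or repeated)."""
    seen = {r[key] for r in lookup_rows}
    merged, dropped = [], 0
    for row in source_rows:
        if row[key] in seen:
            dropped += 1          # duplicate of an existing or earlier record
        else:
            seen.add(row[key])    # also dedupe within the source batch itself
            merged.append(row)
    return merged, dropped
```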

Environment: Informatica CC360, Informatica 10.1, Informatica IDQ, Informatica Cloud, Salesforce, ICERT, SQL Server on AWS, AWS, Python 3.6, Machine Learning, Analyst

Confidential, Houston, TX

Sr ETL Developer

Responsibilities:

  • Worked with upstream teams to identify the list of fields that were logically or structurally impacted or required conversion
  • Created a ripple query to identify impacted fields across all ETL/DB objects and compared the results with manual analysis to identify gaps
  • Created a new ETL interface to load data from the EDW into the PCAAT database system
  • Worked closely with data modelers to create standards and guidelines for data modeling and ETL processes
  • Reviewed and approved functional specifications/technical designs for all impacted and new ETLs, and created UAT/SIT documents for them after the required changes were complete.
  • Involved in designing the CDW (various levels of staging) from ultimate source to target, and created standard guidelines for Informatica mappings
  • Extensively used the Informatica Metadata Manager tool to identify data flow using data lineage, and created Metadata Manager services.
  • Extensively used Informatica PowerExchange for CDC.
  • Used data lineage from Metadata Manager to audit the flow of data from source to target.
  • Extensively used Informatica Metadata Manager and Business Glossary for impact analysis.
  • Involved in loading PowerCenter resources and relational DB schemas into the metadata repository
  • Created Salesforce lookup/merge transformations to look up source data and delete duplicates.
  • Communicated the impact analysis to downstream systems so the changes could be handled on the reporting end
  • Worked extensively with offshore teams and evaluated the timely code changes made as part of the conversion.
  • Developed Informatica code, reviewed code written by team members, and approved it per the standard guidelines document
  • Fine-tuned ETL mappings with bottlenecks to improve performance.
  • Held daily meetings with team members to resolve open issues/tickets and show-stoppers to meet project plan deadlines.
  • Loaded data into a 6-node Hadoop cluster using Informatica PWX for Hadoop, and also uploaded data files from the UNIX environment into the Hadoop cluster.
  • Worked with the Informatica admin team to upgrade the Informatica repository from 8.6 to 9.5, and tested objects after the upgrade.
  • Developed mapplets in DQ that run rule validations across source systems and report on data quality, and imported the mapplets into PowerCenter
  • Cleansed data in DQ using Address Doctor for the US and Canada
  • Developed web services
  • Configured the MDM hub; involved in data modeling; developed data mappings (landing, staging, and base objects) for pricing and control across trading commodities
  • Developed data validation and match/merge rules, customized user exits, and configured BDD/IDD
  • Created MDM batch processes and SIF
  • Developed BPIM jobs on Informatica Cloud and integrated Salesforce and other internal applications to build BPIM
  • Used Perl to automate batch jobs
  • Used Perl to connect to databases to update control tables, log tables, etc., and for file handling and file processing
  • Debugged Perl code, and converted Perl data processing code into Informatica ETL
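The control-table/log-table updates above were done in Perl via DBI against the project databases; the same pattern is sketched below in Python with SQLite purely to keep the example self-contained (table and column names are invented):

```python
import sqlite3
from datetime import datetime, timezone

def record_batch_run(conn, job_name, status, rows_loaded):
    """After a batch job runs, upsert its control-table row and append an
    immutable log entry -- the pattern the Perl automation followed."""
    ts = datetime.now(timezone.utc).isoformat()
    conn.execute("""CREATE TABLE IF NOT EXISTS job_control
                    (job_name TEXT PRIMARY KEY, last_status TEXT,
                     last_run_ts TEXT)""")
    conn.execute("""CREATE TABLE IF NOT EXISTS job_log
                    (job_name TEXT, status TEXT, rows_loaded INTEGER,
                     run_ts TEXT)""")
    # Upsert the single control row for this job...
    conn.execute("""INSERT INTO job_control VALUES (?, ?, ?)
                    ON CONFLICT(job_name) DO UPDATE SET
                    last_status=excluded.last_status,
                    last_run_ts=excluded.last_run_ts""",
                 (job_name, status, ts))
    # ...then append a history row to the log table
    conn.execute("INSERT INTO job_log VALUES (?, ?, ?, ?)",
                 (job_name, status, rows_loaded, ts))
    conn.commit()
```

The `ON CONFLICT ... DO UPDATE` upsert requires SQLite 3.24+; in Perl/Oracle the equivalent would be a `MERGE` statement issued through DBI.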

Environment: Informatica 10.1, Informatica IDQ, Informatica Cloud, Salesforce, Informatica MDM 9.5, Oracle 11g, ECC 6, Perl 5.2, Big Data - Hadoop 2.X, Informatica BDM, PowerExchange for B2B 10.1, ICERT, etc.

Confidential, The Woodlands, TX

Sr ETL Developer

Responsibilities:

  • Executed and documented the data process model and maturity per the business functional design.
  • Designed, architected, and implemented data conversion strategies, data integration, batch design, etc.
  • Generated data analysis and quality reports using Informatica Data Exchange & Data Quality tools.
  • Proposed ETL functionality and architectures for different master data migrations.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Generated and installed ABAP program/SAP R/3 code using Informatica 8.1.
  • Created mappings using Function Modules, BAPI, LSMW, and IDocs.
  • Performed data cleansing activities such as data enhancement, parsing, matching, and consolidation.
  • Implemented web services and XML interfaces using Business Objects Data Services
  • Implemented and demonstrated SAP MDM / non-SAP MDM solutions and interfaces
  • Developed the integration from i2 ERP to SAP and vice versa for SD objects.
  • Loaded data into SAP 5.0 using IDocs, LSMW, BAPI, and Function Modules.
  • Processed IDocs manually in SAP 5.0.
  • Used BCI connect and extracted data into SAP through Informatica and Oracle to load into systems such as ANORM, AFECC, & MIDSTREAM applications.
  • Supported the MM & SD functional consultants in gathering and documenting requirements. Strong understanding of SD & MM modules.
  • Used different methods such as BAPI, IDoc, Function Modules & DMI in Informatica to post/extract data between the legacy system and SAP for objects such as Sales Orders, Returns, Customers, Materials, and Vendors, as well as BW cubes.
  • Developed ABAP programs (reports) to upload/download data.
  • Developed LSMWs to migrate the processed data from Informatica to SAP R/3 for objects such as EDI partner profiles, text, Sales Orders, and POs.
  • Delivered ETL solutions on time per business requirements.
  • Developed rules in DQ; performed profiling at the column, table, and multiset levels and address validation using Address Doctor
  • Developed mapplets in DQ that run rule validations across source systems and report on data quality, and imported the mapplets into PowerCenter
  • Cleansed data in DQ using Address Doctor for the US and Canada
  • Developed web services
  • Configured the MDM hub; involved in data modeling; developed data mappings (landing, staging, and base objects) for enterprise data such as Customers, Materials, Vendors, routings, BOMs, etc.
  • Developed data validation and match/merge rules, customized user exits, and configured BDD/IDD
  • Converted a .NET script that pulls data from Lotus Notes into Informatica ETL
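The column-level profiling done in IDQ above (rule validations reporting on data quality) boils down to per-column statistics; a minimal Python sketch of that idea, with invented record and column names:

```python
def profile_column(rows, column):
    """Column-level profiling of the kind done in IDQ: null count,
    distinct count, and a completeness percentage for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "column": column,
        "total": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "completeness_pct": round(100.0 * len(non_null) / len(values), 1)
                            if values else 0.0,
    }
```

A scorecard is then just these dictionaries collected across columns and compared against thresholds.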

Environment: Informatica 10.1, Informatica IDQ, Informatica MDM 9.5, Oracle 11g, Informatica CDC, Windows 2003, SAP BODS 4.2, SAP ECC 6, BIW 7, UC4, PowerExchange for B2B 10.1

Confidential, San Diego CA

Sr ETL Developer

Responsibilities:

  • Executed and documented the data process model and maturity per the business functional design.
  • Designed, architected, and implemented data conversion strategies, data integration, batch design, etc.
  • Generated data analysis and quality reports using Informatica Data Exchange & Data Quality tools.
  • Proposed ETL functionality and architectures for different master data migrations.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Generated and installed ABAP program/SAP R/3 code using Informatica 8.1.
  • Created mappings using Function Modules, BAPI, LSMW, and IDocs.
  • Performed data cleansing activities such as data enhancement, parsing, matching, and consolidation.
  • Implemented web services and XML interfaces using Business Objects Data Services
  • Implemented and demonstrated SAP MDM / non-SAP MDM solutions and interfaces
  • Developed the integration from i2 ERP to SAP and vice versa for SD objects.
  • Loaded data into SAP 5.0 using IDocs, LSMW, BAPI, and Function Modules.
  • Processed IDocs manually in SAP 5.0.
  • Used BCI connect and extracted data into SAP through Informatica and Oracle to load into systems such as ANORM, AFECC, & MIDSTREAM applications.
  • Supported the MM & SD functional consultants in gathering and documenting requirements. Strong understanding of SD & MM modules.
  • Used different methods such as BAPI, IDoc, Function Modules & DMI in Informatica to post/extract data between the legacy system and SAP for objects such as Sales Orders, Returns, Customers, Materials, and Vendors, as well as BW cubes & SAP APO cubes.
  • Developed ABAP programs (reports) to upload/download data.
  • Developed LSMWs to migrate the processed data from Informatica to SAP R/3 for objects such as EDI partner profiles, text, Sales Orders, and POs.
  • Delivered ETL solutions on time per business requirements.
  • Involved in meetings with business users to gather requirements and understand the business
  • Helped business users define source file formats that would be easier to process for data migration
  • Implemented proof-of-concept solutions and presented them to the client
  • Worked on performance improvements in Informatica when loading data into SAP
  • Migrated ETL objects from the Dev server to the Test server, from Test to the UVT server, and then to production

Environment: Informatica 8.6.1/9.1.0, Informatica IDQ, Oracle 11g, Informatica CDC, Windows 2003, JDA SCM Suite 6.1, SAP ECC 6, BIW 7, SAP BODS 4.0

Confidential, Findlay, OH

Sr ETL Developer

Responsibilities:

  • Led the team in proposing load strategies and preparing the architecture for handling the data objects; prepared effort proposals for building the load strategies.
  • Worked on ZI2ROI, a repository for integration between SAP ECC and JDA planning systems
  • Worked on functional requirements; created the functional design documents, obtained approvals, etc.
  • Designed, architected, and implemented data conversion strategies, data integration, batch design, etc.
  • Worked on designing the TDs, the delivery schedule, unit testing, and defect handling during test life cycles.
  • Developed a capability maturity model & management process framework.
  • Developed a quantitative maturity assessment & scoring model.
  • Executed and documented the data process model and maturity per the business functional design.
  • Generated data analysis and quality reports using Informatica Data Exchange & Data Quality tools.
  • Proposed ETL functionality and architectures for different master data migrations.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Generated and installed ABAP program/SAP R/3 code using Informatica 9.1.
  • Created mappings using Function Modules, BAPI, LSMW, and IDocs.
  • Performed data cleansing activities such as data enhancement, parsing, matching, and consolidation.
  • Built the interfaces for the SAP MDM solution
  • Developed the integration from i2 ERP to SAP and vice versa for SD objects.
  • Loaded data into SAP ECC 6 using IDocs, LSMW, BAPI, and Function Modules.
  • Processed IDocs manually in SAP ECC 6.
  • Used BCI connect and extracted data into SAP through Informatica and Oracle to load into systems such as ANORM, AFECC, & MIDSTREAM applications.
  • Supported the MM & SD functional consultants in gathering and documenting requirements. Strong understanding of SD & MM modules.
  • Used different methods such as BAPI, IDoc, Function Modules & DMI in Informatica to post/extract data between the legacy system and SAP for objects such as Sales Orders, Returns, Customers, Materials, and Vendors, as well as BW cubes & SAP APO cubes.
  • Developed field mappings, mappings, and workflows in Business Objects Data Services.
  • Developed ABAP programs (reports) to upload/download data.
  • Developed LSMWs to migrate the processed data from Informatica to SAP R/3 for objects such as EDI partner profiles, text, Sales Orders, and POs.
  • Developed rules in DQ; performed profiling at the column, table, and multiset levels and address validation using Address Doctor
  • Developed mapplets in DQ that run rule validations across source systems and report on data quality, and imported the mapplets into PowerCenter
  • Cleansed data in DQ using Address Doctor for the US and Canada
  • Developed web services
  • Configured the MDM hub; involved in data modeling; developed data mappings (landing, staging, and base objects) for enterprise data such as Customers, Materials, Vendors, routings, BOMs, etc.
  • Developed data validation and match/merge rules, customized user exits, and configured BDD/IDD
  • Created MDM batch processes and SIF

Environment: Business Objects Data Services 4.x, Informatica 9.1.0, Informatica IDQ, Oracle 11g, Informatica CDC, Windows 2003, JDA SCM Suite 6.1, SAP ECC 6, BIW 7, SAP BODS 4.0

Confidential, Lewisville, TX

ETL Developer

Responsibilities:

  • Developed and deployed ETL job workflows with reliable error/exception handling and a rollback framework.
  • Involved in designing the logical data model from the technical design documents and translating it into a physical data model using ERwin/ER/Studio.
  • Developed ODS logical and physical data models.
  • Extensively used the Maestro third-party tool to schedule Informatica jobs, and cleansed and scrubbed data using Trillium.
  • Participated in business analysis sessions to obtain requirements for the data model.
  • Designed, architected, and implemented data conversion strategies, data integration, batch design, etc.
  • Extensively developed scripting for customized Data Quality functions using DataFlux, and improved error handling functionality.
  • Extensive work on data profiling of source and target systems for data quality and better data loading.
  • Using CDC, data extraction takes place at the same time the insert, update, or delete operations occur in the source tables, and the change data is stored inside the database in change tables. The change data thus captured is then made available to the target systems in a controlled manner.
  • Handled security & user management in the Informatica environment. Managed multiple Informatica environments, which involved maintaining users with specific roles, privileges, and appropriate folder permissions.
  • Actively interacted with the business users, Project Manager, Data Architect, DBA, and UNIX admin for better management of production resources.
  • Involved in creating Java transformations to get object data and transform it into the staging area.
  • Extensively used XML Source Qualifier, Midstream XML Parser, and Midstream XML Generator transformations, as our main source type was XML files.
  • Extensively used Dynamic Lookup and Update Strategy transformations to update slowly changing dimensions.
  • Converted ETL jobs designed in SSIS/DataStage into Informatica ETL
  • Worked with sources such as DTS packages (connections, tasks, and workflows) that can be used to access, transform, and manipulate a wide range of sources, including text files and relational databases, and load them into the warehouse.
  • Worked on version control in Informatica to maintain and control multiple versions of an object
  • Created and used different tasks such as Decision, Event Wait, Event Raise, Timer & E-mail
  • Used UNIX scripting and scheduled pmcmd to interact with the Informatica server from command mode.
  • Created database triggers, stored procedures, and exceptions, and used cursors to perform calculations when retrieving data from the database.
  • Used Perl scripting to automate batch jobs
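The CDC description above (changes captured into change tables as the source inserts/updates/deletes occur, then forwarded in a controlled manner) can be illustrated with a small sketch. Real CDC reads the database log; the snapshot diff below only shows the shape of the resulting change rows, with invented keys and columns:

```python
def capture_changes(before, after, key):
    """Diff two snapshots of a source table into change-table rows,
    tagging each with the operation (I/U/D) the way CDC would."""
    old = {r[key]: r for r in before}
    new = {r[key]: r for r in after}
    changes = []
    for k, row in new.items():
        if k not in old:
            changes.append({"op": "I", **row})   # row appeared: insert
        elif row != old[k]:
            changes.append({"op": "U", **row})   # row differs: update
    for k, row in old.items():
        if k not in new:
            changes.append({"op": "D", **row})   # row vanished: delete
    return changes
```

Downstream, the target system applies these tagged rows in order instead of reloading the full table.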

Environment: Informatica Power Center 9.1/8.6.1/8.x, PowerExchange, IDQ, Java, J2EE, Erwin 6.0, Maestro, Netezza, DB2 UDB, DataFlux, ClearCase, SQL Server 2005, T-SQL, DTS Packages, Teradata V2R6, BTEQ, SQL*Loader, XML, ER/Studio, Cognos 8.3, Unix Solaris 10, Perl

Confidential, Houston

ETL Developer

Responsibilities:

  • Led the team in proposing load strategies and preparing the architecture for handling the data objects; prepared effort proposals for building the load strategies.
  • Developed a capability maturity model & management process framework.
  • Developed a quantitative maturity assessment & scoring model.
  • Executed and documented the data process model and maturity per the business functional design.
  • Designed, architected, and implemented data conversion strategies, data integration, batch design, etc.
  • Generated data analysis and quality reports using Informatica Data Exchange & Data Quality tools.
  • Proposed ETL functionality and architectures for different master data migrations.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Generated and installed ABAP program/SAP R/3 code using Informatica 8.1.
  • Created mappings using Function Modules, BAPI, LSMW, and IDocs.
  • Performed data cleansing activities such as data enhancement, parsing, matching, and consolidation.
  • Implemented web services and XML interfaces using Business Objects Data Services
  • Implemented and demonstrated SAP MDM / non-SAP MDM solutions and interfaces
  • Developed the integration from i2 ERP to SAP and vice versa for SD objects.
  • Loaded data into SAP 5.0 using IDocs, LSMW, BAPI, and Function Modules.
  • Processed IDocs manually in SAP 5.0.
  • Used BCI connect and extracted data into SAP through Informatica and Oracle to load into systems such as ANORM, AFECC, & MIDSTREAM applications.
  • Supported the MM & SD functional consultants in gathering and documenting requirements. Strong understanding of SD & MM modules.
  • Delivered ETL solutions on time per business requirements.
  • Involved in meetings with business users to gather requirements and understand the business
  • Helped business users define source file formats that would be easier to process for data migration
  • Implemented proof-of-concept solutions and presented them to the client
  • Worked on performance improvements in Informatica when loading data into SAP
  • Migrated ETL objects from the Dev server to the Test server, from Test to the UVT server, and then to production.
  • Used Perl to automate batch jobs

Environment: Informatica 8.6.1/9.1.0, Informatica IDQ, Oracle 11g, Informatica CDC, Windows 2003, JDA SCM Suite 6.1, SAP ECC 6, BIW 7, Perl

Confidential, CA

ETL Consultant

Responsibilities:

  • Worked with business analysts to identify appropriate sources for the data warehouse and to document business needs for decision support data.
  • Developed and deployed ETL job workflows with reliable error/exception handling and a rollback framework.
  • Involved in designing the logical data model from the technical design documents and translating it into a physical data model using ERwin/ER/Studio.
  • Developed ODS logical and physical data models.
  • Extensively used the Maestro third-party tool to schedule Informatica jobs, and cleansed and scrubbed data using Trillium.
  • Participated in business analysis sessions to obtain requirements for the data model.
  • In-depth knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of phases such as requirements, analysis/design, development, and testing.
  • Extensively developed scripting for customized Data Quality functions using DataFlux, and improved error handling functionality.
  • Extensive work on data profiling of source and target systems for data quality and better data loading.
  • Interacted with data modeling and data support teams to get the enterprise data for high-level specs.
  • Involved in gathering attributes from different existing models and writing high-level specifications for every data requirement.
  • Liaised with business users and business analysts to clarify requirements and walk through data models.
  • Installed and configured Informatica version 8.6.1, and migrated Informatica jobs from the older version to the new version using version control.
  • Loaded the load-ready file into a Teradata table using Teradata ODBC/BTEQ, FastLoad, and MultiLoad connections. The load date and load time are also captured in the Teradata table.
  • Used the pushdown optimization option to increase system performance.
  • Worked with the CDC option, which captures DB changes and forwards them to PowerCenter for further processing.
  • Oracle CDC will only consume changes to base tables. Oracle Capture can capture SQL inserts, updates, and deletes made to the underlying base tables that support the materialized view, because that is what is logged to the Oracle redo log.
  • Using CDC, data extraction takes place at the same time the insert, update, or delete operations occur in the source tables, and the change data is stored inside the database in change tables. The change data thus captured is then made available to the target systems in a controlled manner.
  • Handled security & user management in the Informatica environment. Managed multiple Informatica environments, which involved maintaining users with specific roles, privileges, and appropriate folder permissions.
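The load-ready file described above (records stamped with load date and load time before a Teradata utility consumes them) can be sketched as follows; the pipe delimiter and LOAD_DATE/LOAD_TIME column positions are illustrative, and the BTEQ/MultiLoad step itself is not shown:

```python
import csv, io
from datetime import datetime

def build_load_ready(rows, fieldnames, ts=None):
    """Append load-date/load-time audit columns to each record and emit a
    pipe-delimited load-ready file for a Teradata utility to consume."""
    ts = ts or datetime.now()
    load_date = ts.strftime("%Y-%m-%d")
    load_time = ts.strftime("%H:%M:%S")
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|", lineterminator="\n")
    for row in rows:
        # Source columns in declared order, then the two audit columns
        writer.writerow([row[f] for f in fieldnames] + [load_date, load_time])
    return buf.getvalue()
```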

Environment: Informatica Power Center 8.6.1/8.x, PowerExchange, Java, J2EE, Erwin 6.0, Maestro, Netezza, DB2 UDB, DataFlux, ClearCase, SQL Server 2005, T-SQL, DTS Packages, Teradata V2R6, BTEQ, SQL*Loader, XML, ER/Studio, Cognos 8.3, Unix Solaris 10.

Confidential

ETL Consultant

Responsibilities:

  • Led the team in proposing load strategies and preparing the architecture for handling the data objects; prepared effort proposals for building the load strategies.
  • Developed a capability maturity model & management process framework.
  • Developed a quantitative maturity assessment & scoring model.
  • Executed and documented the data process model and maturity per the business functional design.
  • Generated data analysis and quality reports using Informatica Data Exchange & Data Quality tools.
  • Proposed ETL functionality and architectures for different master data migrations.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Delivered ETL solutions on time per business requirements.
  • Involved in meetings with business users to gather requirements and understand the business
  • Helped business users define source file formats that would be easier to process for data migration
  • Implemented proof-of-concept solutions and presented them to the client
  • Worked on performance improvements in Informatica when loading data into SAP
  • Migrated ETL objects from the Dev server to the Test server, from Test to the UVT server, and then to production.

Confidential, Irving, TX

ETL Consultant

Responsibilities:

  • Led the team in proposing load strategies and preparing the architecture for handling the data objects; prepared effort proposals for building the load strategies.
  • Worked on Customer Objects for cluster 1 along with manufacturing plants data like Glasgow.
  • Involved in Extraction, Transformation and Loading of data using Informatica.
  • Extracted data from SAP 4.6, DB2, Flat files and Loaded into SAP 5.0.
  • Created complex reusable transformations and mapplets using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer, and Mapplet Designer, respectively.
  • Created reusable Sessions and Commands in the Workflow Manager.
  • Created and implemented variables and parameters in the Mapping Designer.
  • Created reusable Worklets in Worklet Designer and set up Workflows in the Workflow Manager.
  • Extracted incremental data (CDC) from source systems using Informatica PowerExchange.
  • Resolved performance bottlenecks to improve the performance of sessions and workflows.
  • Wrote SQL overrides in the Source Qualifier per the business requirements.
  • Used workflow manager for creating, validating, testing and running the Sequential and Concurrent batches and sessions and scheduling them to run at specified time with required frequency.
  • Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
  • Used Unix Scripting and Scheduled PMCMD to interact with Informatica Server from command mode.
  • Created database Triggers, Stored Procedures, Exceptions and used Cursors to perform calculations when retrieving data from the database.
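The UNIX scripting with pmcmd mentioned above typically wraps one command line per workflow; a small Python helper that assembles that command for a scheduler is sketched below. The service/domain/folder/workflow names are invented, and the exact flag set should be checked against the pmcmd reference for the installed version:

```python
def pmcmd_startworkflow(service, domain, user, password, folder, workflow,
                        wait=True):
    """Assemble the pmcmd command line used to kick off an Informatica
    workflow from a shell scheduler (argument-list form for subprocess)."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-p", password,
           "-f", folder]
    if wait:
        cmd.append("-wait")      # block until the workflow completes
    cmd.append(workflow)
    return cmd
```

In practice this list would be passed to `subprocess.run(...)` and the return code checked, since pmcmd signals workflow failure through its exit status.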

Environment: Informatica 8.5, 8.6 Informatica: PowerConnect for SAP R3, SAP CRM; Power Exchange, Webservices, Data Stencil, Data Quality IDQ, Data Exchange IDE, Data Profiling. DB2 8.1, Toad 9.1, SAP R3 ECC 5.0 & Oracle 9i & 10g, UNIX 5.0 and Windows, Leech, Oracle client 10g, MS office 2003, Business Objects XI, SAP Netweaver, Siperian Master Reference Manager MRM, i2 DM, SCP, Oracle Data Integrator ODI

Confidential, MI

ETL Consultant

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation
  • Imported various heterogeneous and homogenous sources to load data to the different interfaces and Data warehouse using Informatica 7.1 based on the business requirement.
  • Created Complex mappings using Unconnected Lookup, Aggregate, Normalizer and Router transformations for populating target table in efficient manner.
  • Involved in developing and testing of ETL, Informatica mappings, workflows, sessions using Informatica.
  • Extracted data from PeopleSoft 8, flat files & Oracle for SD & FI objects and loaded it through Informatica.
  • Involved in writing procedures, functions in PL/SQL.
  • Extensively worked on Siebel Analytics, SAP r/3 source system (ABAP Functional module)
  • Extensively worked on BAPI mappings and RFC enabled functional modules using Informatica Power connect adaptor
  • Used different methods like BAPI, IDoc, Function Modules & DMI in informatica to post the data from Legacy system to SAP system for the objects like Sales Orders, Returns, Customers, Materials and Vendors etc & also BIW Cubes & SAP BIW APO Cubes.
  • Developed ABAP Programs to upload the data.
  • Extensively used transactions such as BD87, SE10, SE11, SE16, SE37, SE38, WE05, WE19, WE20, etc.
  • Developed LSMW’s to migrate the processed data from Informatica to SAP R3 for objects like EDI Partner profiles, Text, Sales Orders, POs.
  • Extensively Worked on UNIX.
  • Written Unix Shell Scripting based on requirements.
  • Designed sessions using Workflow Manager and monitored using Workflow Monitor
  • Scheduled jobs using Unix based on schedule criteria provided
  • Involved in Unit testing of Mappings and Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL over ride in source qualifier, session partitions for improving performance.

Environment: Informatica 7.1, Erwin 4.1, Oracle 9i, Ms Access, UNIX Shell Scripts, XML, SAP

Confidential

Responsibilities:

  • Involved in writing SQL Stored procedures to access data from Oracle 7.3.
  • Developed SQL queries to check the database.
  • Creation of Triggers, Stored Procedures, Tables, Indexes, Rules, Defaults etc.
  • Interaction with users and getting the enhancement requirements.
  • Implementation and maintenance of the product in client sites.
  • Performance tuning of SQL queries and Stored Procedures.
  • Involved in conducting the Functional, Regression and Integration testing.
  • Involved in stress and volume testing of the application. Developed Informatica objects for countries such as the USA, NL, Australia, Poland & the UK.
  • Preparation of Technical Design document, UTC, UTP, Test Data, Actual Results of different interfaces.
  • Interaction with the client & offshore team on various issues.
  • Preparation of various technical design documents and support spreadsheets.
  • Migrating ETL Objects from Development server to Test Server, Test Server to UVT Server, UVT Server to Production Server.
  • Extensively used Informatica to load source data from file systems to the ODS and to SAP BW, R/3 & POSDM using Informatica PowerConnect.

Environment: Informatica 7.1, Erwin 4.1, Oracle 9i, Ms Access, UNIX Shell Scripts, XML, SAP
