
JAGANNATH REDDY POGULA


Subbarao


Summary:

• Overall 7 years of strong IT experience in Databases, Data Warehousing & Decision Support Systems.
• Strong experience in implementing the Extraction, Transformation & Loading (ETL) life cycle using Informatica Power Center / Power Mart 8.1/7.1/6.x/5.1.
• Certified Informatica Power Center Administrator and Developer.
• Good understanding of SQL and BTEQ in Teradata, Oracle, SQL Server and MS Access as RDBMS, with procedures, functions and SQL*Plus.
• Extensively worked on performance tuning and error handling of the BTEQ, MultiLoad and FastLoad utilities in Teradata.
• Involved in data analysis for source and target systems, with a good understanding of the data.
• Good understanding of warehousing concepts: staging tables, dimensions, facts and star schemas.
• Experience in application design, data extraction, data acquisition, data modeling, development, implementation and testing of data warehousing and database business systems.
• Heavily interacted with business users to gather requirements, design tables, standardize interfaces for receiving data from multiple operational sources, and define strategies for loading the staging area and data marts. Dealt with Type 1/Type 2/Type 3 loads (a Type 2 load sketch follows this summary).
• Good working knowledge of dimensional data modeling, star schema/snowflake schema, fact & dimension tables, physical & logical data modeling, and ERwin 4.x/3.x.
• Implemented an Operational Data Store framework between source systems and the target warehouse.
• Experience in integrating various data sources such as Teradata, Oracle, SAP R/3, SQL Server, flat files, COBOL files and XML files.
• Worked on Cognos front-end reports using Query Studio and Report Studio.
• Used Cognos Framework Manager to edit metadata contents.
• Resolved product issues. Used features such as pre-session/post-session commands/SQL to fine-tune databases/data marts, mappings, sessions and various parameters in Oracle and Informatica for optimal performance.
• Extensive knowledge of identifying user requirements, designing systems, writing program specifications, and coding and implementing systems.
• Hands-on experience in scheduling UNIX scripts, Teradata BTEQ scripts, Informatica workflows and Cognos cube builds using Redwood Cronacle.
• Good knowledge of Teradata macros, BTEQ scripts and utilities such as MultiLoad and FastLoad.
• Extensively worked on creating and executing test procedures, test cases and test scripts using manual and automated methods.
• Highly motivated, with the ability to work effectively in teams as well as independently.
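
A minimal sketch of the Type 2 load pattern referenced above, written in generic Teradata/ANSI SQL. The table and column names (CUSTOMER_DIM, CUSTOMER_STG and their columns) are illustrative assumptions, and surrogate-key generation is omitted for brevity.

    /* Type 2 SCD load sketch (illustrative object names only).              */
    /* Step 1: expire the current dimension rows whose attributes changed.  */
    UPDATE CUSTOMER_DIM
    SET    CURR_FLAG  = 'N',
           EFF_END_DT = CURRENT_DATE - 1
    WHERE  CURR_FLAG = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   CUSTOMER_STG STG
                   WHERE  STG.CUSTOMER_ID = CUSTOMER_DIM.CUSTOMER_ID
                     AND (STG.CUSTOMER_NAME <> CUSTOMER_DIM.CUSTOMER_NAME
                          OR STG.REGION_CD  <> CUSTOMER_DIM.REGION_CD));

    /* Step 2: insert a new current version for changed rows and for new customers. */
    INSERT INTO CUSTOMER_DIM
          (CUSTOMER_ID, CUSTOMER_NAME, REGION_CD, EFF_START_DT, EFF_END_DT, CURR_FLAG)
    SELECT STG.CUSTOMER_ID,
           STG.CUSTOMER_NAME,
           STG.REGION_CD,
           CURRENT_DATE,
           DATE '9999-12-31',
           'Y'
    FROM   CUSTOMER_STG STG
    LEFT JOIN CUSTOMER_DIM DIM
           ON DIM.CUSTOMER_ID = STG.CUSTOMER_ID
          AND DIM.CURR_FLAG   = 'Y'
    WHERE  DIM.CUSTOMER_ID IS NULL;          /* no current row: new or just expired */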

Technical Skills:

    ETL Tools:          Informatica 8.1/7.1/6.x/5.1
    OLAP & Reporting:   Cognos ReportNet 7 series, Business Objects 5, XI
    Data Modeling:      Erwin 4.0/4.5
    RDBMS:              Oracle 8.0/9i/10g, Teradata V2R5/V2R6, SQL Server 7.0/2000/2005, MS Access
    Languages:          C, UNIX Shell Scripting
    Operating Systems:  Windows 98/2000/NT/ME/CE/XP, UNIX, Linux, Sun Solaris
    Methodologies:      Star Schema, Snowflake Schema

Professional Experience:

GE Healthcare BI, Milwaukee, WI                                          Dec 2006 – Present

Marketing Knowledge Database (MKD)

The project is aimed at creating a central repository for multiple external market-research data sources (flat files, both fixed-width and delimited) and at providing reporting capability for the Global Marketing team to analyze GEHC's market share, segments and customer-growth opportunities. The reports (ad hoc and standard) are accessed through Business Objects XI. The Operational Data Store framework is used to integrate non-ERP customer data into the Business Intelligence solutions, giving visibility of data across the different marketing data sources for the same customers, as well as to any existing finance data.

Responsibilities:

• Participated in user meetings and gathered business requirements and specifications for the data-warehouse design. Translated the user inputs into ETL design documents.
• Extensively involved in data modeling using ERwin to design the data warehouse.
• Designed the ETL architecture to process a large number of files and created high-level and low-level design documents.
• Extensively involved in data profiling using Informatica to analyze the source systems.
• Extensively worked on Informatica CDC logic to process the delta data.
• Extensively worked on flat-file sources and created UNIX shell scripts for FTP, for generating list files to load multiple files, and for archiving the files after the loads complete.
• Created Informatica mappings to extract data from Oracle and flat files and load it into the staging area.
• Developed complex Informatica mappings, tasks and workflows using the Workflow Manager to load the data marts.
• Extensively worked on BTEQ scripts and shell scripts to move data from the staging area to the target data warehouse (a BTEQ sketch follows this section).
• Implemented the Operational Data Store framework between the source systems and the target warehouse.
• Implemented a new interface-table mechanism to integrate various types of sources.
• Extensively worked on performance tuning of BTEQ scripts in Teradata.
• Extensively worked on error handling in BTEQ, MultiLoad and FastLoad.
• Involved in error handling, performance tuning of mappings, testing of stored procedures and functions, testing of Informatica sessions, and validation of the target data.
• Extensively used Aggregator, Filter, Joiner, Expression, Source Qualifier, Lookup, Rank, Router and Update Strategy transformations. Used the Debugger to test the mappings and fix bugs.
• Scheduled workflows, BTEQ scripts and UNIX shell scripts using Redwood Cronacle.
• Carried out defect analysis and fixed bugs raised by the users.
• Documented the existing mappings as per the design standards followed in the project.
• Performed unit testing at various stages by checking the data manually.
• Tuned the performance of mappings by requesting new indexes and optimizing the code.
• Sound theoretical and practical background in the principles of operating systems and multi-threaded applications.
• Carried out defect analysis and logged defects using Mercury Quality Center and the Clarify tool.

Environment: Informatica Power Center 8.1.0/7.1.3, Teradata V2R6, Business Objects XI, UNIX, Toad and Windows XP.
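
Below is a minimal sketch of the kind of BTEQ script used to move data from the staging area to the warehouse, with the basic error handling described above. The logon file, databases and table names are illustrative assumptions, not taken from the project.

    /* BTEQ sketch: load staged rows into the warehouse target, with error handling. */
    /* The logon file and object names below are illustrative assumptions.           */
    .RUN FILE = /home/etl/.tdlogon;

    DATABASE EDW_STG;

    /* Move the day's staged rows into the warehouse target. */
    INSERT INTO EDW_TGT.SALES_FACT
    SELECT * FROM EDW_STG.SALES_STG;

    .IF ERRORCODE <> 0 THEN .GOTO LOAD_FAILED;

    /* Clean up the stage table only if the insert succeeded. */
    DELETE FROM EDW_STG.SALES_STG;

    .IF ERRORCODE <> 0 THEN .GOTO LOAD_FAILED;
    .QUIT 0;

    .LABEL LOAD_FAILED
    /* Non-zero return code so the scheduler (e.g. Redwood Cronacle) flags the failure. */
    .QUIT 8;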

GE Healthcare, Milwaukee, WI                                             June 2005 – Dec 2006

BI Production and Migration Support

The project has jobs scheduled round the clock, and the team focuses on job monitoring, case management, production fixes and reviews, root-cause analysis of failures, and administrative activities for the BI production environment. The team also handles the migration, scheduling and execution of loads of BI-related objects (Informatica, Cronacle and DPA) for the GE Healthcare BI division. Per the GEHC process, BI objects developed by the application teams must first be migrated to the QA environment (staging area) and tested extensively there to ensure they are qualified for migration to production.

Responsibilities:

• Extensively used the Workflow Monitor to monitor Informatica jobs.
• Monitored and scheduled jobs and fixed bugs in production using Cronacle.
• Implemented event dependencies between scheduled jobs in Cronacle.
• Carried out defect analysis and logged defects using Mercury Quality Center and the Clarify tool.
• Worked on reviews and deployment of code from the QA environment to the production environment.
• Extensively worked on fixing production data issues by testing in the QA environment.
• Worked on performance tuning of long-running jobs in Informatica and Teradata.
• Worked extensively on tuning long-running BTEQ scripts in production.
• Worked extensively on the MultiLoad and FastLoad utilities in Teradata (a FastLoad sketch follows this section).
• Extensively tested new code in the dev and QA environments before moving it to production.
• Involved in Informatica administration activities.
• Involved in creating and managing folders in repositories using the Informatica Repository Manager.
• Involved in granting user access and managing privileges in Informatica.
• Involved in starting the schedulers and services related to the Informatica and Cronacle servers.
• Resolved Clarify and Support Central cases from users.
• Ensured the completion of jobs and communicated with users in case of delays or failures.
• Fully involved and alert during the nightly process monitoring; resolved a variety of issues during production support, ensuring smooth completion of the data-loading process for the production system.
• Introduced new deployment strategies, such as using deployment groups for Informatica objects.
• Uploaded all release-related documents into CVS.
• Resolved and debugged issues arising during execution of job chains, in Redwood for environment-related issues and in the Informatica repository for Informatica-related issues.
• Provided end-user training and reviewed the Informatica standards documents for smooth knowledge transition.

Environment: Informatica Power Center 8.1.0/7.1.3, Cognos 7, Business Objects 5, Oracle 9i, Teradata V2R5, Erwin 4.5, Star Schema, PL/SQL, UNIX, Toad and Windows XP.
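
A minimal FastLoad sketch of the kind of Teradata utility script referred to above, showing the error tables used for error handling. The logon, delimiter, file path and object names are illustrative assumptions, not taken from the project.

    /* FastLoad sketch: bulk-load a delimited flat file into an empty stage table. */
    /* The logon, file path and object names below are illustrative assumptions.  */
    LOGON tdprod/etl_user,etl_password;

    DATABASE EDW_STG;

    SET RECORD VARTEXT "|";

    DEFINE CUSTOMER_ID   (VARCHAR(18)),
           CUSTOMER_NAME (VARCHAR(100)),
           REGION_CD     (VARCHAR(10))
    FILE = /data/incoming/customer.dat;

    /* The two ERRORFILES tables capture conversion and uniqueness errors for review. */
    BEGIN LOADING EDW_STG.CUSTOMER_STG
          ERRORFILES EDW_STG.CUSTOMER_ET, EDW_STG.CUSTOMER_UV
          CHECKPOINT 100000;

    INSERT INTO EDW_STG.CUSTOMER_STG
           (CUSTOMER_ID, CUSTOMER_NAME, REGION_CD)
    VALUES (:CUSTOMER_ID, :CUSTOMER_NAME, :REGION_CD);

    END LOADING;
    LOGOFF;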

NJ HealthCare, New Jersey, USA                                           Feb 2004 – May 2005

NJ Health Care is a middleware-technology integration project focusing on integrating various NJ Health Care applications, such as the manufacturing products system, purchase-order maintenance and the sales workbench. IBM WebSphere Business Integration Server (formerly CrossWorlds) and Informatica were used as the enabling technology for integration between heterogeneous systems based on Oracle Applications, mainframe, flat files, web, Siebel, SQL Server and others. Within the NJ Medical System, NJHC plays an important role in enabling smooth integration of applications at both ends.

Responsibilities:

• Participated in user meetings and gathered business requirements and specifications for the data-warehouse design. Translated the user inputs into ETL design documents.
• Extensively involved in data profiling using Informatica to design the data warehouse.
• Created DDL and DML scripts for all target tables (a sketch follows this section).
• Created Informatica mappings to extract data from Oracle and flat files and load it into the data warehouse.
• Extensively used the Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Joiner, Sorter and Router transformations to create mappings.
• Resolved design issues in transformations and mappings.
• Extensively worked with Informatica Power Center (Source Analyzer, Warehouse Designer, Mapping Designer, mapplets and transformations) to import source and target definitions into the repository and to build mappings.
• Tuned the performance of mappings by requesting new indexes and optimizing the code.
• Worked on Cognos front-end reports using Query Studio, Report Studio and Framework Manager.
• Documented the existing mappings as per the design standards followed in the project.
• Performed unit testing at various stages by checking the data manually.
• Was always available and accessible to team members for understanding the warehouse and for technical issues.

Environment: Informatica Power Center 7.1.1, Cognos 7, Oracle 9i, SQL, PL/SQL, Toad, Windows XP Professional and UNIX.
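
A minimal sketch, in Oracle-style SQL, of the kind of DDL/DML scripts mentioned above for a target table. The table and column names are illustrative assumptions, not taken from the project.

    -- DDL sketch: an illustrative target dimension table (names are assumptions).
    CREATE TABLE PRODUCT_DIM
    (
        PRODUCT_KEY    NUMBER(10)    NOT NULL,   -- surrogate key
        PRODUCT_CD     VARCHAR2(20)  NOT NULL,   -- natural/business key
        PRODUCT_NAME   VARCHAR2(100),
        PRODUCT_LINE   VARCHAR2(50),
        LOAD_DT        DATE          DEFAULT SYSDATE,
        CONSTRAINT PK_PRODUCT_DIM PRIMARY KEY (PRODUCT_KEY)
    );

    -- DML sketch: seed a default "unknown" member used when a lookup finds no match.
    INSERT INTO PRODUCT_DIM (PRODUCT_KEY, PRODUCT_CD, PRODUCT_NAME, PRODUCT_LINE)
    VALUES (-1, 'UNKNOWN', 'Unknown Product', 'Unknown');
    COMMIT;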

Solectron Corporation, CA                                                Feb 2003 – Jan 2004

Quality Information Warehouse (QIW)

Solectron Corporation is the largest electronics manufacturing services company. It provides a full range of manufacturing and integrated supply-chain services to clients such as HP, IBM, Apple, Cisco and Ericsson. Solectron builds electronic systems and subsystems for computers, consumer electronics, telecommunications and chip-manufacturing equipment; its services include product design, prototyping, repair and distribution. The Quality Information Warehouse is designed to analyze the failure of different products and to keep track of repairs and replacements. The QIW extracts data from various source systems on a weekly basis using the ETL tool (Informatica). The scope covers analysis of failures, replacements and repairs of different products and the generation of reports by time, product, customer and region.

Responsibilities:

• Prepared transformation specifications for each mapping.
• Involved in ETL design and implementation.
• Extensively used Informatica to load data from different data sources into the Oracle data warehouse.
• Imported sources and targets to create mappings and developed transformations using the Designer.
• Created mappings and transformations using the Joiner, Expression, Aggregator, Filter, Router, Sorter, Sequence Generator, Update Strategy and Lookup transformations.
• Created sessions and workflows.
• Monitored the workflows using the Workflow Monitor.
• Prepared test cases for unit testing.
• Performed unit testing against the unit test plan (UTP).

Environment: Informatica Power Center 6.1.2, Oracle 8i, SQL, PL/SQL, Toad and UNIX.

Philips, Salt Lake City, US                                              March 2002 – January 2003

Sales BI Warehouse Analysis

The application provides business-intelligence analysis services to the client's sales department on cost-margin performance metrics through interactive client tools. Data from various OLTP sources is selectively extracted, related, transformed and loaded into an Oracle warehouse.

Responsibilities:

• Involved in team meetings to understand the business requirements.
• Involved in the creation of DDL and DML scripts for all stage and target tables.
• Developed the mappings for staging tables using Informatica to load data from different sources.
• Created transformations and performed data cleansing using various features of Informatica.
• Worked with the Informatica Designer to develop mappings using the Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator and Sequence Generator transformations (an equivalent SQL sketch of the update-else-insert target load follows this section).
• Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
• Created reusable transformations and used them in different mappings.
• Created sessions and workflows using the Workflow Manager.
• Performed data validation and unit testing by preparing unit test cases.

Environment: Informatica Power Mart 5.1, Cognos 5, Oracle 8i, SQL, PL/SQL, Toad and UNIX.
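
For illustration, the update-else-insert (Type 1) logic implemented by the Lookup and Update Strategy mappings above can be expressed in plain Oracle SQL as follows. The table and column names are assumptions, not taken from the project.

    -- Illustrative SQL equivalent of a Lookup + Update Strategy (Type 1) target load.
    -- Step 1: overwrite measures on rows whose business key already exists in the target.
    UPDATE SALES_MARGIN_TGT TGT
    SET    (COST_AMT, MARGIN_AMT) =
           (SELECT STG.COST_AMT, STG.MARGIN_AMT
            FROM   SALES_MARGIN_STG STG
            WHERE  STG.ORDER_ID = TGT.ORDER_ID
              AND  STG.LINE_NO  = TGT.LINE_NO)
    WHERE  EXISTS (SELECT 1
                   FROM   SALES_MARGIN_STG STG
                   WHERE  STG.ORDER_ID = TGT.ORDER_ID
                     AND  STG.LINE_NO  = TGT.LINE_NO);

    -- Step 2: insert rows whose business key is not yet present in the target.
    INSERT INTO SALES_MARGIN_TGT (ORDER_ID, LINE_NO, COST_AMT, MARGIN_AMT, LOAD_DT)
    SELECT STG.ORDER_ID, STG.LINE_NO, STG.COST_AMT, STG.MARGIN_AMT, SYSDATE
    FROM   SALES_MARGIN_STG STG
    WHERE  NOT EXISTS (SELECT 1
                       FROM   SALES_MARGIN_TGT TGT
                       WHERE  TGT.ORDER_ID = STG.ORDER_ID
                         AND  TGT.LINE_NO  = STG.LINE_NO);
    COMMIT;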
