    Teradata Utilities

    Breaking the Barriers

    First Edition, October 2002

    Written by Tom Coffing, Michael J. Larkins, Randy Volters,

    Morgan Jones, Steve Wilmes

    Web Page: www.CoffingDW.com

    E-Mail address:

    Tom.Coffing@CoffingDW.Com

    Published by Coffing Publishing

    Teradata, NCR, and BYNET are registered trademarks of NCR Corporation, Dayton, Ohio, U.S.A. IBM and DB2 are registered trademarks of IBM Corporation. ANSI is a registered trademark of the American National Standards Institute. In addition to these product names, all brands and product names in this document are registered names or trademarks of their respective holders.

    Coffing Data Warehousing shall have neither liability nor responsibility to any person or entity with respect to any loss or damages arising from the information contained in this book or from the use of programs or program segments that are included. The manual is not a publication of NCR Corporation, nor was it produced in conjunction with NCR Corporation.

    Copyright 2002 by Coffing Publishing

    All rights reserved. No part of this book shall be reproduced, stored in a retrieval system, or transmitted by any means, electronic, mechanical, photocopying, recording, or otherwise, without written permission from the publisher. No patent liability is assumed with respect to the use of information contained herein. Although every precaution has been taken in the preparation of this book, the publisher and author assume no responsibility for errors or omissions, nor is any liability assumed for damages resulting from the use of information contained herein. For information, address:

    Coffing Publishing

    7810 Kiester Rd.

    Middletown, OH 45042

    International Standard Book Number: ISBN 0-9704980-7-1

    Printed in the United States of America

    All terms mentioned in this book that are known to be trademarks or service marks have been stated. Coffing Publishing cannot attest to the accuracy of this information. Use of a term in this book should not be regarded as affecting the validity of any trademark or service mark.

An Introduction to the Teradata Utilities

    “It’s not the data load that breaks us down,

     it’s the way you carry it.”

     Tom Coffing

    Teradata has been doing data transfers to and from the largest data warehouses in the world for close to two decades. While other databases have allowed the loads to break them down, Teradata has continued to set the standards and break new barriers. The brilliance behind the Teradata load utilities is in their power and flexibility. With five great utilities, Teradata allows you to pick the right utility for the task at hand. This book is dedicated to explaining these utilities in a complete and easy manner. It has been written by five Teradata Certified Masters with experience at over 125 Teradata sites worldwide. Let our experience be your guide.

    The intent of this book is twofold. The first is to help you write and use the various utilities. A large part of the book is taken up with showing the commands and their functionality. In addition, it shows examples that use the various utility commands and SQL in conjunction with each other, which you will come to appreciate.

    The second intention is to help you know which utility to use under a variety of conditions. You will learn that some of the utilities use very large blocks to transfer the data either to or from the Teradata Relational Database Management System (RDBMS). From this perspective, they provide a high degree of efficiency using a communications path of either the mainframe channel or network.

    The other approach to transferring data rows either to or from the Teradata RDBMS is a single row at a time. The following sections provide a high level introduction to the capabilities and considerations for both approaches. You can use this information to help decide which utilities are appropriate for your specific need.

    Considerations for using Block at a Time Utilities

    As mentioned above, there are efficiencies associated with using large blocks of data when transferring data between computers. So, the logic might indicate that it is always the best approach. However, there is never one best approach.

    You will learn that efficiency comes at the price of other database capabilities. For instance, when using large blocks to transfer and incorporate data into Teradata the following are not allowed:

     Secondary indices

     Triggers

     Referential integrity

     More than 15 utility jobs running at the same time

    Therefore, it is important to understand when and where these considerations are present. So, as important as it is to know the language of the utility and database, it is also important to understand when to use the appropriate utility. The capabilities and considerations are covered in conjunction with the commands.

    Considerations for using Row at a Time Utilities

    The opposite of sending a large block of rows at the same time is sending a single row at a time. The primary difference in these approaches is speed. It is always faster to send multiple rows in one operation instead of one row.

    If it is slower, why would anyone ever use this approach?

    The reason is that it provides more flexibility with fewer considerations. By this, we mean that the row at a time utilities allow the following:

     Secondary indices

     Triggers

     Referential integrity

     More than 15 utility jobs running at the same time

    As you can see, they allow all the things that the block utilities do not. With that in mind and for more information, continue reading about the individual utilities and open up a new world of capabilities in working with the Teradata RDBMS. Welcome to the world of the Teradata Utilities.

Chapter 2 BTEQ

    An Introduction to BTEQ

    “It’s not the data load that breaks us down,

     it’s the way you carry it.”

     Tom Coffing

    Why is it called BTEQ?

    Why is BTEQ available on every Teradata system ever built? Because the Batch TEradata Query (BTEQ) tool was the original way that SQL was submitted to Teradata as a means of getting an answer set in a desired format. This is the utility that I used for training at Wal*Mart, AT&T, Anthem Blue Cross and Blue Shield, and SouthWestern Bell back in the early 1990's. BTEQ is often referred to as the Basic TEradata Query, and it is still used today and continues to be an effective tool.

    Here is what is excellent about BTEQ:

     BTEQ can be used to submit SQL in either a batch or interactive environment. Interactive users can submit SQL and receive an answer set on the screen. Users can also submit BTEQ jobs from batch scripts, have error checking and conditional logic, and allow for the work to be done in the background.

     BTEQ outputs a report format, whereas Queryman outputs data in a format more like a spreadsheet. This allows BTEQ a great deal of flexibility in formatting data, creating headings, and utilizing Teradata extensions, such as WITH and WITH BY, that Queryman has problems handling.

     BTEQ is often used to submit SQL, but it is also an excellent tool for importing and exporting data.

     Importing Data: Data can be read from a file on either a mainframe or LAN-attached computer and used for substitution directly into any Teradata SQL using the INSERT, UPDATE or DELETE statements.

     Exporting Data: Data can be written to either a mainframe or LAN-attached computer using a SELECT from Teradata. You can also pick the format you desire, ranging from data files to printed reports to Excel formats.

    There are other utilities that are faster than BTEQ for importing or exporting data. We will talk about these in future chapters, but BTEQ is still used for smaller jobs.
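To make the export side concrete, here is a minimal sketch of a BTEQ export job in the style used later in this chapter. The output file name is hypothetical, and the logon values are the sample credentials used elsewhere in this book; adjust both for your own system.

```
.LOGON cdw/sql00,whynot
.EXPORT REPORT FILE = emp_report.txt
SELECT * FROM Employee_Table
WHERE Dept_No = 400;
.EXPORT RESET
.LOGOFF
```

The .EXPORT REPORT command directs the formatted answer set to the named file, and .EXPORT RESET returns output to the screen before logging off.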

    Logging on to BTEQ

    Before you can use BTEQ, you must have user access rights to the client system and privileges to the Teradata DBS. Normal system access privileges include a userid and a password. Some systems may also require additional user identification codes depending on company standards and operational procedures. Depending on the configuration of your Teradata DBS, you may need to include an account identifier (acctid) and/or a Teradata Director Program Identifier (TDPID).

Using BTEQ to Submit Queries

    Submitting SQL in BTEQ’s Interactive Mode

    Once you logon to Teradata through BTEQ, you are ready to run your queries. Teradata knows the SQL is finished when it finds a semi-colon, so don't forget to put one at the end of your query. Below is an example of a Teradata table used to demonstrate BTEQ operations.

    Employee_Table

    Employee_No  Last_Name    First_Name  Salary    Dept_No
    2000000      Jones        Squiggy     32800.50  ?
    1256349      Harrison     Herbert     54500.00  400
    1333454      Smith        John        48000.00  200
    1121334      Strickling   Cletus      54500.00  400

    Figure 2-1

    BTEQ execution

.LOGON cdw/sql01;                       Type at the command prompt: logon
                                        with the TDPID and USERNAME.

Password: XXXXX                         Then enter the PASSWORD at the
                                        second prompt.

Enter your BTEQ/SQL Request or BTEQ Command.
                                        BTEQ responds and is waiting for
                                        a command.

SELECT * FROM Employee_Table            An SQL statement.
WHERE Dept_No = 400;

*** Query Completed. 2 rows found. 5 Columns returned.
*** Total elapsed time was 1 second.    BTEQ displays information about
                                        the answer set.

1256349  Harrison    Herbert  54500.00  400
1121334  Strickling  Cletus   54500.00  400
                                        The result set.

    Figure 2-2

    Submitting SQL in BTEQ’s Batch Mode

    On network-attached systems, BTEQ can also run in batch mode under the UNIX (IBM AIX, Hewlett-Packard HP-UX, NCR MP-RAS, Sun Solaris), DOS, Macintosh, Microsoft Windows and OS/2 operating systems. To submit a job in batch mode, do the following:

    1. Invoke BTEQ

    2. Type in the input file name

    3. Type in the location and output file name.

    The following example shows how to invoke BTEQ from a DOS command prompt. In order for this to work, the directory Program Files\NCR\Teradata Client\bin must be established in the search path.

C:\> BTEQ < BatchScript.txt > Output.txt
                                        BTEQ is invoked and takes its
                                        instructions from a file called
                                        BatchScript.txt. The output file
                                        is called Output.txt.

    Figure 2-3

    Notice that the BTEQ command is immediately followed by the '<BatchScript.txt' redirection, which supplies the file of commands to execute. Then, the '>Output.txt' names the file where the output messages are written. Here is an example of the contents of the BatchScript.txt file.

    BatchScript.txt File

.LOGON CDW/sql00,whynot                 Logon statement onto Teradata.
                                        Notice the "," before the
                                        password. This is the batch
                                        script format.

SELECT * FROM Employee_Table            The actual SQL.
WHERE Dept_No = 400;

.LOGOFF                                 Logging off of Teradata.

    Figure 2-4

    The above illustration shows how BTEQ can be manually invoked from a command prompt and displays how to specify the name and location of the batch script file to be executed.

    The previous examples show that when logging onto BTEQ in interactive mode, the user actually types in a logon string and then Teradata will prompt for a password. However, in batch mode, Teradata requires both a logon and password to be directly stored as part of the script.

    Since putting this sensitive information into a script raises security concerns, inserting the password directly into a script that is to be processed in batch mode may not be a good idea. It is generally recommended, and a common practice, to store the logon and password in a separate file that can be secured. That way, it is not in the script for anyone to see.

    For example, the contents of a file called "mylogon.txt" might be:

    .LOGON cdw/sql00,whynot

    Then, the script should contain the following command instead of a .LOGON, as shown below and again in the following script:

    .RUN FILE = mylogon.txt

    This command opens and reads the file. It then executes every record in the file.
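Putting the pieces together, the earlier BatchScript.txt can be rewritten so that the password lives only in the secured logon file. A sketch:

```
.RUN FILE = mylogon.txt

SELECT * FROM Employee_Table
WHERE Dept_No = 400;

.LOGOFF
.QUIT
```

The script itself now contains no credentials; mylogon.txt can be protected with operating system file permissions.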

Using BTEQ Conditional Logic

    Below is a BTEQ batch script example. The initial steps of the script establish the logon and the default database, and then delete all the rows from the Employee_Table. If the table does not exist, the BTEQ conditional logic will instruct Teradata to create it. However, if the table already exists, then Teradata will move forward and insert data.

    Note: In script examples, the left panel contains BTEQ base commands and the right panel provides a brief description of each command.

.RUN FILE = mylogon.txt                 Logon to Teradata.

DATABASE SQL_Class;                     Make the default database
                                        SQL_Class.

DELETE FROM Employee_Table;             Delete all the rows from the
                                        Employee_Table.

.IF ERRORCODE = 0 THEN .GOTO INSEMPS    BTEQ conditional logic that
                                        checks whether the delete worked
/* ERRORCODE is a reserved word that    or whether the table even
contains the outcome status for every   existed. If the table did not
SQL statement executed in BTEQ. A       exist, BTEQ will create it. If
zero (0) indicates that the statement   the table does exist, the create
worked. */                              table step is skipped and control
                                        goes directly to INSEMPS.

.LABEL INSEMPS                          The label INSEMPS provides the
                                        point so the BTEQ logic can go
INSERT INTO Employee_Table (1232578,    directly to inserting records
'Chambers', 'Mandee', 48850.00, 100);   into the Employee_Table.

INSERT INTO Employee_Table (1256349,
'Harrison', 'Herbert', 54500.00, 400);
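Branching with .GOTO is one use of ERRORCODE. Another common pattern, shown here as a sketch rather than something taken from the script above, is to abort the whole job when a step fails:

```
.RUN FILE = mylogon.txt

DELETE FROM Employee_Table;

.IF ERRORCODE <> 0 THEN .QUIT 8
/* If the delete failed, stop the job and return a
   code of 8 to the operating system */
```

The numeric value given to .QUIT becomes the return code of the BTEQ step, which a job scheduler can test to decide whether to continue.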
