Informatica®
10.2

Developer Mapping Guide

Informatica Developer Mapping Guide
10.2
September 2017
© Copyright Informatica LLC 2014, 2017

This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.
Informatica, the Informatica logo, PowerCenter, and PowerExchange are trademarks or registered trademarks of Informatica LLC in the United States and many
jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://www.informatica.com/trademarks.html. Other company
and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights
reserved. Copyright © Sun Microsystems. All rights reserved. Copyright © RSA Security Inc. All Rights Reserved. Copyright © Ordinal Technology Corp. All rights
reserved. Copyright © Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright © Meta
Integration Technology, Inc. All rights reserved. Copyright © Intalio. All rights reserved. Copyright © Oracle. All rights reserved. Copyright © Adobe Systems Incorporated.
All rights reserved. Copyright © DataArt, Inc. All rights reserved. Copyright © ComponentSource. All rights reserved. Copyright © Microsoft Corporation. All rights
reserved. Copyright © Rogue Wave Software, Inc. All rights reserved. Copyright © Teradata Corporation. All rights reserved. Copyright © Yahoo! Inc. All rights reserved.
Copyright © Glyph & Cog, LLC. All rights reserved. Copyright © Thinkmap, Inc. All rights reserved. Copyright © Clearpace Software Limited. All rights reserved. Copyright
© Information Builders, Inc. All rights reserved. Copyright © OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo
Communications, Inc. All rights reserved. Copyright © International Organization for Standardization 1986. All rights reserved. Copyright © ej-technologies GmbH. All
rights reserved. Copyright © Jaspersoft Corporation. All rights reserved. Copyright © International Business Machines Corporation. All rights reserved. Copyright ©
yWorks GmbH. All rights reserved. Copyright © Lucent Technologies. All rights reserved. Copyright © University of Toronto. All rights reserved. Copyright © Daniel
Veillard. All rights reserved. Copyright © Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright © MicroQuill Software Publishing, Inc. All rights reserved.
Copyright © PassMark Software Pty Ltd. All rights reserved. Copyright © LogiXML, Inc. All rights reserved. Copyright © 2003-2010 Lorenzi Davide, All rights reserved.
Copyright © Red Hat, Inc. All rights reserved. Copyright © The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright © EMC
Corporation. All rights reserved. Copyright © Flexera Software. All rights reserved. Copyright © Jinfonet Software. All rights reserved. Copyright © Apple Inc. All rights
reserved. Copyright © Telerik Inc. All rights reserved. Copyright © BEA Systems. All rights reserved. Copyright © PDFlib GmbH. All rights reserved. Copyright ©
Orientation in Objects GmbH. All rights reserved. Copyright © Tanuki Software, Ltd. All rights reserved. Copyright © Ricebridge. All rights reserved. Copyright © Sencha,
Inc. All rights reserved. Copyright © Scalable Systems, Inc. All rights reserved. Copyright © jQWidgets. All rights reserved. Copyright © Tableau Software, Inc. All rights
reserved. Copyright© MaxMind, Inc. All Rights Reserved. Copyright © TMate Software s.r.o. All rights reserved. Copyright © MapR Technologies Inc. All rights reserved.
Copyright © Amazon Corporate LLC. All rights reserved. Copyright © Highsoft. All rights reserved. Copyright © Python Software Foundation. All rights reserved.
Copyright © BeOpen.com. All rights reserved. Copyright © CNRI. All rights reserved.
This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various
versions of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or
agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software
copyright © 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License
Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any
kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.
The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California,
Irvine, and Vanderbilt University, Copyright (©) 1993-2006, all rights reserved.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and
redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.
This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg. All Rights Reserved. Permissions and limitations regarding this
software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or
without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
The product includes software copyright 2001-2005 (©) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.dom4j.org/license.html.
The product includes software copyright © 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to
terms available at http://dojotoolkit.org/license.
This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations
regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.
This product includes software copyright © 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at
http://www.gnu.org/software/kawa/Software-License.html.
This product includes OSSP UUID software which is Copyright © 2002 Ralf S. Engelschall, Copyright © 2002 The OSSP Project Copyright © 2002 Cable & Wireless
Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.
This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software
are subject to terms available at http://www.boost.org/LICENSE_1_0.txt.
This product includes software copyright © 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at
http://www.pcre.org/license.txt.
This product includes software copyright © 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.
This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License,
http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html,
http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html,
http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html,
http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/;
http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html;
http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html;
http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html,
http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html;
http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html;
http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt;
http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/;
https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE;
http://jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE;
http://web.mit.edu/Kerberos/krb5-current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE;
https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE;
http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html;
https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/;
https://github.com/twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/LICENSE.txt;
https://github.com/documentcloud/underscore-contrib/blob/master/LICENSE, and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and
Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary
Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License
(http://opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License
(http://www.opensource.org/licenses/artistic-license-1.0) and the Initial Developer’s Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright © 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
See patents at https://www.informatica.com/legal/patents.html.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation
is subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES
OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH
OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
The information in this documentation is subject to change without notice. If you find any problems in this documentation, please report them to us in writing at
Informatica LLC 2100 Seaport Blvd. Redwood City, CA 94063.
Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.
Publication Date: 2017-09-08

Table of Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Informatica Network. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Informatica Knowledge Base. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Informatica Product Availability Matrixes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Informatica Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Informatica Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Informatica Global Customer Support. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

Chapter 1: Introduction to Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Mappings Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Mapping Components. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Data Object Operations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Segments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Copying a Segment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Mapping Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Connection Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Expression Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Object Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Mapping Run-time Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Validation Environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Execution Environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Reject File Directory. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Maximum Parallelism. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Target Commit Interval. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Stop on Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Mapping Impersonation User Name. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Suggested Parallelism. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Hadoop Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Target Load Order Constraints. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Constraints with Insert and Delete Rows. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Target Load Order Rules and Guidelines. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Target Load Order Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
How to Develop a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Creating a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Adding Objects to a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

Connecting Mapping Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Creating Target Load Order Constraints. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Validating a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Running a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

Chapter 2: Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Mapplets Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Mapplet Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Mapplet Input and Output. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Mapplet Input. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Mapplet Output. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Generated Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Generated Mapplet Rules and Guidelines. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Generating a Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Rule Specifications and Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Rule Specification Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Creating a Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Mapplet Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Validating a Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Mapplet as a Rule Validation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

Chapter 3: Mapping Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Mapping Parameters Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
System Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
User-Defined Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Date/Time Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Where to Create User-Defined Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Where to Assign Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Parameters in Expressions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Parameters in SQL Statements. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Parameters in Custom Queries for Hive Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Parameters for Relational Table Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Parameters for Fields and Property Values. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Parameters for Port Lists. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Parameters in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Parameter Instance Value. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Parameters in Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Parameter Instance Values in Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Mapplet Parameters in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Parameters in Mapplets Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Parameters in Logical Data Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Parameters in Virtual Table Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Parameter Sets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

Run Mappings with Parameter Sets from infacmd. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Parameter Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Parameter File Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Project Element. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Application Element. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Rules and Guidelines for Parameter Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Sample Parameter File. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Export a Parameter File. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Creating a Parameter File from infacmd ms ListMappingParams . . . . . . . . . . . . . . . . . . . . 66
Running a Mapping with a Parameter File. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
How to Configure Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Creating a Parameter for a Transformation Property. . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Creating a Parameter in an Expression. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Expose Transformation Parameters as Mapping Parameters. . . . . . . . . . . . . . . . . . . . . . . 72
Setting the Parameter Instance Value. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Creating a Parameter Set. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

Chapter 4: Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Mapping Outputs Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
User-Defined Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Outputs View. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Mapping Output Expression. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
System-Defined Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Persisted Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
Persisted Values Maintenance. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Persisted Mapping Outputs and Deployment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Bind Mapping Outputs to Workflow Variables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Mapping Outputs In Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Bind Mapplet Outputs to Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
Mapping Outputs in Logical Data Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
How to Configure Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Creating a Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Defining Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Configuring the Mapping Output Expression. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Persisting Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Assigning Persisted Outputs to Mapping Task Input. . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Binding Mapping Outputs to Workflow Variables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
How to Bind Mapplet Outputs to Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Defining Mapplet Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Configuring a Mapping Output Expression in a Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . 99
Binding Outputs from a Mapplet to Mapping Outputs. . . . . . . . . . . . . . . . . . . . . . . . . . . 100

Chapter 5: Generate a Mapping from an SQL Query. . . . . . . . . . . . . . . . . . . . . . . . . 102
Generate a Mapping from an SQL Query Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
Example of Generated Mapping from an SQL Query. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
SQL Syntax to Generate a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Correlated Subqueries. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Function Support in Queries that Generate a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Generate a Mapping from an SQL Query with an Unsupported Function. . . . . . . . . . . . . . . 104
INSERT, UPDATE and DELETE Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Rules and Guidelines for INSERT, UPDATE, and DELETE Statements. . . . . . . . . . . . . . . . . 105
Generating a Mapping or Logical Data Object from an SQL Query. . . . . . . . . . . . . . . . . . . . . . 106
Generate a Mapping from an SQL Statement. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
Create an SQL Statement. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Paste or Import the SQL Statement to the Developer Tool. . . . . . . . . . . . . . . . . . . . . . . . 107
Complete Mapping Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108

Chapter 6: Dynamic Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Dynamic Mappings Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Dynamic Mapping Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
Dynamic Data Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
Dynamic Mapping Ports and Links. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
Dynamic Mapping Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
Parameters in Dynamic Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
Dynamic Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
Get Columns from the Data Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
Assign a Parameter to a Flat File Name. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
Assign a Parameter to Relational Source Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
Assign a Parameter to the Source Data Object. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
Dynamic Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
Get Columns from the Data Source . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
Define Targets Based on the Mapping Flow. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
Define Targets Based on the Data Object. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
Create or Replace the Target at Run Time. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
Assign a Parameter to Relational Target Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Assign a Parameter to the Target Data Object. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Rules and Guidelines for Dynamic Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Dynamic Ports and Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Dynamic and Generated Port Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Rules and Guidelines for Dynamic and Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . 123
Dynamic Expressions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Input Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Input Rule Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Include or Exclude Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126

Include All Remaining Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
Rename Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Reorder Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
Selection Rules and Port Selectors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
Port Selector Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
Selection Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
Example - Selection Rules and Port Selectors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Design-time Links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Link Resolution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Run-time Links. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Run-time Link Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Example - Run-time Links. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
Troubleshooting Dynamic Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140

Chapter 7: How to Develop and Run a Dynamic Mapping. . . . . . . . . . . . . . . . . . . . 142
Developing and Running Dynamic Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
Configuring a Dynamic Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Using a Parameter as a Source for a Dynamic Mapping. . . . . . . . . . . . . . . . . . . . . . . . . 144
Configuring Sources to Get Metadata Changes at Run Time. . . . . . . . . . . . . . . . . . . . . . . 144
Creating a Dynamic Port. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Configuring Dynamic Ports Using Input Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
Step 1. Open the Input Rules Dialog box. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Step 2. Define Input Rules. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Step 2a. Choose the Operator and Selection Criteria. . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Step 2b. Configure the Name Selection Criteria Details. . . . . . . . . . . . . . . . . . . . . . . . . . 148
Step 2c. Configure the Type Selection Criteria Details. . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Step 2d. Configure the Pattern Selection Criteria Details. . . . . . . . . . . . . . . . . . . . . . . . . 149
Step 3. Rename the Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Step 4. Reorder the Generated Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Step 5. Verify the Dynamic Port Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Creating a Port Selector. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Creating a Dynamic Expression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Configuring a Dynamic Target. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
Using a Parameter as a Target for a Dynamic Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . 154
Getting Target Object Columns from the Data Source at Run-Time. . . . . . . . . . . . . . . . . . 155
Defining a DDL Query to Create or Replace the Target at Run Time. . . . . . . . . . . . . . . . . . 155
Defining Write Transformation Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Creating and Configuring a Run-time Link. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
Validating and Running a Dynamic Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160

Chapter 8: Dynamic Mapping Use Cases. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
Use Case: Dynamic Mapping for Metadata Changes in Relational Sources. . . . . . . . . . . . . . . . 161
Source Tables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161

Target Table. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
Dynamic Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
Step 1. Configure the Read Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Step 2. Configure the Joiner Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
Step 3. Configure the Aggregator Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
Step 4. Configure the Write Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
Step 5. Create and Configure a Run-time Link. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Step 6. Validate and Run the Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
Step 7. Run the Mapping after Changes to the Source Schema. . . . . . . . . . . . . . . . . . . . . 170
Use Case: Reuse Dynamic Mapping for Different Sources and Targets. . . . . . . . . . . . . . . . . . . 172
Source Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
Target Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
Dynamic Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
Step 1. Configure the Read_Customer_FF Read Transformation. . . . . . . . . . . . . . . . . . . . 175
Step 2. Configure the Exp_TRIM Expression Transformation. . . . . . . . . . . . . . . . . . . . . . 176
Step 3. Configure the Exp_Output Expression Transformation. . . . . . . . . . . . . . . . . . . . . 180
Step 4. Configure the Write_customerTrim_FF Write Transformation. . . . . . . . . . . . . . . . . 182
Step 5. Validate and Save the Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
Step 6. Run the Dynamic Mapping Against Different Sources and Targets. . . . . . . . . . . . . . 184

Chapter 9: Mapping Administration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
Mapping Administration Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
Viewing Properties for a Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188
Viewing Summary Statistics for a Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188
Viewing Detailed Statistics for a Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188
Viewing Logs for a Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Reissuing a Deployed Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
Canceling a Mapping Job. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Reject Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Location of Reject Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Content of Reject Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190

Chapter 10: Export to PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
Export to PowerCenter Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
PowerCenter Release Compatibility. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
Setting the Compatibility Level. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
Mapplet Export. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
Mappings with Parameters Export. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
Export to PowerCenter Options. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
Exporting an Object to PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Export Restrictions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
Rules and Guidelines for Exporting to PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
Troubleshooting Exporting to PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199

Chapter 11: Import From PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Import from PowerCenter Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Override Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
Conflict Resolution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
Import Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Data Type Conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Transformation Conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Transformation Property Restrictions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Parameter Conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
System Parameter Conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
PowerCenter Repository Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
Connection Assignments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
Importing an Object from PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
Import Restrictions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 214
Import Performance. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216

Chapter 12: Performance Tuning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
Performance Tuning Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
Optimization Methods. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
Early Projection Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
Early Selection Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
Branch Pruning Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
Predicate Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
Cost-Based Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220
Dataship-Join Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220
Semi-Join Optimization Method. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
Viewing an Optimized Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 222
Optimizer Levels. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 222
Setting the Optimizer Level for a Developer Tool Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . 223
Setting the Optimizer Level for a Deployed Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224

Chapter 13: Pushdown Optimization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
Pushdown Optimization Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
Pushdown Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
Full Pushdown Optimization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
Source Pushdown. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
Configuring Pushdown. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
Transformation Pushdown Logic. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
Pushdown Optimization to Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
Pushdown Optimization to Relational Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
Pushdown Optimization to Native Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
Pushdown Optimization to PowerExchange Nonrelational Sources. . . . . . . . . . . . . . . . . . 231

Pushdown Optimization to ODBC Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
Pushdown Optimization to SAP Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
Pushdown Optimization Expressions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
Operators. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 243
Comparing the Output of the Data Integration Service and Sources. . . . . . . . . . . . . . . . . . . . . 244

Chapter 14: Partitioned Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
Partitioned Mappings Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
One Thread for Each Pipeline Stage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
Multiple Threads for Each Pipeline Stage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 248
Partitioned Flat File Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
Concurrent Read Partitioning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
Partitioned Relational Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
Relational Connection Types for Partitioning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
SQL Queries for Partitioned Relational Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
Rules and Guidelines for Relational Source Partitions. . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Partitioned Flat File Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Optimize Output File Directories for Partitioned File Targets. . . . . . . . . . . . . . . . . . . . . . 254
Merge Options for Partitioned File Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
Commands for Partitioned File Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
Partitioned Relational Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
Relational Connection Types for Partitioning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
Rules and Guidelines for Relational Target Partitions. . . . . . . . . . . . . . . . . . . . . . . . . . . 258
Partitioned Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
Restrictions for Partitioned Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259
Cache Partitioning for Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259
Disable Partitioning for a Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
Maintain Order in a Partitioned Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
Maintain a Stable Sort. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
Override the Maximum Parallelism for a Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
Suggested Parallelism for a Transformation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
Execution Instances for Address Validator and Match Transformations. . . . . . . . . . . . . . . 264
Overriding the Maximum Parallelism Value. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
Troubleshooting Partitioned Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266

Chapter 15: Developer Tool Naming Conventions. . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Transformation Naming Conventions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Object Type Naming Conventions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
Workflow Object Naming Conventions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269

Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 271


Preface
The Informatica Developer Mapping Guide contains information about how to create and use mappings and
mapplets, configure dynamic mappings, export and import objects, and enable performance tuning and
pushdown optimization. The Informatica Developer Mapping Guide is written for developers who develop
mappings and mapplets, such as big data, data services, and data transformation developers. This guide
assumes that you have an understanding of flat file and relational database concepts and the database
engines in your environment.

Informatica Resources
Informatica Network
Informatica Network hosts Informatica Global Customer Support, the Informatica Knowledge Base, and other
product resources. To access Informatica Network, visit https://network.informatica.com.
As a member, you can:
• Access all of your Informatica resources in one place.
• Search the Knowledge Base for product resources, including documentation, FAQs, and best practices.
• View product availability information.
• Review your support cases.
• Find your local Informatica User Group Network and collaborate with your peers.

Informatica Knowledge Base
Use the Informatica Knowledge Base to search Informatica Network for product resources such as
documentation, how-to articles, best practices, and PAMs.
To access the Knowledge Base, visit https://kb.informatica.com. If you have questions, comments, or ideas
about the Knowledge Base, contact the Informatica Knowledge Base team at
KB_Feedback@informatica.com.

Informatica Documentation
To get the latest documentation for your product, browse the Informatica Knowledge Base at
https://kb.informatica.com/_layouts/ProductDocumentation/Page/ProductDocumentSearch.aspx.
If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation
team through email at infa_documentation@informatica.com.

Informatica Product Availability Matrixes
Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types
of data sources and targets that a product release supports. If you are an Informatica Network member, you
can access PAMs at
https://network.informatica.com/community/informatica-network/product-availability-matrices.

Informatica Velocity
Informatica Velocity is a collection of tips and best practices developed by Informatica Professional
Services. Developed from the real-world experience of hundreds of data management projects, Informatica
Velocity represents the collective knowledge of our consultants who have worked with organizations from
around the world to plan, develop, deploy, and maintain successful data management solutions.
If you are an Informatica Network member, you can access Informatica Velocity resources at
http://velocity.informatica.com.
If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional
Services at ips@informatica.com.

Informatica Marketplace
The Informatica Marketplace is a forum where you can find solutions that augment, extend, or enhance your
Informatica implementations. By leveraging any of the hundreds of solutions from Informatica developers
and partners, you can improve your productivity and speed up time to implementation on your projects. You
can access Informatica Marketplace at https://marketplace.informatica.com.

Informatica Global Customer Support
You can contact a Global Support Center by telephone or through Online Support on Informatica Network.
To find your local Informatica Global Customer Support telephone number, visit the Informatica website at
the following link:
http://www.informatica.com/us/services-and-training/support-services/global-support-centers.
If you are an Informatica Network member, you can use Online Support at http://network.informatica.com.


Chapter 1

Introduction to Mappings
This chapter includes the following topics:
• Mappings Overview, 14
• Mapping Components, 15
• Views, 21
• Mapping Validation, 22
• Mapping Run-time Properties, 23
• Target Load Order Constraints, 27
• How to Develop a Mapping, 30

Mappings Overview
A mapping is a set of input and output objects that represent the data flow between sources and targets. The objects are linked by transformations that define the rules for data transformation. The Data Integration Service uses the instructions configured in the mapping to read, transform, and write data.
You can run a mapping from a workflow so that you can run multiple mappings sequentially. Or, you can
develop a workflow that runs commands to perform steps before and after a mapping runs. You can include
a mapping with physical data objects as the input and output in a mapping task in a workflow.
The type of input and output object you include in a mapping determines the type of mapping. You can create
the following types of mappings in the Developer tool:
Logical data object mapping
Links a logical data object to one or more physical data objects. A logical data object mapping helps you
to integrate data from multiple sources and formats into a standardized view.
Operation mapping
Has an operation as the mapping input, output, or both. An operation mapping performs the web service
operation for the web service client.
Virtual table mapping
Has a virtual table as the mapping output. A virtual table mapping defines the virtual data flow between
sources and a virtual table in an SQL data service. Use a virtual table mapping to transform the data.


Dynamic mapping
A mapping in which you can change the sources, targets, and transformation logic at run time based on
parameters and rules that you define. Use a dynamic mapping to manage frequent schema or metadata
changes.
The following image shows an example of a mapping:

Mapping Components
Mapping components determine the data flow between sources and targets.
Every mapping must contain an input object, which reads data from a source such as a file or data object. Every mapping must also contain an output object, which writes data to a target such as a file or data object.
A mapping can also contain the following components:
Data object operations
Repository objects that contain properties required to perform certain run-time operations on sources or
targets. Required for some PowerExchange adapter data sources.
Transformations
Modify data before writing it to targets. Use different transformation objects to perform different
functions.
Mapplets
Reusable objects containing a set of transformations that you can use in multiple mappings.
Segments
Consist of one or more objects in a mapping, mapplet, rule, or virtual stored procedure.

Data Object Operations
A data object operation is a repository object that contains properties required to perform certain run-time operations on sources or targets. Some PowerExchange adapter data sources have a complex structure that does not allow the Developer tool to import all properties that are necessary at mapping run time.
For example, you import a PowerExchange Microsoft Dynamics CRM source, but it does not import with precision and scale. In addition, you might need to perform specific run-time operations on Microsoft Dynamics CRM, or you might want to control the maximum number of records that the Data Integration Service can read in one batch. You can configure these properties in the data object read operation.
When you import data from a PowerExchange adapter data source, the data stored in the source's complex
structure might also not be compatible with transformations in the Developer tool. You can use a data object
read operation to convert the data types native to the source into transformation data types that the
Developer tool can use in the mapping workflow.
When you write data to a target in a PowerExchange adapter data source with a complex structure, you might
need to similarly use a data object write operation to convert data from the transformation data types back to
the data types native to the data source.
If you import a physical data object from a resource, you can create data object operations for the physical
data object based on the same resource. The resource is the part of the data object from which you want to
read data. For example, the resource for a database can be a table or a view.
After you create the physical data object and the required data object operations, you can view the physical
data object in the object editor. The object editor view includes a tab where you can configure the data object
operation properties. A data object can have multiple read and write operations.

Data Object Read Operation
A data object read operation is associated with a source data object. You can create a data object read
operation and configure the data object read operation properties in the object editor.
When you view the data object read operation in the object editor, the object editor displays a source object
and an output object. The source and output objects comprise the data object read operation. You edit the
advanced and run-time properties in the output object.


The following image shows an example of a data object read operation for a Teradata data object where the
source object is EMPLOYEE and the output object is Output:

After you create the data object read operation for the physical data object, you can create a Read
transformation to add the physical data object as a source in the mapping workflow. When you add the
physical data object to the mapping, you can specify the Read transformation and the data object read
operation that you want to use.
The following image shows the wizard that appears when you add the Teradata data object to the mapping:


The Read transformation uses the properties that you configured in the output object of the data object read
operation.

Data Object Write Operation
A data object write operation is associated with a target data object. You can create a data object write
operation and configure the data object write operation properties in the object editor.
When you view the data object write operation in the object editor, the object editor displays an input object
and a target object. The input and target objects comprise the data object write operation. You edit the
advanced and run-time properties in the input object.
The following image shows an example of a data object write operation for a Teradata data object where the
input object is Input and the target object is EMPLOYEE_TGT:

After you create the data object write operation for the physical data object, you can create a Write
transformation to add the physical data object as a target in the mapping workflow. When you add the
physical data object to the mapping, you can specify the Write transformation and the data object write
operation that you want to use.


The following image shows the wizard that appears when you add the Teradata data object to the mapping:

The Write transformation uses the properties that you configured in the input object of the data object write
operation.

Teradata Data Object Example
A Teradata data object represents metadata based on a Teradata resource.
When you configure a data object read operation for the Teradata data object and view it in the object editor,
you can see a source object and an output object. The output object displays the metadata from the Teradata
resource.
The following image shows a data object read operation for the Teradata data object:

The source object is tpt_src and the output object is Output. The source and output objects comprise the
data object read operation.
You can look at the metadata in the output object and see that the native data types in tpt_src are converted
into transformation data types in the output object. For example, the native data type VARCHAR for the name
port is converted to the transformation data type string.


To use the Teradata data object in a mapping, you can create a Read transformation. When you add the Read
transformation to the mapping, you select the Teradata data object and the data object read operation that
you previously configured.
Because the Read transformation needs to access the transformation data types stored in the output object of the data object read operation, the Read transformation is created based on the data object read operation, not the Teradata data object. The Read transformation uses the same properties that you configured for the data object read operation.
The following image highlights the Read transformation tpt_read in the mapping editor:

Notice that the metadata in the Read transformation and the output object of the data object read operation
are the same. For example, the metadata in the name port uses the transformation data type string instead of
the native data type VARCHAR.

Transformations
A transformation is an object that generates, modifies, or passes data.
Informatica Developer provides a set of transformations that perform specific functions. For example, an
Aggregator transformation performs calculations on groups of data. Transformations in a mapping represent
the operations that the Data Integration Service performs on the data. Data passes through transformation
ports that you link in a mapping or mapplet.
Transformations can be active or passive. Transformations can be connected to the data flow, or they can be
unconnected. For more information about transformations, see the Developer Transformation Guide.

Mapplets
A mapplet is a reusable object containing a set of transformations that you can use in multiple mappings.
When you use a mapplet in a mapping, you use an instance of the mapplet. Any change made to the mapplet
is inherited by all instances of the mapplet. Mapplets can contain other mapplets. You can also use a
mapplet more than one time in a mapping or mapplet. You can create a mapplet manually. You can also
generate a mapplet from a segment within a mapping or mapplet.
For more information about mapplets, see Chapter 2, “Mapplets” on page 35.


Segments
A segment consists of one or more objects in a mapping, mapplet, rule, or virtual stored procedure. A
segment can include a source, target, transformation, or mapplet.
You can copy segments across folders or projects. Consider the following rules and guidelines when copying
segments:
• The Developer tool reuses dependencies when possible. If it cannot reuse the dependencies, it copies dependencies.
• If a mapping, mapplet, rule, or virtual stored procedure includes parameters and you copy a transformation that refers to the parameter, the transformation in the target object uses a default value for the parameter.
• You cannot copy input transformations and output transformations.
• After you paste a segment, you cannot undo previous actions.

You can also generate a mapplet from a segment in a mapping or mapplet. You might want to generate a
mapplet when a mapping or mapplet contains a connected transformation flow that you want to reuse. For
more information about generating mapplets, see “Generating a Mapplet” on page 37.

Copying a Segment
You can copy a segment when you want to reuse a portion of the mapping logic in another mapping, mapplet,
rule, or virtual stored procedure.
1. Open the object that contains the segment that you want to copy.
2. Select a segment by highlighting each object you want to copy. Hold down the Ctrl key to select multiple objects. You can also select segments by dragging the pointer in a rectangle around objects in the editor.
3. Click Edit > Copy to copy the segment to the clipboard.
4. Open a target mapping, mapplet, rule, or virtual stored procedure.
5. Click Edit > Paste.

Views
Different views become available when you click inside the editor. A view is a workbench part that can
navigate an information hierarchy or display object properties. You can switch between these views to modify
properties or other details in the editor. You can also use these views to select objects that you want to
appear in the editor.
The following image shows the different views in Informatica Developer:


Switch between the following views to perform different tasks:
Properties
Configure general mapping properties, including mapping name, run-time properties, and load order
constraints.
Data Viewer
Preview the data and view the mapping output for each transformation. You can also export data in the
Data Viewer view.
Tags
Create a tag to add metadata, assign a tag to an object, and view all tags assigned to an object.
Checked Out Objects
View objects that you have checked out.
Notifications
Set up and configure global settings for scorecard notifications. You can also select recipients in the
Informatica domain to receive notifications during a workflow.

Mapping Validation
When you develop a mapping, you must configure it so that the Data Integration Service can read and
process the entire mapping. The Developer tool marks a mapping as not valid when it detects errors that will
prevent the Data Integration Service from running the mapping.
The Developer tool performs the following types of validation:

• Connection
• Expression
• Object

Connection Validation
The Developer tool performs connection validation each time you connect ports in a mapping and each time
you validate a mapping.
When you connect ports, the Developer tool verifies that you make valid connections. When you validate a
mapping, the Developer tool verifies that the connections are valid and that all required ports are connected.
The Developer tool performs the following connection validations:
• At least one input object and one output object are connected.
• At least one mapplet input port and output port are connected to the mapping.
• Data types between ports are compatible. If you change a port data type to one that is incompatible with the port it is connected to, the Developer tool generates an error and invalidates the mapping. However, you can change the data type if it remains compatible with the connected ports, such as Char and Varchar.

Expression Validation
You can validate an expression in a transformation while you are developing a mapping. If you do not correct
the errors, error messages appear in the Validation Log view when you validate the mapping.
If you delete input ports used in an expression, the Developer tool marks the mapping as not valid.

Object Validation
When you validate a mapping, the Developer tool verifies that the definitions of the independent objects, such
as Input transformations or mapplets, match the instance in the mapping.
If any object changes while you configure the mapping, the mapping might contain errors. If any object
changes while you are not configuring the mapping, the Developer tool tracks the effects of these changes on
the mappings.

Mapping Run-time Properties
The mapping run-time properties depend on the execution environment that you select for the mapping.
Configure the following mapping run-time properties:
• Validation Environment
• Execution Environment
• Reject File Directory
• Maximum Parallelism
• Target Commit Interval
• Stop on Errors
• Mapping Impersonation User Name
• Suggested Parallelism
• Hadoop Connection

Validation Environment
The validation environment indicates whether the Developer tool validates the mapping definition for the
native execution environment, the Hadoop execution environment, or both. When you run a mapping in the
native environment, the Data Integration Service processes the mapping.
Based on your license, you can run a mapping in the Hadoop environment. When you run a mapping in the
Hadoop environment, the Data Integration Service pushes the mapping execution to the Hadoop cluster
through a Hadoop connection. The Hadoop cluster processes the mapping.
When you choose the Hadoop execution environment, you can select the Hive, Blaze, or Spark engine to process the mapping. You can also select the Hive version. Select a version number from the list or assign a parameter to the Hive version property. Before you can assign a parameter to the Hive version property, you must define the parameter in the Parameters view. The parameter must be a string that contains a version from the Hive Version list.
The following image shows the validation environment:

Choose both validation environments if you want to test the mapping in the native environment before you
run the mapping in the Hadoop environment. Or, choose both validation environments if you want to define
the execution environment value in a parameter when you run the mapping.
If you choose both environments, you must choose the execution environment for the mapping in the run-time properties.
Default is native.

Execution Environment
Select the execution environment to use when the mapping runs. When you run a mapping in the native
environment, the Data Integration Service processes the mapping. If you installed Big Data Management™,
you can run a mapping in the Hadoop environment. The Data Integration Service pushes the processing to
nodes on a Hadoop cluster. When you select the Hadoop environment, you can also select the engine to push
the mapping logic to the Hadoop cluster.


You can use a mapping parameter to indicate the execution environment. Configure a string parameter. Set
the default value to "Native." When you select the execution environment, click Assign Parameter, and select
the parameter that you configured.
The following image shows where to select the mapping execution environment:

When you choose the execution environment, the Developer tool saves one of the associated validation
environments for the mapping run.

Reject File Directory
If you run mappings in the Hadoop environment, you can choose where to store the reject files if the Hadoop
connection is configured with a reject file directory. The Blaze engine can write reject files to the Hadoop
environment for flat file, HDFS, and Hive targets. The Spark and Hive engines can write reject files to the
Hadoop environment for flat file and HDFS targets.
You can write reject files to the Data Integration Service machine or to the Hadoop cluster. Or, you can defer
to the Hadoop connection configuration.
Choose one of the following options:
• On the Data Integration Service machine. The Data Integration Service stores the reject files based on the RejectDir system parameter.
• On the Hadoop Cluster. The reject files are moved to the reject directory configured in the Hadoop connection. If the directory is not configured, the mapping will fail.
• Defer to the Hadoop Connection. The reject files are moved based on whether the reject directory is enabled in the Hadoop connection properties. If the reject directory is enabled, the reject files are moved to the reject directory configured in the Hadoop connection. Otherwise, the Data Integration Service stores the reject files based on the RejectDir system parameter.

If you configure the mapping run-time properties to defer to the Hadoop connection, the reject files for all
mappings with this configuration are moved based on whether you choose to write reject files to Hadoop for
the active Hadoop connection. You do not need to change the mapping run-time properties manually to
change the reject file directory.
For example, if the reject files are currently moved to the Data Integration Service machine and you want to
move them to the directory configured in the Hadoop connection, edit the Hadoop connection properties to
write reject files to Hadoop. The reject files of all mappings that are configured to defer to the Hadoop
connection are moved to the configured directory.
You might also want to choose to defer to the Hadoop connection when the connection is parameterized to
alternate between multiple Hadoop connections. For example, the parameter might alternate between one
Hadoop connection that is configured to move reject files to the Data Integration Service machine and
another Hadoop connection that is configured to move reject files to the directory configured in the Hadoop
connection. If you choose to defer to the Hadoop connection, the reject files are moved depending on the
active Hadoop connection in the connection parameter.


Maximum Parallelism
Maximum parallelism is valid for the native execution environment. Maximum parallelism refers to the
maximum number of parallel threads that process a single mapping pipeline stage. An administrator sets
maximum parallelism for the Data Integration Service to a value greater than one to enable mapping
partitioning. The administrator sets the maximum parallelism in the Administrator tool.
The default maximum parallelism value for a mapping is Auto. Each mapping uses the maximum parallelism
value defined for the Data Integration Service. You can change the default maximum parallelism value to
define a maximum value for a particular mapping. When maximum parallelism is set to different integer
values for the Data Integration Service and the mapping, the Data Integration Service uses the minimum
value.
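For example, if an administrator sets maximum parallelism to 4 for the Data Integration Service and you set maximum parallelism to 8 for the mapping, the Data Integration Service uses 4, the lower of the two values, as the effective maximum parallelism. The values in this example are illustrative only.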
Default is Auto. Maximum is 64.
For more information about partitioning, see Chapter 14, “Partitioned Mappings” on page 246.

Target Commit Interval
The target commit interval refers to the number of rows that you want to use as a basis for a commit. The
Data Integration Service commits data based on the number of target rows that it processes and the
constraints on the target table. The Data Integration Service tunes the commit intervals. The default commit
interval is 10,000 rows.
The commit interval is an approximate interval for the Data Integration Service to issue the commit. The Data Integration Service might issue a commit before, on, or after the commit interval. In general, the Data Integration Service checks the target commit interval after writing a complete writer buffer block.
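For example, assume the default commit interval of 10,000 rows and a writer buffer block that holds 7,500 rows, a hypothetical block size used only for illustration. Because the Data Integration Service checks the interval after each complete block, no commit occurs at row 7,500, and the first commit occurs after the second block, at row 15,000.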

Stop on Errors
This function stops the mapping if a nonfatal error occurs in the reader, writer, or transformation threads.
Default is disabled.
The following types of errors cause the mapping to stop when you enable Stop on Errors:
Reader errors
Errors encountered by the Data Integration Service while reading the source database or the source files.
Reader errors can include alignment errors while running a session in Unicode mode.
Writer errors
Errors encountered by the Data Integration Service while writing to the target database or to the target
files. Writer errors can include key constraint violations, loading nulls into a not null field, and database
trigger responses.
Transformation errors
Errors encountered by the Data Integration Service while transforming data. Transformation errors can
include conversion errors and any condition set up as an ERROR, such as null input.

Mapping Impersonation User Name
A mapping impersonation user name is valid for the native and Hadoop execution environment. Use mapping
impersonation to impersonate the Data Integration Service user that connects to Hive, HBase, or HDFS
sources and targets that use Kerberos authentication.
Enter a user name in the following format:
<Hadoop service name>/<Host name>@<YOUR-REALM>


Where:
• Hadoop service name is the name of the Hadoop service on which the Hive, HBase, or HDFS source or target resides.
• Host name is the name or IP address of the Hadoop service.
• YOUR-REALM is the Kerberos realm.
You can only use the following special characters as delimiters: '/' and '@'.
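For example, a hypothetical impersonation user name might look like the following, where hiveservice, host01.example.com, and EXAMPLE.COM stand in for your own Hadoop service name, host, and Kerberos realm:
hiveservice/host01.example.com@EXAMPLE.COM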

Suggested Parallelism
Suggested parallelism is valid for the native execution environment when the Maximum Parallelism property is set to a value greater than one or to a parameter. The suggested parallelism value is the suggested number of parallel threads that process the transformation pipeline stage.
When you define a suggested parallelism value for a transformation, the Data Integration Service considers
the value when it determines the optimal number of threads for that transformation pipeline stage. You might
want to define a suggested parallelism value to optimize performance for a transformation that contains
many ports or performs complicated calculations.
Default is Auto, which means that the transformation uses the maximum parallelism value defined for the
mapping. Maximum is 64.

Hadoop Connection
A Hadoop connection is valid for the Hadoop execution environment. A Hadoop connection defines the
connection information that the Data Integration Service requires to push the mapping execution to the
Hadoop cluster.
Select the Hadoop connection to run the mapping on the Hadoop cluster. You can assign a user-defined
parameter for the Hadoop Connection. Define the parameter on the Parameters view of the mapping.

Target Load Order Constraints
A target load order constraint restricts how the Data Integration Service loads and commits rows to two
target instances related to each other in the same mapping.
In the Developer tool, you can configure constraints to restrict the order in which the Data Integration Service loads rows to target tables.
You can configure a constraint to force the Data Integration Service to load the data of a primary target
instance completely before loading data to a secondary target instance. The tables that you define as the
primary target and secondary target depend on the transactions in the input rows.
Consider the following scenarios for target load order constraints:
Insert rows to a master and a detail target.
You might configure a target load order constraint when you are inserting rows to targets that have a
primary key-foreign key relationship. Configure the target with the primary key as the primary target
instance. Configure the target with the foreign key as the secondary target instance. The Data
Integration Service can stage the data for the secondary target until it completes loading the primary
target.


Delete rows from a master and a detail target.
When you need to delete rows from targets with a primary key-foreign key relationship, you configure a
different constraint. Configure the target with the foreign key as the primary target instance to delete the
rows from the detail target first. Configure the target with the primary key as the secondary target
instance.
Insert rows and update rows to the same relational table.
You can configure a target load order constraint for a mapping that loads insert rows and update rows to
a relational table from two separate transformations. Configure the constraint to restrict the Data
Integration Service from loading the update rows until after it loads the insert rows.
Target load order for flat files.
You can configure a target load order constraint for a mapping that loads rows into multiple flat file
targets. Configure the target load order to load the secondary flat file after the primary flat file.
You can configure multiple constraints in a mapping. The Data Integration Service determines the most
efficient execution plan to load the targets without violating the constraints.

Constraints with Insert and Delete Rows
Target load order constraints do not have special handling to process insert, update, and delete rows in the
same file.
When you need to process insert, update, and delete rows, you can configure a Router transformation to
return the insert and update rows to a different target instance than the delete rows. Configure target load
order constraints to specify the order in which to load the targets.
For example, you might have an Order_Header and an Order_Detail target. The Order_Detail table has an
OrderID foreign key to the Order_Header table. You need to process inserts, updates, and deletes in both
tables.
You can separate the insert and update rows from the delete rows using a Router transformation. You
configure the following output groups from the Router transformation:
1. Order_Header insert and update rows
2. Order_Header delete rows
3. Order_Detail insert and update rows
4. Order_Detail delete rows
You might create the following constraints for loading these rows to the targets:
Group #4 before group #2
Group #2 before group #1
Group #1 before group #3
These constraints force the Data Integration Service to process the deletes in the Order_Detail before the
deletes in the Order_Header. The Data Integration Service processes all the deletes before the insert and
update rows. It processes the Order_Header inserts and updates before the Order_Detail inserts and updates.

Target Load Order Rules and Guidelines
Consider the following rules and guidelines when you define target load order constraints:
• In the Developer tool, you can configure some target columns as primary keys or foreign keys. Load order constraints ignore these keys. If the targets have primary key-foreign key constraints, you must define the load order constraints.
• The Developer tool does not validate the load order constraints as you define them. The Developer tool validates the load order constraints as it validates the mapping.
• The Data Integration Service can stage the data to a local disk for the secondary target instance in a target load order constraint. When the mapping has multiple secondary target instances, the Data Integration Service loads the staged data to the targets without violating the constraints.
• The Data Integration Service loads one target instance and then another target instance without determining whether the rows are inserts, deletes, or updates. For target tables with primary key-foreign key constraints, an orphan row is a row in the foreign key target that does not have a matching row in the primary key target. The Data Integration Service does not check for orphan rows. The Data Integration Service loads all of the rows in the order you specify in the load order constraint.

WARNING: Informatica strongly advises against using the staging files or the data within the files.
Informatica is not responsible for corrupted data that is caused by customer alteration of the staging files or
the data in the tables. The structure of the staging files might change between Informatica versions.

Target Load Order Example
An organization processes customer orders twice a day. It receives the customer information and order
information in the same transaction file. The organization needs to ensure that the mapping that processes
the order file loads the customer information before it loads the orders.
A developer creates a mapping that returns the customer information to a Customer_Master table. The mapping returns the orders to an Orders_Target table. The primary key of the Customer_Master is the CustomerID. Each order in the Orders_Target table has a foreign key to the CustomerID in the Customer_Master. The developer creates a target load order constraint. The constraint restricts the Data Integration Service from loading the orders until it completes loading the customer information to the target.
The following image shows the mapping:


A Normalizer transformation creates separate output groups for the customer and orders data. The
developer needs to ensure that the customer data loads to the target before the orders data.
The Data Integration Service can use different strategies to implement the target load order constraint. In this
example, the Data Integration Service creates two pipelines to load the customer data and the order data to
the target tables.
The following image shows the pipelines that load the customer data and the order data to the target tables:

In the first pipeline, the Data Integration Service loads the customers to the Customer_Master table and stages the orders to a local disk file. In the second pipeline, the Data Integration Service loads the staged orders to the Orders_Target table.

How to Develop a Mapping
Develop a mapping to read, transform, and write data according to your business needs.
To develop a mapping and run a workflow, perform the following tasks:

1. Determine the type of mapping that you want to create.
2. Create input, output, and reusable objects that you want to use in the mapping. Create physical data objects, logical data objects, or virtual tables to use as mapping input or output.
3. Create reusable transformations that you want to use. If you want to use mapplets, you must create them also.
4. Create the mapping.
5. Add objects to the mapping. You must add input and output objects to the mapping. Optionally, add transformations and mapplets.
6. Link ports between mapping objects to create a flow of data from sources to targets, through mapplets and transformations that add, remove, or modify data along this flow.
7. Validate the mapping to identify errors.
8. Save the mapping to the Model Repository.
9. Run the workflow to see the mapping output.

Creating a Mapping
Create a mapping to move data between sources and targets and transform the data.
1. Select a project or folder in the Object Explorer view.
2. Click File > New > Mapping.
3. Enter a mapping name.
4. Click Finish.
An empty mapping appears in the editor.

Adding Objects to a Mapping
Add objects to a mapping to determine the data flow between sources and targets.
• Drag a data object to the editor and select Read to add the data object as a source.
• Drag a data object to the editor and select Write to add the data object as a target.
• To add a Lookup transformation, drag a flat file data object, logical data object, reference table, or relational data object to the editor and select Lookup.
• To add a reusable transformation, drag the transformation from the Transformations folder in the Object Explorer view to the editor. Repeat this step for each reusable transformation that you want to add.
• To add a non-reusable transformation, select the transformation on the Transformation palette and drag it to the editor. Repeat this step for each non-reusable transformation that you want to add.
• Configure ports and properties for each non-reusable transformation.
• Optionally, drag a mapplet to the editor.


Connecting Mapping Objects
Complete a mapping by connecting the mapping objects. You connect mapping objects through the ports.
Data passes into and out of a transformation through input ports, output ports, and input/output ports.
When you add an object to a mapping, you connect the ports according to how you want the Data Integration Service to transform the data. The editor displays mapping objects in the following ways:
• Iconized. Shows an icon of the object with the object name. The following image shows a mapping with iconized objects:
• Normal. Shows the columns and the input and output port indicators. You can connect objects that are in the normal view. The following image shows the preceding iconized mapping in normal view:
When you link ports between input objects, transformations, mapplets, and output objects, you can create the following types of links:
• One to one links. Link one port in an input object to one port in an output object.
• One to many links. Link one port to multiple output objects. You can also link multiple ports in one object to multiple output objects.
You can manually link ports or link ports automatically:
• Manually linking ports. You can manually link one port or multiple ports. Drag a port from an input object to the port of an output object.
• Automatically linking ports. When you link ports automatically, you can link by position or by name.

For more information about linking ports, see the Developer Transformation Guide.

Creating Target Load Order Constraints
Target load order constraints restrict the order that the Data Integration Service loads rows to target tables in
the same mapping. Create target load order constraints on the Load Order tab of a mapping.
1. Click inside the editor.
The mapping Properties tabs appear in the bottom window.
2. Click the Load Order tab.
The following image shows the Load Order tab:

The image shows three constraints. Each constraint contains the primary target and the secondary
target. The constraints specify that the Data Integration Service must load the Customer_Master target
before the Customer_Orders target. The Sales_Rep_Master must load before the Customer_Orders
target. The Sales_Office_Master must load before the Sales_Rep_Master.
3. To enter a constraint, click the New button.
The Developer tool creates a row for the constraint.
4. Click the Primary field.
A list of the target instances in the mapping appears.
5. Select the target instance that you want to load first.
6. In the Secondary field, select the target instance to load second.
You can enter the constraints in any order. You could enter the following pairs of constraints to indicate the same restrictions as in the previous image:

Primary                Secondary
Sales_Office_Master    Sales_Rep_Master
Sales_Rep_Master       Customer_Master
Customer_Master        Customer_Orders

7. Enter as many constraints as you need.

Validating a Mapping
Validate a mapping to ensure that the Data Integration Service can read and process the entire mapping.
1. Click Edit > Validate.
Errors appear in the Validation Log view.
2. Fix errors and validate the mapping again.


Running a Mapping
Run a mapping to move output from sources to targets and transform data.
If the domain includes more than one Data Integration Service and you have not selected a default service,
the Developer tool prompts you to select one when you preview data or run a mapping.
• Right-click an empty area in the editor and click Run Mapping.
The Data Integration Service runs the mapping and writes the output to the target.
When the Data Integration Service is configured to use operating system profiles, it runs the mapping
with the operating system profile.


Chapter 2

Mapplets
This chapter includes the following topics:
• Mapplets Overview, 35
• Mapplet Types, 35
• Mapplet Input and Output, 36
• Generated Mapplets, 37
• Rule Specifications and Mapplets, 38
• Creating a Mapplet, 40
• Mapplet Validation, 40

Mapplets Overview
A mapplet is a reusable object containing a set of transformations that you can use in multiple mappings.
Use a mapplet in a mapping. Or, validate the mapplet as a rule.
Transformations in a mapplet can be reusable or non-reusable. If you add a Sequence Generator
transformation to a mapplet, it must be reusable.
When you use a mapplet in a mapping, you use an instance of the mapplet. Any change made to the mapplet
is inherited by all instances of the mapplet.
Mapplets can contain other mapplets. You can also use a mapplet more than once in a mapping or mapplet.
You cannot have circular nesting of mapplets. For example, if mapplet A contains mapplet B, mapplet B
cannot contain mapplet A.
You can create a mapplet manually. You can also generate a mapplet from a segment within a mapping or
mapplet.

Mapplet Types
The mapplet type is determined by the mapplet input and output.
You can create or generate the following types of mapplet:
• Source. The mapplet contains a data source as input and an Output transformation as output.
• Target. The mapplet contains an Input transformation as input and a data source as output.
• Midstream. The mapplet contains an Input transformation and an Output transformation. It does not contain a data source for input or output.

Mapplet Input and Output
To use a mapplet in a mapping, you must configure it for input and output.
A mapplet has the following input and output components:
• Mapplet input. You can pass data into a mapplet from data sources or Input transformations or both. If you validate the mapplet as a rule, you must pass data into the mapplet through an Input transformation. When you use an Input transformation, you connect it to a source or upstream transformation in the mapping.
• Mapplet output. You can pass data out of a mapplet from data sources or Output transformations or both. If you validate the mapplet as a rule, you must pass data from the mapplet through an Output transformation. When you use an Output transformation, you connect it to a target or downstream transformation in the mapping.
• Mapplet ports. You can see mapplet ports in the mapping editor. Mapplet input ports and output ports originate from Input transformations and Output transformations. They do not originate from data sources.

Mapplet Input
Mapplet input can originate from a data source or from an Input transformation.
You can create multiple pipelines in a mapplet. Use multiple data sources or Input transformations. You can
also use a combination of data sources and Input transformations.
Use one or more data sources to provide source data in the mapplet. When you use the mapplet in a
mapping, it is the first object in the mapping pipeline and contains no input ports.
Use an Input transformation to receive input from the mapping. The Input transformation provides input ports
so you can pass data through the mapplet. Each port in the Input transformation connected to another
transformation in the mapplet becomes a mapplet input port. Input transformations can receive data from a
single active source. Unconnected ports do not appear in the mapping editor.
You can connect an Input transformation to multiple transformations in a mapplet. You can also connect one
port in an Input transformation to multiple transformations in the mapplet.

Mapplet Output
Use a data source as output when you want to create a target mapplet. Use an Output transformation in a
mapplet to pass data through the mapplet into a mapping.
Use one or more data sources to provide target data in the mapplet. When you use the mapplet in a mapping,
it is the last object in the mapping pipeline and contains no output ports.
Use an Output transformation to pass output to a downstream transformation or target in a mapping. Each
connected port in an Output transformation appears as a mapplet output port in a mapping. Each Output
transformation in a mapplet appears as an output group. An output group can pass data to multiple pipelines
in a mapping.


Generated Mapplets
You can generate a mapplet from a segment in a mapping or mapplet. You might want to generate a mapplet
when a mapping or mapplet contains a connected transformation flow that you want to reuse.
The Developer tool validates the segment as a mapplet as part of the generation process. Review the rules
and guidelines for generated mapplets to avoid validation errors.

Generated Mapplet Rules and Guidelines
Mapplet generation fails if any of the following conditions are true:
• The selected transformations are not in sequence.
• The segment contains both Read and Write transformations. However, a segment can include multiple Read or multiple Write transformations.
• The segment includes nonreusable Sequence Generator transformations, Input transformations, Output transformations, or transformations containing set operations.
• The selected segment does not include all transformations in a pipeline branch.
• The first and last transformations in a segment contain dynamic fields.
• The segment includes incoming run-time links to the first transformation or outgoing run-time links from the last transformation.
• The segment consists of a single parameterized Read, Write, or Lookup transformation.

Generating a Mapplet
Generate a mapplet from a segment containing connected transformations. The segment can contain Read,
Write, or midstream transformations.
1. Open the mapping or mapplet that contains the segment you want to generate into a mapplet.
2. Select the transformations to include in the mapplet.
3. Right-click one of the selected transformations, and select Extract Mapplet.
The following image shows a mapping with four transformations selected:
The generation process validates the segment and reports any validation errors.
4. Browse to the Mapplets node in which you want to generate the mapplet.
The Developer tool generates the mapplet in the Mapplets node within the current project by default.
5. Click Finish.
The mapplet replaces the transformations selected in the original mapping or mapplet. The tool adds Input or Output transformations to the mapplet based on whether the segment contains Read, Write, or midstream transformations.

The following image shows the selected transformations replaced by the mapplet:

Note that you must explicitly save the modified mapping or mapplet to replace the selected transformations
with the mapplet. To return the mapping or mapplet to its original state, select File > Undo three times.

Rule Specifications and Mapplets
A rule specification is a Model repository object that uses business logic to describe transformation
operations. Users create rule specifications in Informatica Analyst. You can add a rule specification to a
mapping in the same way that you add a mapplet to a mapping.
You can also add a rule specification to a mapplet and deploy a rule specification from the Developer tool as
a web service.
An Analyst tool user can generate one or more mapplets from a rule specification. Each mapplet contains
transformations that represent the rule specification logic. When you run a mapping that contains either the
rule specification or the corresponding mapplet, you get the same results.
You can edit a mapplet that a user generates from a rule specification in the same way as any mapplet that
you create in the Developer tool. You cannot edit a rule specification in the Developer tool. Add a rule
specification to a mapping when you want the mapping to apply the logic that the rule specification
represents. Add the corresponding mapplet to a mapping when you want to use or update the mapplet logic
independently of the rule specification.


Rules and Guidelines for Rule Specifications
• A rule specification contains a primary rule set and optionally contains additional rule sets. The primary rule set represents the complete logic of the rule specification. Additional rule sets define discrete data analysis operations and provide outputs that other rule sets can read. The mapplet that represents the primary rule set has the same name as the rule specification.
• If you rename a rule specification in the Developer tool, the Analyst tool displays the name when the user opens the rule specification. If you rename the mapplet for the primary rule set, you do not change the rule specification name.
• If an Analyst tool user adds, deletes, or edits an input in a rule specification, the user breaks all input links to other objects in a mapping. If an Analyst tool user adds, deletes, or edits an output in a rule specification, the user breaks all output links to other objects in a mapping. The edits that break the links include changes to the name, precision, or data type of an input or output. Update the links in any mapping that uses the rule specification. If an Analyst tool user updates the business logic in a rule specification but does not alter the inputs or outputs, the input and output links do not break. The changes that the user makes are available to the mapping when the user saves the rule specification.

Rule Specification Properties
A rule specification has properties that you can view and edit in the Developer tool. Use the properties to
review descriptive metadata for the rule specification name. You can also use the properties to determine the
number of output ports that the rule specification generates for downstream mapping objects.
To view the properties, open a mapping that contains the rule specification and select the rule specification
icon. Then, select the Properties tab in the mapping.
The tab displays the following views:
General
The general properties contain the name and description of the rule specification instance.
If you update the rule specification name or description in a mapping, the changes that you make apply
in the current mapping only.
Properties
The properties include the rule specification name that appears in the General view. The properties also
specify any date range that an Analyst tool user set for the rule specification. The date range indicates
the interval during which the rule specification is valid for use.
Note: You can select or clear an option to allow an output for each child rule in the rule specification. A
child rule is a rule set in the rule specification. When you select the option, the Developer tool adds an
output port for each rule set to the rule specification in the mapping. The option is clear by default.
Select the option to make available the rule set outputs to downstream objects in the mapping.
Ports
The port properties list the input and output ports on the rule specification instance. The port properties
display the name, data type, precision, and scale of each port. You can optionally add a description to a
port. The description applies to the port in the current rule specification instance.
Run-time Linking
The run-time linking properties determine how the rule specification ports link to other objects in a dynamic mapping.


Advanced
The advanced properties include the tracing level setting. The tracing level defines the amount of detail
that appears in the log for the rule specification. You can choose terse, normal, verbose initialization, or
verbose data. The default value is normal.

Creating a Mapplet
Create a mapplet to define a reusable object containing a set of transformations that you can use in multiple
mappings.
1. Select a project or folder in the Object Explorer view.
2. Click File > New > Mapplet.
3. Enter a mapplet name.
4. Click Finish.
An empty mapplet appears in the editor.
5. Add mapplet inputs, outputs, and transformations.

Mapplet Validation
You can validate a mapplet before you add it to a mapping. You can also validate a mapplet to use as a rule
in a profile.

Validating a Mapplet
Validate a mapplet before you add the mapplet to a mapping. You can also validate a mapplet as a rule to
include in a profile.
1. Right-click the mapplet editor.
2. Select Validate As > Mapplet or Validate As > Rule.
The Validation Log displays any errors that occur.

Mapplet as a Rule Validation
A rule is business logic that defines conditions applied to source data when you run a profile. It is a
midstream mapplet that you use in a profile. You can validate a mapplet that you want to use as a rule in a
profile.
A rule must meet the following requirements:

• The rule must contain an Input and Output transformation. You cannot use data sources in a rule.
• The rule can contain Expression transformations, Lookup transformations, and passive data quality transformations. The rule cannot contain any other type of transformation. For example, a rule cannot contain a Match transformation, as it is an active transformation.
• The rule does not specify cardinality between input groups.


Note: Rule functionality is not limited to profiling. You can add any mapplet that you validate as a rule to a
profile in the Analyst tool. For example, you can evaluate postal address data quality by selecting a rule
configured to validate postal addresses and adding it to a profile.


Chapter 3

Mapping Parameters
This chapter includes the following topics:
• Mapping Parameters Overview, 42
• System Parameters, 43
• User-Defined Parameters, 44
• Where to Create User-Defined Parameters, 46
• Where to Assign Parameters, 46
• Parameters in Mappings, 55
• Parameters in Mapplets, 57
• Parameters in Logical Data Objects, 59
• Parameters in Virtual Table Mappings, 60
• Parameter Sets, 60
• Parameter Files, 61
• How to Configure Parameters, 66

Mapping Parameters Overview
A mapping parameter represents a constant value that you can change between mapping runs. Create
parameters to rerun a mapping with different values. Use parameters to change the values of connections,
file directories, expression components, port lists, port links, and task properties.
You can configure system parameters or user-defined parameters.
System parameters
Built-in parameters for a Data Integration Service. System parameters define the directories where the Data Integration Service stores log files, cache files, reject files, source files, target files, and temporary files. An administrator defines the system parameter default values for a Data Integration Service in the Administrator tool.
User-defined parameters
Parameters that you define in transformations, logical data objects, mappings, and workflows. Create user-defined parameters to rerun a mapping with different connection, flat file, cache file, temporary file, expression, ports, or reference table values.

You can use parameters to determine which generated ports to use in a dynamic mapping at run time. You
can configure parameters to indicate which ports to link at run time. You can assign a parameter to change
the data object in a Read, a Write, or a Lookup transformation.
You can override parameter values by assigning a parameter set or a parameter file to a mapping. A
parameter set is a repository object that contains mapping parameter values. A parameter file is an XML file
that contains parameter values. When you run a mapping with a parameter set or a parameter file, the Data
Integration Service uses the parameter values defined in the parameter set or parameter file. These values
override the default parameter values you configured in the transformation, the mapping, the mapplet, or the
workflow.
For more information about workflow parameters, see the Informatica Developer Workflow Guide.
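To make the parameter file format concrete, the following is a minimal sketch of a mapping parameter file. The project name MyProject, mapping name m_CustomerLoad, and parameter name ConnParam are hypothetical; see the Parameter Files section of this guide for the full schema and the elements that your objects require.

<root version="2.0" xmlns="http://www.informatica.com/Parameterization/1.0">
    <!-- Hypothetical project and mapping names. Replace them with your own. -->
    <project name="MyProject">
        <mapping name="m_CustomerLoad">
            <!-- Overrides the default value of the user-defined parameter ConnParam. -->
            <parameter name="ConnParam">CustomerDB_US</parameter>
        </mapping>
    </project>
</root>

You would typically supply such a file when you run the mapping, for example through the parameter file option of the infacmd ms RunMapping command. Check the Command Reference for your version for the exact syntax.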

Related Topics:
• “Parameters in Dynamic Mappings” on page 112

System Parameters
System parameters are constant values that define the directories where the Data Integration Service stores
cache files, reject files, source files, target files, log files, and temporary files.
Define the values of some of the system parameters in the execution options for the Data Integration Service.
An Administrator can update the values from the Administrator tool. The Data Integration Service determines
the values of other system parameters at run time. You cannot override system parameter values in a
parameter file or a parameter set.
You cannot create system parameters. The Developer tool provides a pre-defined list of system parameters
that you can assign to a data object or transformation in a mapping. For example, when you create an
Aggregator transformation, the cache directory system parameter is the default value assigned to the cache
directory field in Informatica Administrator. If you want to use a different cache directory location, create a
user-defined parameter and configure a default parameter value.
The Analyst tool displays the file path of system parameters in the following format: $$[Parameter Name]/
[Path]. For example, "$$SourceDir/ff_dept.txt."
The following table describes the system parameters:
System Parameter        Type        Description
CacheDir                String      Default directory for index and data cache files.
LogDir                  String      Default directory for Mapping task log files.
RejectDir               String      Default directory for reject files.
SourceDir               String      Default directory for source files.
TargetDir               String      Default directory for target files.
TempDir                 String      Default directory for temporary files.
ApplicationName         String      Name of the application.
ExecutionEnvironment    String      Hadoop or Native environment.
MappingName             String      Name of the mapping that is running.
MappingRunStartTime     Date/time   The start time of the mapping that is running.
ServiceName             String      The Data Integration Service name.
UserName                String      Name of the user that is running the mapping.

User-Defined Parameters
User-defined parameters represent constant values that you can change between mapping runs.
For example, you create a mapping that processes customer orders. The mapping reads customer
information from a relational table that contains customer data for one country. You want to use the mapping
for customers in the United States, Canada, and Mexico. Create a user-defined parameter that represents the
connection to the customers table. Create three parameter sets that set the connection name to the U.S.
customers table, the Canadian customers table, and the Mexican customers table. Run the mapping with a
different parameter set for each mapping run.
You can create the following types of parameters:
Connection parameters
Informatica connection names.
Date/time parameters
Dates.
Expression
An expression that defines a join condition, a filter expression, or a lookup condition.
Input Linkset
A set of ports to link in the Run-time Linking dialog box.
Numeric parameters
Integer, bigint, decimal, and double parameters.
Port
Name of a single port. You can use the port parameter in the Rank port of the Rank transformation.
Port List
A list of ports to include a group. You can use a port list parameter in the Aggregator transformation or
the Rank transformation, for example.
Resource
The table, view, or synonym name of a relational data object. When the resource name is parameterized,
then the Data Integration Service uses the parameter value in the runtime query to fetch the object.

44

Chapter 3: Mapping Parameters

Sort List
A list of ports to sort with a Sorter transformation. The list includes the port name and an indicator for
ascending or descending sort sequence.
Sort Key List
A list of ports to sort with order keys in an Expression transformation configured for windowing. This list
includes the port name and an indicator for ascending or descending sort sequence.
String
String parameters represent flat file names, directories, table names or run-time properties. Define string
parameters with a precision of 32768 characters or less.
When you create a parameter, you cannot include a dollar sign ($) as the leading character in the parameter
name.
When you use a parameter to set a property value, you must use the correct parameter type for the property.
For example, you cannot use a connection type parameter for a target file name. You must use a numeric
parameter type if you are using the parameter in a numeric expression.
In relational data objects, you do not need to escape the dollar sign ($) in SQL overrides, filter conditions, join
conditions. The Data Integration Service treats a field that begins with a dollar sign in an SQL statement as a
parameter.
A parameter cannot contain a series of values. If you provide a series of values in a parameter, the Data
Integration Service treats the parameter values as a single string value.
For example, you have the parameters $IndexParameter1 (value 2) and $IndexParameter2 (value1, value2,
value3). You include these parameters in the expression INDEXOF as:
INDEXOF($IndexParameter1,'value1','value2','value3')
The Data Integration Service returns the value 0 instead of the value 2.

Date/Time Parameters
You can create date parameters and use the parameters in expressions.
You must define a date parameter in one of the following formats:
MM/DD/RR
MM/DD/YYYY
MM/DD/YYYY HH24:MI
MM/DD/RR HH24:MI
MM/DD/RR HH24:MI:SS
MM/DD/YYYY HH24:MI:SS
MM/DD/RR HH24:MI:SS.NS
MM/DD/YYYY HH24:MI:SS.NS

User-Defined Parameters

45

Where to Create User-Defined Parameters
You can create user-defined parameters in flat file data objects, transformations, custom data objects,
mapplets, mappings, and workflows. After you create the parameters, you can assign the parameters to
fields such as conditions, expressions, connections, directories, and file names.
When you create a parameter for a transformation, a logical data object, a mapplet, a mapping, or a workflow,
the parameter applies to that object. For example, you create a parameter in a transformation. Then you add
the transformation to a mapplet. You can either use the default parameter value from the transformation or
you can create a mapplet parameter to override the transformation parameter value.
To override the transformation parameter default value, bind the mapplet parameter to the transformation
parameter. Change the default value for the mapplet parameter. You can override the mapplet parameter with
a mapping parameter.
The following list shows where you can create parameters:
Workflow parameters
Mapping parameters
Mapplet parameters
Logical data objects
Transformation/data object parameters
You can set workflow parameter values and mapping parameter values at run time by configuring the
parameter values in a parameter set or a parameter file.
You can create parameters at the same time that you assign parameters to fields and properties. When you
assign a parameter to a field, you can create the parameter to use. You can also browse for a parameter that
you created previously.
Note: When you create parameters on the Parameters tab, do not include a leading dollar sign ($) in the
parameter name.
Maintain user-defined parameters on the Parameters tab of a transformation or of a data object. A mapping,
mapplet, workflow, or logical data object also has a Parameters tab. You can add, change, and delete
parameters on the Parameters tab.
You can also access parameters directly from the Outline view, which shows where parameters are used,
defined, and bound. When you click on a parameter in the Outline view, the parameter properties appear on
the Parameters tab.

Where to Assign Parameters
You can assign user-defined parameters and system parameters to fields. You must create the user-defined
parameters before you can assign them to fields.
You can parameterize some properties in objects and transformations. If you can assign a parameter to a
property, the option appears when you configure the property value.
You can parameterize properties of a reusable or nonreusable transformations. When you add a reusable
transformation to a mapping, you can override the default value with a mapping parameter. If you
parameterize a nonreusable transformation, the parameters are mapping parameters.
The Read and Write transformations are nonreusable transformations that you create from physical data
objects. You can parameterize the physical data objects that you create the transformations from. You can
also assign parameters to some properties in Read and Write transformations.

46

Chapter 3: Mapping Parameters

The following table lists the objects and the fields where you can assign parameters:
Object

Field

All transformations

Link resolution order

Association transformation

Cache file directory
Cache file size

Address Validator transformation

Casing style
Default country
Geocode data type
Global maximum field length
Line separator
Maximum result count
Optimization level
Standardize invalid addresses

Aggregator transformation

Cache directory
Expression elements. Not the full expression.
Group By

Bad Record Exception transformation

Lower Threshold
Upper Threshold

Case Converter transformation

Reference table.

Consolidation transformation

Cache file directory
Cache file size

Customized data object

Connection
Data object
Owner
SQL Query elements
Table name

Decision transformation

Decision script.

Duplicate Record Exception transformation

Cache file directory
Lower Threshold
Upper Threshold

Expression transformation

Expression elements. Not the full expression.
Port selector
Sort key list. Windowing only.

Filter transformation

Filter condition elements
Filter condition. Full expression.

Where to Assign Parameters

47

Object

Field

Flat file data object

Control file directory
Control file name
Connection name
Default scale
Flat file delimiter
Merge file directory
Source file directory
Source file name
Output file name
Output file directory
Reject file directory
Target directory

Joiner transformation

Cache directory
Join condition elements
Port selector

Key Generator transformation

Cache file directory
Cache file size

Labeler transformation

Reference table

Lookup transformation not including the
physical data objects for the lookup source

Data object. Nonreusable transformation.
Dynamic port rules. Nonreusable transformation.
Lookup condition. Full expression, nonreusable transformation.
Port selector. Nonreusable transformation.

Mapping

Hive version
Run-time environment
Maximum parallelism

Match transformation

Cache directory on the Match Output tab
Cache directory on the Match Type tab
Index directory on the Match Type tab
Persistence method
Threshold

Nonrelational data object

Connection

Rank transformation

Cache directory
Expression elements. Not the full expression.
Group by ports
Rank port

Read transformation

Connection
Data object
Owner name. Relational only.
Resource/table name. Relational only.

48

Chapter 3: Mapping Parameters

Object
Relational Data Object

Field
Filter condition elements
Join condition elements
PreSQL query elements
PostSQL query elements
SQL override elements

Router transformation

Group filter condition elements.
Group filter condition. Full expression.

Sorter transformation

Sort key
Group by
Work directory

SQL transformation

Connection

Standardizer transformation

Reference table

Token Parser transformation

Reference table

Update Strategy transformation

Update strategy expression elements.
Update strategy expression. Full expression.

Write transformation

Data object
Link resolution order
Reject directory
Reject file name

Parameters in Expressions
You can configure parameters in expressions or conditions in transformations, such as the Aggregator
transformation, Lookup transformation, Expression transformation, and the Filter transformation.
For example, configure a filter condition in the Filter transformation. Choose the ports and the parameters to
include in the condition. Select the system parameters or the user-defined parameters to include in the filter
condition.
The following image shows a Filter condition that includes the Employee port and the Employee_Filter
parameter:

Where to Assign Parameters

49

You can use parameters in expressions in the same arguments that accept port names as arguments. You
cannot use a parameter to replace a constant argument in an expression.
For example, consider the TO_DECIMAL expression that converts a string to a decimal value:
TO_DECIMAL( value [, scale] )
The scale argument must be a constant value in the expression.
The following valid expression contains a constant argument for scale:
TO_DECIMAL( Input_Port,10 )
The following expression is not valid because it contains a user-defined parameter for the scale argument :
TO_DECIMAL( Input_Port,$Scale_Param )
A parameter cannot contain another parameter. For example, if you configure Parameter1 and Parameter2 in
a transformation, you cannot set the default value of Parameter1 to $Parameter2. If you nest the parameters,
the mapping fails with a validation error at runtime.

50

Chapter 3: Mapping Parameters

Expression Parameters
You can configure an expression parameter type. An expression parameter is a parameter that contains a
complete expression. You can use an expression parameter in a Filter transformation and a Lookup
transformation.
Define an expression parameter in the Expression Editor. Select Specify by Parameter to indicate that the
complete expression is parameterized.
The following image shows the Specify by Parameter option for the filter condition:

When you use an expression parameter, you can create the expression parameter or you can select an
existing expression parameter to use in the transformation. An expression parameter can contain ports,
operators, and constants. It cannot contain other parameters.
For example, in a Filter transformation you might create a filter expression parameter with the following
default value: EmployeeID > 100. In the mapping, you might create a different expression parameter with the
following default value: Dept < 2000. If you bind the mapping parameter to the transformation parameter,
you can override the mapping expression parameter at run time. You might create expression parameters
with different port names and operators for dynamic mappings.

Parameters in SQL Statements
You can include parameters in SQL statements that you add to relational data objects or to Lookup
transformations.
The following image shows how you can parameterize an SQL query that reads from a relational source:

Where to Assign Parameters

51

The Data Integration Service escapes the data in each parameter with a single quote (') when it expands a
query. For example, you might have a SQL statement with a date parameter called $date_parm:
select * from  where  >$date_parm
The query might expand to select * from  where  > '01/31/2000 00:00:00'
Do not use a single quote in the default value for a parameter. You might get unexpected results.
A parameter name cannot contain a period (.) . An SQL query is not valid if it has a parameter that contains a
period. For example, the following SQL statement has a parameter name that contains a period:
SELECT $tname.ID,"MY_SOURCE"."NAME" FROM "MY_SOURCE" where FIELDX=1
When you validate the query, the Data Integration Service returns an error that it cannot find the tname.ID
parameter.

Parameters in Custom Queries for Hive Sources
When you use a string parameter in a SQL override, a join expression, or a filter query for a Hive source, you
need to add quotes around the parameter reference if the parameter represents a literal value. You can use
single or double quotes. This requirement is for Hive sources in mappings that run in the native execution
environment or in the Hadoop execution environment.
For example, you need to create a filter that selects Hive source rows with a specific department name. You
create a string parameter that represents the department name. You assign a default value of R&D for the
department name parameter.
The following image shows the string parameter:

52

Chapter 3: Mapping Parameters

When you use the parameter in a filter query for a Hive source, you must include quotes around the
parameter name. Otherwise the mapping fails at run time with a SQL parser error.
The following image shows the filter query for the Hive source on the Query view of the Properties tab:

Note: By default, the Expression editor does not add the quotes around the parameter. You must manually
add them.
You do not need to add single or double quotes around the parameter name if the parameter contains a
column name or a sub query name.
The following image show a string parameter with a default value that is a column name:

The following image shows a filter query that uses the parameter:

Where to Assign Parameters

53

Parameters for Relational Table Resources
You can parameterize the resource name, the table owner, and the connection for a Read transformation. The
resource is the table, view, or synonym name of the relational data object.
You might parameterize the resource name if you need to process multiple tables from the same database in
a dynamic mapping.
Select the Read transformation in the mapping. In the Run-time tab of the Properties view, click the Value
column to assign a parameter for a connection, table owner, or resource.
The following image shows where to assign the parameters for the connection, the resource name, and the
table owner for the Read transformation:

Parameters for Fields and Property Values
You can configure parameters for some field or property values in transformations and physical data objects.
You can configure connection names for relational data objects, customized data objects, and Lookup
transformations. In a flat file data object, you can configure parameters for input and output file directories
and the reject file directory. You can also configure a parameter to change the flat file delimiter type.
The following image shows the parameter for the flat file delimiter on the Advanced tab of the physical data
object:

54

Chapter 3: Mapping Parameters

Parameters for Port Lists
You can create parameters that contain lists of ports. You can reference these parameters in transformations
such as the Sorter transformation, Rank transformation, Joiner transformation, and Expression
transformation.
You can configure the following types of parameters that contain multiple port names:
Port list
A list of port names separated by commas. A port list parameter has the following syntax:
Port1,Port2,Port3
Sort list
A list of port names and the sort type for each port. The sort list parameter has the following syntax:
Port1:A,Port2:A,Port3:D
Input linkset
A set of ports to link at run time. The link set parameter contains name-value pairs in the following
syntax: Port1>:=Port2, Port3>:=Port4

Parameters in Mappings
If you define a parameter in a resusable transformation or data object, you can override the parameter default
value when you add the transformation to a mapping. Configure a mapping parameter to override the
parameter value in a specific mapping.
When you define a mapping parameter, you can bind the mapping parameter to a specific transformation
parameter. The mapping parameter value overrides the default parameter value in the transformation.
When you bind a mapping parameter to a transformation parameter, the parameters must be the same type.
The mapping parameter name does not have to be the same as the transformation parameter name.
You can use a parameter set or a parameter file to set the mapping parameter values at run time. You cannot
set transformation parameter values with a parameter set or parameter file. You must configure mapping
parameters if you want to change parameter values at run time.
Use one of the following methods to define mapping parameters:

Parameters in Mappings

55

Define mapping parameters on the Parameters tab of the mapping Properties view
On the mapping Parameters tab, you can manually enter each parameter name, the parameter attributes,
and the default value. You can bind these parameters to transformation parameters whenever you add a
transformations to the mapping. You can update the mapping parameters on the mapping Parameters
tab. You can also view and access the parameter in the Outline view.
Add mapping parameters from transformation parameters
After you add a transformation to a mapping, you can create a mapping parameter directly from the
transformation Parameters tab. You can expose the transformation parameter as a mapping parameter.
The Developer tool creates a mapping parameter that has the same name and type as the
transformation parameter.
Add parameters to a nonreusable transformation
If you create a transformation in a mapping, the transformation is a nonreusable transformation. If you
parameterize any of the transformation properties, you create mapping parameters instead of
transformation parameters.

Parameter Instance Value
When you add a reusable transformation with parameters to a mapping, you can configure the instance value
for each parameter in the transformation.
The instance value is the parameter value for a specific mapping. You can set the instance value to a default
value, a specific value, or to a mapping parameter value.
A mapping parameter or a mapplet parameter can override the default value of the transformation parameter.
Select a mapping parameter or a mapplet parameter and bind the parameter to the transformation
parameter.
Set the instance value on the transformation Parameters tab in the Properties view.
Choose one of the following options for the Instance Value:
Expose as mapping parameter
Create a mapping parameter with the same attributes as the transformation parameter and bind the
mapping parameter to the transformation parameter in the same step. If you click the Expose as
Mapping Parameter button a second time, and the transformation parameter is already bound to
mapping parameter, the Developer tool does not change the mapping parameter.
Parameter
Browse for and select a mapping parameter to bind to the transformation parameter. You can also
create a mapping parameter and bind it to the transformation parameter. When you create the mapping
parameter and bind it, you are performing the same task as the Expose As Mapping Parameter option.
However, when you manually create the mapping parameter, you can configure a different name than the
transformation parameter.
Use default
Use the default value from the transformation parameter. Skip binding a mapping parameter to the
transformation parameter.
Value
Enter a default parameter value to use in the mapping. Skip binding a mapping parameter to the
transformation parameter.

56

Chapter 3: Mapping Parameters

Parameters in Mapplets
You can bind a mapplet parameter to a parameter in a data object or in a transformation that is in the
mapplet.
When you define a mapplet parameter, you can bind the mapplet parameter to a specific transformation
parameter. The mapplet parameter value overrides the default parameter value in the transformation. When
you bind a mapplet parameter to a transformation parameter, the parameters must be the same type. The
mapplet parameter name does not have to be the same as the transformation parameter name. You can bind
a mapplet parameter to more than one transformation parameter.
Use one of the following methods to define mapplet parameters:
Define mapplet parameters on the Parameters tab of the mapplet Properties view
On the mapplet Parameters tab, you can manually enter each parameter name, parameter attributes, and
default value. After you define a parameter, you can view and access the parameter in the Outline view.
Add mapplet parameters from transformation parameters
After you add a transformation to a mapplet, you can create the mapplet parameter directly from the
transformation Parameters tab.

Parameter Instance Values in Mapplets
When you add a reusable transformation with transformation parameters to a mapplet, you can set the
instance value for each parameter. The instance value of the parameter is the parameter value in a specific
mapplet.
After you add the transformation to a mapplet, set the instance value on the transformation Parameters tab.
Choose one of the following options for the Instance Value:
Expose as mapplet parameter
Create a mapplet parameter with the same attributes as the transformation parameter. Bind the mapping
parameter to the transformation parameter in the same step.
Parameter
Bind a mapplet parameter to the transformation parameter. You can browse for and select a mapplet
parameter to bind to the transformation parameter. You can also create a mapplet parameter and bind
that parameter to the transformation parameter. When you create a mapplet parameter and bind it, you
are performing the same task as the Expose As Mapplet Parameter option. However, when you manually
create the mapplet parameter, you can configure a different name than the transformation parameter
and you can set a different default value.
Use default
Use the default value from the transformation parameter. Skip binding a mapplet parameter to the
transformation parameter.
Value
Enter a different default parameter value to use in the mapplet. Skip binding a mapplet parameter to the
transformation parameter.

Parameters in Mapplets

57

Mapplet Parameters in Mappings
When you add a mapplet with mapplet parameters to a mapping, you can set the instance values for the
mapplet parameters. The instance value of a mapplet parameter is the parameter value for a specific
mapping.
Set the instance value on the mapplet Parameters tab in the Properties view.
Choose one of the following options for the Instance Value:
Expose as mapping parameter
Create a mapping parameter with the same attributes as the mapplet parameter. Bind the mapping
parameter to the mapplet parameter in the same step.
Parameter
Bind a mapping parameter to the mapplet parameter. You can browse for and select a mapping
parameter to bind to the mapplet parameter. You can also create a mapping parameter and bind it to the
mapplet parameter . When you create a mapping parameter and bind it, you are performing the same
task as the Expose As Mapping Parameter option. However, when you manually create the mapping
parameter, you can configure it with a different name and default value than the mapplet parameter.
Use default
Use the default value from the mapplet parameter. Skip binding a mapping parameter to the mapplet
parameter.
Value
Enter a default parameter value to use in the mapping. Skip binding a mapping parameter to the mapplet
parameter.

Parameters in Mapplets Example
You can define mapplet parameters and override them with mapping parameters.
You might define an SQL transformation that returns customer data from a Customer table. You add the SQL
transformation to a mapplet and parameterize the run-time connection.
You then add the mapplet to mappings that retrieve customer data from different databases. You define a
mapping parameter in each mapping to override the default connection from the mapplet parameter.
The following table lists the connection parameters that you might create for the mapplet and for the
mappings:
Object Name

Object Type

Parameter Name

Parameter Default Value

mp_Get_Customer

Mapplet

mp_cust_connection

Oracle_Default

m_billing_mapping

Mapping

m_acctg_connection

Oracle_AcctDB

m_order_fill_mapping

Mapping

m_shipping_connection

Oracle_Warehouse

m_cust_main_mapping

Mapping

m_master_connection

Oracle_Cust_Mast

The mapplet, mp_Get_Customer, has a connection parameter called mp_cust_connection. The parameter has
Oracle_Default as the default connection name. This connection might reference a test database, for
example.

58

Chapter 3: Mapping Parameters

Each mapping has a connection parameter that overrides the mp_cust_connection parameter. Each mapping
connects to the accounting, the warehouse, or the customer master database.
You must bind each mapping parameter to the mapplet parameter to override the default value. To change
the value of the mapping parameters at runtime, you can configure a parameter set or a parameter file.

Parameters in Logical Data Objects
You can include parameters in logical data objects. You can use them in transformations and in the Read and
Write mappings.
A logical data object can have a Read mapping and a Write mapping. A Read or Write mapping can contain
transformations that use parameters. You can bind the reusable transformation parameters to parameters in
the Read or Write mapping.
For example, a logical data object has a Read mapping that contains an Expression transformation. The
Expression transformation has a parameter that defines a decimal value in an expression. The default value
is 100.
When you add the Expression transformation to the Read mapping, you might want to use a different
parameter value. You can create a parameter at the Read mapping level to override the transformation
parameter. Click Expose as Mapping Parameter to create a duplicate parameter in the Read mapping. The
Developer tool binds the duplicate parameter to the transformation parameter.
The following image shows the Parameters tab for the Expression transformation in the Read mapping:

To view the duplicate parameter, select the logical data object in the Outline view. You can change the
parameter default value at the Read mapping level.
When you add the logical data object to a mapplet or mapping, you can override the Read mapping
parameter. Create a duplicate parameter in the mapplet or the mapping. Change the default value of the
duplicate parameter.

Parameters in Logical Data Objects

59

Parameters in Virtual Table Mappings
A virtual table mapping defines the data flow between sources and a virtual table in an SQL data service. A
virtual table mapping can contain parameters, but you cannot use a parameter file or a parameter set to
override the parameter default values.
A virtual table mapping might contain reusable transformations or mapplets that contain parameters. You
can bind mapping parameters to the transformation or mapplet parameters in a virtual table mapping.
However, when a virtual table mapping contains parameters, the Data Integration Service applies the default
parameter values from the mapping level. The Data Integration Service cannot bind values from a parameter
file or from a parameter set to parameters in a virtual table mapping.
You can use a parameterized source that is connected to a virtual table mapping. The mapping uses the
default parameter value.

Parameter Sets
A parameter set is an object in the Model repository that contains a set of parameters and parameter values
to run mappings and workflows.
When you create a parameter set, you choose a mapping or workflow to use the parameters. After you
choose a mapping or workflow, you can manually enter parameters in the parameter set or you can select
parameters that are already in the repository for the mapping or the workflow.
You can use parameter sets for different situations. For example, you might use a specific parameter set
when you run a workflow in a test environment.
You use a parameter set with a mapping, Mapping task, or workflow. You can add one or more parameter
sets to an application when you deploy the application. You can add a parameter set to multiple applications
and deploy them. To use a parameter set with a workflow or mapping, you must add the parameter set to the
application when you deploy the workflow or mapping.
The following image shows a parameter set that contains parameters for two mappings:

The parameter set contains the following information:

60

Chapter 3: Mapping Parameters

Object Name
The name of the mapping, mapplet, or workflow that contains the parameter definition.
Parameter Name
The name of the parameter in the mapping, mapplet, or workflow.
Value
The value of the parameter to use at runtime. The value of the parameter in the parameter set overrides
the parameter value in the mapping or workflow.
Type
The type of the parameter. Example parameter types include strings, numeric types, connections, port
lists, sort lists, and date\time parameters.

Run Mappings with Parameter Sets from infacmd
You can deploy a mapping as an application and include a parameter set in the application. You can then run
the deployed application and use the parameter set.
After you deploy the mapping, you can run the deployed mapping with the parameter set from the command
line. If you need to use different parameter sets, you can deploy more than one parameter set in the
application. When you run the mapping you can specify which parameter set to use.
After the application is deployed, you can add parameter set entries with the infamcd
addParameterSetEntries command. You can update parameter set entries with the infacmd
updateParameterSetEntries command.
For more information about using parameter sets with infacmd, see the Informatica Command Reference.

Parameter Files
A parameter file is an XML file that lists user-defined parameters and their assigned values. Parameter files
provide the flexibility to change parameter values each time you run a mapping from the command line.
The parameter values define properties for a mapping or a mapplet. The Data Integration Service applies
these values when you run a mapping from the command line and specify a parameter file.
You can define mapping parameters and workflow parameters in a parameter file. You cannot define system
parameter values in a parameter file.
You can define parameters for multiple mappings in a single parameter file. You can also create multiple
parameter files and then use a different file each time you run a mapping. The Data Integration Service reads
the parameter file at the start of the mapping run to resolve the parameters.
You can export a parameter file from the Developer tool. Export the file from the mapping or the workflow
Parameters tab. The Developer tool generates a parameter file that contains the mapping or workflow
parameters and the default parameter values. You can specify the parameter file name and choose where to
save the file.
You can also use the infacmd ms ListMappingParams command to list the parameters used in a mapping
with the default values. You can use the output of this command as a parameter file template.
Use the ms RunMapping command to run a mapping with a parameter file.
Note: Parameter files for mappings and workflows use the same structure. You can define parameters for
deployed mappings and for deployed workflows in a single parameter file.
Parameter Files

61

Parameter File Structure
A parameter file is an XML file that contains at least one parameter and its assigned value.
The Data Integration Service uses the hierarchy defined in the parameter file to identify parameters and their
defined values. The hierarchy identifies the mapping, mapplet, or workflow that uses the parameter.
You define parameter values within a project or application top-level element. A project element defines
parameter values to use when you run a specific mapping in the project in any deployed application. A project
element also defines the parameter values to use when you run any mapping that uses the objects in the
project. An application element defines parameter values to use when you run a specific mapping in a
specific deployed application. If you define the same parameter in a project top-level element and an
application top-level element in the same parameter file, the parameter value defined in the application
element takes precedence.
The Data Integration Service searches for parameter values in the following order:
1.

The value specified within an application element.

2.

The value specified within a project element.

3.

The parameter default value.

A parameter file must conform to the structure of the parameter file XML schema definition (XSD). If the
parameter file does not conform to the schema definition, the Data Integration Service fails the mapping run.
On the machine that hosts the Developer tool, the parameter file XML schema definition appears in the
following directory:
\clients\DeveloperClient\infacmd\plugins\ms
\parameter_file_schema_1_0.xsd
On the machine that hosts Informatica Services, the parameter file XML schema definition appears in the
following directory:
\isp\bin\plugins\ms\parameter_file_schema_1_0.xsd

Project Element
A project element defines the parameter values to use when you run a specific mapping in the project in any
deployed application. A project element also defines the parameter values to use when you run any mapping
that uses the objects in the project.
The project element defines the project in the Model repository that contains objects that use parameters.
The project element can include a workflow or mapping. You cannot include a transformation or data source
in the project..
The following table describes the elements that a project element can contain:
Element
Name

Description

folder

Defines a folder within the project. Use a folder element if objects are organized in multiple folders
within the project.
A folder element can contain a dataSource, mapping, mapplet, or transformation element.

mapping

62

Defines a mapping within the project that uses parameters. A mapping element contains one or more
parameter elements that define parameter values for the mapping or for any non-reusable data object,
non-reusable transformation, or reusable Lookup transformation in the mapping that accepts
parameters.

Chapter 3: Mapping Parameters

When you run a mapping with a parameter file that defines parameter values in a project top-level element,
the Data Integration Service applies the parameter values to the specified mapping.
For example, you want the Data Integration Service to apply parameter values when you run mapping
"MyMapping".



Param_value



Application Element
An application element provides a run-time scope for a project element. An application element defines the
parameter values to use when you run a specific mapping in a specific deployed application.
An application element defines the deployed application that contains objects that use parameters. An
application element can contain a mapping element that defines a mapping in the deployed application that
uses parameters. A mapping element contains a project element.
For example, you want the Data Integration Service to apply parameter values when you run mapping
"MyMapping" in deployed application "MyApp." You do not want to use the parameter values when you run
the mapping in any other application or when you run another mapping in project "MyProject." Define the
parameters within the following elements:




Param_value





Rules and Guidelines for Parameter Files
Certain rules and guidelines apply when you create parameter files.
Use the following rules when you create a parameter file:
•

You can reference mapping-level parameters in a parameter file. You cannot reference transformationlevel parameters.

•

Parameter values cannot be empty. For example, the Data Integration Service fails the mapping run if the
parameter file contains the following entry:

•

Within an element, artifact names are not case-sensitive. Therefore, the Data Integration Service interprets
 and  as the same application.

•

A parameter that identifies a reference table must use a forward-slash (/) to separate folder names in a
repository folder path.

 

Sample Parameter File
The following example shows a sample parameter file used to run mappings.

Parameter Files

63








MAP1_PARAM1_VAL
MAP1_PARAM2_VAL






MAP2_PARAM1_VAL
MAP2_PARAM2_VAL









MAP1_PARAM2_VAL







PROJ1_DS1_VAL
PROJ1_DS1_PARAM1_VAL





Project1\Folder1\RTM1



PROJ2_FOLD2_MPLT1_VAL




PROJ2_RULE1_VAL







Export a Parameter File
You can export a mapping parameter file or a workflow parameter file from the Developer tool. Define the
parameters in the Developer tool and then export them to a file. The Developer tool creates a parameter file
in .XML format.
You can export a parameter file that contains mapping parameters or workflow parameters. You can export
parameters from the mapping Parameters tab or from the workflow Parameters tab. The Developer tool
exports all the parameters from the Parameters tab.
To export a parameter file, perform the following steps:
1.

Define the parameters and the parameter defaults for a mapping or a workflow.

2.

On the Parameters tab of the mapping or workflow Properties, click the Export Parameter File option.

3.

Enter a name for the parameter file and browse for a location to put the file.

4.

Click Save.

The following image shows the Export Parameter File option on the Parameters tab for a workflow:

When you export a parameter file, the Developer tool creates a parameter file with either mapping parameters
or workflow parameters in it. The Developer tool does not export mapping and workflow parameters to the
same file.

Parameter Files

65

For example, when you export the workflow parameter, Workflow_Parm, the Developer tool creates the
following parameter file:

-
-
-
100




Creating a Parameter File from infacmd ms ListMappingParams
The infacmd ms ListMappingParams command lists the parameters for a mapping in a deployed application
and the default value for each parameter. Use the output of this command to create a parameter file.
1.

Run the infacmd ms ListMappingParams command to list the parameters for a mapping and the default
value for each parameter.
The -o argument sends command output to an XML file.
For example, the following command lists the parameters in mapping MyMapping in file
"MyOutputFile.xml":
infacmd ms ListMappingParams -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword
-a MyApplication -m MyMapping -o MyOutputFile.xml
The Data Integration Service lists all mapping parameters with their default values.

2.

If you did not specify the -o argument, you can copy the command output to an XML file and save the
file.

3.

Edit the XML file and replace the parameter default values with the values you want to use when you run
the mapping.

4.

Save the XML file.

Running a Mapping with a Parameter File
Use the infacmd ms RunMapping command to run a mapping with a parameter file. The -pf argument
specifies the parameter file name.
For example, the following command runs the mapping MyMapping using the parameter file
"MyParamFile.xml":
infacmd ms RunMapping -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword -a
MyApplication -m MyMapping -pf MyParamFile.xml
The Data Integration Service fails the mapping when you run it with a parameter file and the parameter file is
not valid. The Data Integration Service fails the mapping if it cannot find the parameter file or it cannot
access the parameter file.

How to Configure Parameters
Define parameters in a transformation, a mapping, a mapplet, or a workflow.
The following image shows the process to use parameters in a reusable transformation in a mapping:

66

Chapter 3: Mapping Parameters

1.

In a reusable transformation, create a parameter for a property in the transformation or for a variable in
the Expression Editor.

2.

Add the transformation to a mapping or to a mapplet.

3.

In the transformation Parameters tab, choose how to set the parameter value in the mapping or the
mapplet.
•

Expose the transformation parameter as a mapping parameter. Creates a duplicate of the
transformation parameter at the mapping level.

•

Bind the transformation parameter to a mapping parameter. Browse for a mapping parameter or
manually create a mapping parameter to bind to the transformation parameter.

•

Enter a specific parameter value. Enter a default value to use in the mapping run.

•

Use the default value of the transformation parameter. Use the original parameter value in the
mapping.

After you bind a mapping parameter to the transformation parameter, you can create parameter sets to
override the mapping parameter value at run time. Run the mapping from the command line and specify
which parameter set to use for that mapping run.

Creating a Parameter for a Transformation Property
When you assign a parameter to field or to a transformation property, you can browse for a parameter to use
or you can create a parameter specifically for the field.
1.

Navigate to the field or property that you want to update.

2.

Click the selection arrow in the Value column.
If you can parameterize the property, the Assign Parameter option appears.
The following image shows the Assign Parameter option for the Cache Directory:

How to Configure Parameters

67

3.

Click Assign Parameter.
The Assign Parameter dialog box appears. The dialog box shows the system parameters and the userdefined parameters that you created in the transformation.
The following image shows the Assign Parameter dialog box:

4.

To create a parameter, click New.

5.

Enter the parameter name, the type, the precision, and the default value.
The following image shows the parameter called MyCacheParameter in the Parameters dialog box:

68

Chapter 3: Mapping Parameters

6.

Click OK.
The parameter name appears in the transformation property.
The following image shows MyCacheParameter in the Aggregator transformation cache directory:

You can add, change, and delete parameters in the transformation Parameters tab.

Creating a Parameter in an Expression
You can reference the parameter in an Expression after you define the parameter. The following example
shows how to assign a parameter to a component in a filter expression.
1.

In the Filter transformation, click the Filter tab.
The Expression Editor appears. You can select functions, ports, and parameters to create the expression.

2.

Select Specify By Value to define the expression instead of using an expression parameter.

How to Configure Parameters

69

3.

On the Filter tab, click the Ports tab.

4.

Select the Amount port. On the Functions tab, select the greater than (>) function.
The following image shows the expression that contains the Amount port and the > operator:

5.

Click the Parameters tab in the Expression Editor.
The Expression Editor lists the system parameters and the user-defined parameters.

6.

Click Manage Parameters to add a parameter.
The Parameters dialog box appears.

7.

Click New.
A dialog box appears with default parameter values.

8.

Enter the parameter name, parameter type, precision, and default value.
The following image shows the Parameters dialog box:

70

Chapter 3: Mapping Parameters

9.

In the Expression Editor, click OK
The parameter that you created appears in the parameter list.

How to Configure Parameters

71

10.

Select the Min_Order_Amount parameter to add it to the expression.
The Min_Order_Amount parameter appears in the expression.

The parameter appears in the expression with a dollar sign ($) identifier. The Min_Order_Amount default
value is 50. If you add the transformation to a mapping without overriding the Min_Order_Parameter, the
Filter transformation returns rows where the Amount is greater than 50.

Expose Transformation Parameters as Mapping Parameters
After you add a transformation to a mapping, you can expose a transformation parameter as a mapping
parameter. When you expose a transformation parameter as a mapping parameter, you create a mapping
parameter that is a duplicate of the transformation parameter.
The instance value of a parameter is the parameter value to use in a specific mapping or mapplet. When you
expose a transformation parameter as a mapping parameter, you create a mapping parameter and bind it to a
transformation parameter in one step. You can set mapping parameter values at run time with a parameter
set or parameter file.
You can click Expose as Mapping Parameter one time for a transformation parameter. If you click Expose as
Mapping Parameter and the transformation parameter is already bound to a mapping parameter, the
Developer tool does not change the mapping parameter. The Developer tool does not create another mapping
parameter, and it does not update the mapping parameter default value. After you create a mapping

72

Chapter 3: Mapping Parameters

parameter, multiple objects might use the mapping parameter. If you need to change the mapping parameter
default value, change the value in the mapping or change it at run time.
1.

Open the mapping. Select the mapping in the Outline view.
The Parameters tab appears in the Properties view.
The following image shows the Parameters tab for a Filter transformation:

2.

To create a mapping parameter for the parameter, select the parameter and click Expose as Mapping
Parameter.
The Developer tool creates a mapping parameter with the same name and it binds it to the
transformation parameter.

3.

To update a mapping parameter, select the parameter from the Outline view.
You can change the default mapping parameter value. You can also add mapping parameters on the
mapping Parameters tab.

Setting the Parameter Instance Value
You can set the parameter instance value from the Instance Value column on the transformation Parameters
tab. Set the instance value in this column if you do not want to create a duplicate mapping parameter.
You can set a transformation parameter to a default value or you can bind an existing mapping parameter to
the transformation parameter.
1.

After you add a transformation to a mapping, click the Parameters tab on the Properties view of the
transformation.

2.

To bind a mapping parameter to a transformation parameter, perform the following steps:
a.

Click the Instance Value column for the transformation parameter.
The Specify By dialog box appears.

b.

Click Specify By Parameter.

c.

On the Assign Parameter dialog box, browse for and select a mapping parameter or a systemdefined parameter to bind to the transformation parameter.

d.

Click OK.

How to Configure Parameters

73

The mapping parameter name appears as the parameter value in the Specify By dialog box.
e.

Click OK in the Specify By dialog box.
The mapping parameter name appears in the Instance Value column.

3.

To set a default value for the transformation parameter instance, use the following steps:
a.

Click the Instance Value column for the transformation parameter.
The Specify By dialog box appears.

b.

To enter a default value, click Specify By Value and enter a default value for the instance.

c.

To use the transformation parameter default value, click Use Default.

Creating a Parameter Set
Create a parameter set that you can use to change the runtime context for mappings and workflows.
When you create the parameter set, choose a mapping or workflow to contain the parameters. After you
choose a mapping or workflow, you can manually enter parameters in the parameter set or you can select
parameters.

74

1.

In the Object Explorer view, right-click a project and click New > Parameter Set.

2.

Enter a name for the parameter set and click Finish.

3.

Drag the Properties panel down and view the grid to add the parameters to the parameter set.

4.

Click New > Mapping/Workflow.

Chapter 3: Mapping Parameters

5.

In the Add Parameters dialog box click Browse to find the mapping or workflow that contains the
parameters you need to include in the set.
A list of mappings and workflows appears.

6.

Select a mapping or a workflow and click OK.
A list of parameters from the mapping or workflow appears.

7.

Select the parameters to include in the parameter set and then click OK.
The mapping or the workflow name and the path appears in the parameter set. Each parameter that you
selected appears beneath the object.

How to Configure Parameters

75

8.

To add a parameter that is not yet in a workflow or mapping, right-click a mapping or object name and
select Parameter insert.
The Developer tool creates a parameter beneath the workflow or mapping. Change the parameter name,
the value, and the type.
Note: You must add the parameter to the mapping or workflow before you use the parameter set.

76

Chapter 3: Mapping Parameters

Chapter 4

Mapping Outputs
This chapter includes the following topics:
•

Mapping Outputs Overview, 77

•

User-Defined Mapping Outputs, 78

•

System-Defined Mapping Outputs, 81

•

Persisted Mapping Outputs, 82

•

Bind Mapping Outputs to Workflow Variables, 84

•

Mapping Outputs In Mapplets, 85

•

Mapping Outputs in Logical Data Objects, 88

•

How to Configure Mapping Outputs, 88

•

How to Bind Mapplet Outputs to Mapping Outputs, 97

Mapping Outputs Overview
A mapping can return mapping outputs. A mapping output is a single value that is the result of aggregating a
field or expression from each row that the mapping processes.
A mapping output returns a value that provides information about the mapping run. For example, a mapping
output can return the number of error rows that the mapping found. A mapping output can return the latest
order date that the mapping processed and the total amount of all the orders.
Transformations do not receive the mapping output values. The mapping returns each mapping value when
the mapping completes. You can pass mapping outputs to other tasks in the workflow. You can save the
values to use as input parameters the next time a mapping runs. You can define multiple mapping outputs in
the same mapping.
A mapping can return user-defined mapping outputs or system-defined mapping outputs.
User-defined mapping outputs
A user-defined mapping output is a numeric value or date that a mapping returns by aggregating a field
or expression from each row in the mapping. For example, you might need to know when orders reach a
specific threshold. You can configure the mapping to return the total order amount that the mapping
processed. Define a mapping output called TotalOrderAmt and configure the mapping to summarize the
Order_Amount field from every row. Define an expression or port name to aggregate in an Expression
transformation.

77

System-defined mapping outputs
A system-defined mapping output is a built-in value that the mapping always returns whenever the
mapping completes. The mapping returns the number of source rows, the number of target rows, and the
number of error rows that the mapping processes. You might pass these values in workflow variables to
another task in a workflow, such as a Notification task or an Exclusive Gateway task. You do not have to
define a system-defined mapping output.
Perform the following tasks with mapping outputs:
Save the mapping output in the repository
You can configure a Mapping task to persist a mapping output value in the repository. You can assign a
persisted mapping output value to a Mapping task input. For example, you can configure the mapping to
return the latest sequence number that it generated. Persist a Last_Seq_Num mapping output in the
repository. The next time the mapping runs, you can use Last_Seq_Num as an the starting sequence
number.
Bind outputs to workflow variables
You can bind mapping outputs to workflow variables and then pass the values to other tasks in a
workflow. You can bind mapping outputs from the current Mapping task run to workflow variables. You
can also bind persisted mapping outputs from a previous Mapping task run to workflow variables in the
current run.

User-Defined Mapping Outputs
A user-defined mapping output is a numeric value or a date that a mapping returns by aggregating a field or
expression from each row in the mapping. Define the expression to aggregate and the data type of the result
value.
Define the mapping output in the Outputs tab of the Properties view. Configure a name for the mapping
output, a data type for the result, and indicate what type of aggregation to perform to return a result.
After you define the mapping output name and the output type, configure an Expression transformation in the
mapping. In the Expression transformation, define the output expression that you want to aggregate. The
expression can contain a port name or it can contain an expression with ports, functions, and parameters.

Outputs View
Define the mapping outputs in the Outputs view of the mapping Properties. When you define each mapping
output, enter a mapping output name, the mapping output type, and the type of aggregation to perform.
The following image shows the mapping outputs in the Outputs tab of the mapping Properties view:

78

Chapter 4: Mapping Outputs

The Outputs view contains the following fields:
Name
The name of the output. Default is Output.
Type
The type of the mapping output. You can select a numeric type or a date/time type. Default is Integer.
Precision
The length of the mapping output field. Default is 10.
Scale
The digits to the right of the decimal in the mapping output field. Default is zero.
Aggregation Type
You can choose one of the following types of aggregation:
SUM
Returns the sum of the field or expression from each input row that the Data Integration Service
processed.
MIN
Returns the smallest numeric value or date that the Data Integration Service processed from a
specific field or expression in each input row.
MAX
Returns the largest numeric value or date that the Data Integration Service processed from a
specific field or expression in each input row.
Binding
The name of a mapplet or logical data object output to bind to the mapping output. This field is blank
unless the mapping output is returned from a mapplet instead of from an Expression transformation in
the mapping.
Description
The description of the mapping output.

User-Defined Mapping Outputs

79

Mapping Output Expression
Configure a mapping output expression in the Mapping Outputs view of an Expression transformation. The
mapping output expression is a field or an expression to aggregate from the rows that the Expression
transformation receives.
Configure an Expression transformation in the mapping and include the output expressions that you want to
aggregate. The location of the Expression transformation in the pipeline might affect the mapping output
results depending on whether the mapping contains filters or active transformations. You can add more than
one Expression transformation to the mapping if you need to aggregate rows in different pipelines.
The following image shows the expressions in the Mapping Outputs view of the Expression transformation:

The Mapping Outputs view has the following fields:
Outputs Name
The name of a mapping output that you created at the mapping level. You must create the mapping
output at the mapping level first. When you add the mapping output in the Expression transformation,
you select the output name from a list of the outputs that you have already created.
Expression
The expression to aggregate for each row in the mapping. Enter a port name or enter an expression in
the Expression Editor. The expression result must be numeric or a date. You can use parameters in the
expression. The Data Integration Service applies the expression for each row that the Expression
transformation receives. Each mapping output returns one value when the mapping completes.
Note: You do not specify the type of aggregation to perform in the Expression transformation. You indicate
the field or expression that the mapping aggregates as it processes each row.
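For example, assuming a hypothetical OrderAmt port and a Discount_Parm mapping parameter, the expression that the mapping aggregates for each row might be:
OrderAmt * (1 - Discount_Parm)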


System-Defined Mapping Outputs
System-defined mapping outputs are mapping outputs that each mapping generates. You do not have to
configure the aggregation for system-defined mapping outputs. You can pass system-defined mapping
outputs to workflow variables.
A mapping returns the following types of system-defined mapping outputs:
numberOfTargetRows
The number of rows that the mapping wrote to the target.
numberOfSourceRows
The number of rows that the mapping read from the source.
numberOfErrorRows
The number of error rows that the mapping generated.
On the Output tab of the Mapping task Properties view, configure the workflow variables to which you assign the system-defined mapping outputs.
The following image shows the system-defined mapping outputs on the Output tab:


Persisted Mapping Outputs
You can save the mapping outputs in the repository if you run the mapping in a workflow. You can use a
mapping output in a subsequent run of the same Mapping task. You can also assign persisted mapping
outputs from the previous Mapping task run to workflow variables for the current Mapping task run.
Persist a mapping output in a Mapping task. A Mapping task is an instance of the mapping in a workflow with
the mapping configuration and parameter bindings. For more information about Mapping tasks, see the
Informatica Developer Workflow Guide.
When you click the Persistence tab in the Mapping task Properties view, the Developer tool displays all the
mapping outputs for the mapping. To persist any mapping output, enable Persist for the mapping output and
select the type of aggregation to perform to return a persisted value.
When the Data Integration Service persists a mapping output in the Model repository, the Data Integration Service saves the mapping output with the Mapping task name as a key. For example, if a workflow contains four Mapping tasks, each running the same mapping, the Data Integration Service saves four outputs in the Model repository.
When you persist a mapping output, you can configure a different aggregate function for the persisted value than the aggregate function you defined at the mapping level. In that case, the Data Integration Service generates more than one value for the mapping output. For example, the OrderDate mapping output might contain the MIN OrderDate, while the persisted OrderDate mapping output contains the MAX OrderDate.
You can bind the mapping output from a Mapping task to an input parameter of the Mapping task the next time it runs. This is called feedback binding: you configure the results from one mapping run as input to the same mapping the next time it runs. You must persist the mapping output in a Mapping task to use it for feedback binding.
An example of feedback binding is to persist the latest order date that the mapping processes. The next time
the Mapping task runs, the input parameter to the mapping is the last date processed. The mapping can filter
the parameter source rows to include the rows with an order date greater than the last order date processed.
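For example, the filter expression might look like the following, where Last_Order_Date_Parm is the mapping parameter that receives the persisted output value:
Order_Date > Last_Order_Date_Parm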
The following image shows the Persistence tab on the Properties view of the Mapping task:

The Persistence tab has the following fields:
User-Defined Output
The name of a mapping output that the mapping returns.


Persist
Enables the Data Integration Service to persist the mapping output in the repository.
Aggregate Function
The type of aggregation to perform on the mapping output to persist. Select MIN, MAX, or SUM. The
default is the value from the mapping output that you define in the mapping properties. You can change
the aggregate function type of the persisted mapping output. You can persist a different value in the
repository than the mapping output value that you pass to workflow variables.
Description
Describes the mapping output to persist in the repository.

Persisted Values Maintenance
You can list, update, and reset the persisted mapping outputs in the repository.
You can run the following infacmd commands for persisted mapping task values:
listMappingPersistedOutputs
Lists the persisted mapping outputs and their values for a Mapping task instance in a workflow.
setMappingPersistedOutputs
Updates or resets the persisted mapping outputs for a specific Mapping task instance in a workflow.
When you reset the values, you remove the persisted values from the repository. To set mapping outputs, enter space-separated name-value pairs of mapping outputs on the command line. To reset mapping outputs, use the -reset option with a space-separated list of mapping outputs.
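For example, the commands might be invoked as follows. This is a sketch only: the wfs plugin name and the connection options are assumptions based on the standard infacmd pattern, and the remaining required options, such as the application and Mapping task instance, are elided:
infacmd wfs listMappingPersistedOutputs -dn MyDomain -un Administrator -pd MyPassword ...
infacmd wfs setMappingPersistedOutputs -dn MyDomain -un Administrator -pd MyPassword ... Last_Order_Date=2017-01-01
infacmd wfs setMappingPersistedOutputs -dn MyDomain -un Administrator -pd MyPassword ... -reset Last_Order_Date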
For more information about the infacmd commands, see the Informatica Command Reference.

Persisted Mapping Outputs and Deployment
When you redeploy a workflow or you change a mapping output, you can affect the state of persisted
mapping outputs.
Consider the following rules and guidelines for persisted mapping outputs:
• When you deploy a workflow as an application for the first time, you do not have to perform any additional tasks if a Mapping task has persisted mapping outputs.
• When you redeploy an application, you can choose whether to retain the state information or to discard it. If you choose to retain the state information, the mapping output values do not change in the repository when you redeploy the application. Otherwise, the mapping outputs state is removed from persistence.
• The mapping outputs state is not backed up when you back up and restore a mapping or workflow.
• If you rename or recreate a mapping output, you cannot use the persisted mapping output value from a previous workflow run.


Bind Mapping Outputs to Workflow Variables
After you add a mapping to a workflow, you can bind mapping outputs to workflow variables. You can pass
the values to other tasks in the workflow.
For example, you might want the Data Integration Service to evaluate the mapping output value and then
determine which object to run next. Or, you might want the Data Integration Service to use the mapping
output value in a field in the next task.
To use the mapping output in another task, bind the mapping output to a workflow variable in the Mapping
task Output view.
Note: If you assign a mapping output to a workflow variable and the mapping processes no rows, the output is NULL. The Mapping task does not change the value of the workflow variable. The variable retains the value that it had before the Mapping task ran.
The following image shows the Mapping task Output view.

The Output column contains the following types of mapping outputs:
System-Defined Mapping Outputs
Built-in mapping outputs that the transformations return to the mapping. The system-defined mapping
outputs contain the number of source rows, the number of target rows, and the number of error rows the
mapping processed.
User-Defined Mapping Outputs
You can bind persisted mapping output values and current mapping output values to workflow variables.


Persisted Values
The user-defined mapping output values from the previous workflow run. The persisted value is a
value that is in the repository from the last time the Mapping task ran. The persisted value is not the
value that the current mapping aggregates.
Current Values
The user-defined mapping output values from the current Mapping task.
For more information about workflow variables, see the Informatica Developer Workflow Guide.

Mapping Outputs In Mapplets
You can configure a mapplet to return mapping outputs. You can bind the mapping outputs from a mapplet to
the mapping outputs at the mapping level.
When you include a mapplet in a mapping, the mapplet computes the value of the outputs and passes the
output values to the mapping. You can bind more than one output from a mapplet to the same output at the
mapping level. You can also bind system-defined outputs from a mapplet to the mapping outputs. The
mapplet outputs and the mapping outputs must be the same type.
For example, a mapplet might return the maximum value of a Salary port, a Bonus port, and a Commission
port in three mapping outputs.
The following image shows the Out_Salary, Out_Bonus, and Out_Commission mapping outputs in the Outputs
view:

The Outputs view contains the following fields:
Name
The name of the output. Default is Output.


Type
The type of the mapping output. You can select a numeric type or a date/time type. Default is Integer.
Precision
The length of the mapping output field.
Scale
The digits to the right of the decimal in the mapping output field.
Aggregation Type
The type of aggregation to perform on the output expression. Choose SUM, MIN, or MAX. Default is SUM.
Binding
The name of an output from another mapplet to bind to this mapping output. The Binding field is blank
unless the mapplet contains another mapplet that is returning the mapping output.
Description
The description of the mapping output.
For each mapping output in the mapplet, create an associated output expression in the Expression
transformation. Each expression identifies the fields to aggregate.
The following image shows the mapping output expressions in the Expression transformation:

For this example, the Expression transformation aggregates the Salary and Commission port values. The
Out_Bonus mapping output is an expression that includes the Bonus port value plus 10% of the Bonus.
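Assuming the port is named Bonus, the Out_Bonus expression might be written as:
Bonus + (Bonus * 0.1)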

Bind Mapplet Outputs to Mapping Outputs
If a mapplet computes mapping outputs, you need to pass the output values from the mapplet to a mapping.
Bind the mapplet outputs to the mapping outputs on the mapping Outputs view.
The following image shows the Max_Compensation mapping output at the mapping level:


At the mapping level, you can bind the Salary mapplet output, the Bonus mapplet output, and the Commission
mapplet output to the same mapping output called Max_Compensation.
To bind a mapplet output to a mapping output, click the Binding column for the mapping output. A list of
available mapplet outputs appears. The list contains mapplet outputs that are the same type and aggregation
as the mapping output. Select the mapplet outputs to assign to the mapping output.
The following image shows the Outputs Binding dialog box:

After you select the outputs to bind to Max_Compensation, the Binding field contains the following text:
Mplt_Compensation.Out_Salary,Mplt_Compensation.Out_Bonus,Mplt_Compensation.Out_Commission
The following image shows the mapping outputs in the Binding field:


Set the mapping output aggregation type to MAX. The Data Integration Service returns the highest compensation value that it finds from the Salary, the Bonus, or the Commission ports.

Mapping Outputs in Logical Data Objects
A logical data object can contain a Read or Write mapping. You can configure these mappings to return
mapping outputs. You can bind the mapping outputs from the logical data object to the mapping outputs in
the mapping.
When you include the logical data object in a mapping, the Read or Write mapping computes the value of the
mapping outputs. The logical data object passes the output values to the mapping. You can bind more than
one output from the logical data object mapping to the same output at the mapping level. You can also bind
system-defined outputs from the logical data object to the mapping outputs. The logical data object mapping
outputs and the mapping outputs must be the same type.

How to Configure Mapping Outputs
When you configure mapping outputs, define the mapping outputs at the mapping level, configure the
expressions to aggregate at the transformation level, and persist the results at the Mapping task level.
The following image shows the process to configure mapping outputs:


To configure mapping outputs, perform the following steps:
1. Create the mapping.
2. In the Outputs view of the mapping, define the mapping output name and type of aggregation.
3. Add an Expression transformation to the mapping and configure the mapping output expression in the Expression Mapping Outputs view.
4. To create a Mapping task, add the mapping to a workflow.
5. Persist the mapping output in the Mapping task Persistence view and configure the aggregation function type for the persisted value.
6. Assign the persisted mapping output to an input parameter in the Mapping task.
7. If you want to use the mapping output in another workflow task, assign the mapping output to a workflow variable.

Creating a Mapping
Create a mapping that contains a reusable Filter transformation. The Filter transformation filters rows that
have order dates less than a specific date. The filter expression includes a parameter called
Last_Order_Date_Parm.
1. Create a mapping to process order data from a Customer_Order file.
2. In the mapping Properties view, click the Parameters tab.
3. Add a date/time mapping parameter called Last_Order_Date_Parm.

Enter a default date for the starting parameter.
The following image shows the mapping parameter:

4. Create a reusable Filter transformation to filter Customer_Order rows.
5. Define a parameter in the Filter transformation called Order_Filter.
Enter a default date for the starting parameter.
6. Add a filter expression to find order dates that are greater than the parameter:
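For example, assuming the order date port is named Order_Date, the filter expression might be:
Order_Date > Order_Filter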

7. Add the Filter transformation to the mapping.
8. Click the Filter transformation to display the transformation Properties view.
9. Click the Parameters tab.
10. To bind the Order_Filter transformation parameter to the Last_Order_Date_Parm mapping parameter, click the Instance Value column for Order_Filter.
11. Select Last_Order_Date_Parm.
The following image shows where the mapping parameter is bound to the transformation parameter:

Defining Mapping Outputs
Create a mapping and define the mapping outputs in the mapping Properties. Each mapping output definition
describes what type of aggregation to perform and the data type of the results.
1. After you create a mapping, click the editor to access the mapping Properties.
2. Click the Outputs view.
3. Click New to create a mapping output.
The Developer tool creates a mapping output with default field values.
The following image shows the mapping output default values in the Outputs view:

4. Change the name that identifies the mapping output.
5. Select a numeric or date mapping output type. Enter the precision and scale.
6. Choose the aggregation type for the mapping output.
You can summarize the output expression or you can find the minimum or maximum expression value that the mapping processed. Default is SUM.
7. Click File > Save to save the mapping output.
You must save the mapping output before you can create a mapping output expression in the Expression transformation.

The following image shows a mapping output that contains the sum of a decimal field and a mapping output
that contains a maximum date value:


Configuring the Mapping Output Expression
In the Expression transformation, configure the expression to aggregate for each row that the mapping
processes.
1. Add an Expression transformation to the mapping.
Consider the mapping logic before you decide where to place the transformation. The mapping output contains an aggregation of the rows that the Expression transformation receives.
2. In the Expression transformation, click the Mapping Outputs view.
3. Click New to add a mapping output expression.
The Developer tool creates a mapping output with an output name that matches one of the mapping outputs you created at the mapping level. If you have more than one mapping output in the mapping Properties, select the appropriate mapping output name to use.
The following image shows the Mapping Outputs view in the Expression transformation:

4. Click the Expression column to enter an expression in the Expression Editor.
The expression can contain just a port name or it can contain functions, ports, and parameters.
The following image shows an expression to calculate the Total_OrderAmt in the Expression Editor:

5. Click Validate to verify that the expression is valid.
6. Click OK to save the expression.
The expression appears in the Expression column for the mapping output.
7. Click File > Save to save the Expression transformation.

Persisting Mapping Outputs
After you add the mapping to a workflow, you can persist mapping outputs from the Mapping task. You can
use persisted mapping outputs as input to the Mapping task the next time it runs.
1. Add the mapping to a workflow to create a Mapping task.
2. Click the Mapping task icon in the workflow to view the Mapping task Properties.
3. Click the Persistence view.
A list of the user-defined mapping outputs appears in the Persistence view.
4. Enable Persist to save the mapping output after the Mapping task runs.
5. Optionally, change the aggregation type and enter the description.
The following image shows the Persistence view for a Mapping task:

The Last_Order_Date mapping output is persisted. The aggregate function is MAX, so the Data Integration Service saves the maximum order date value in the repository.

Assigning Persisted Outputs to Mapping Task Input
You can bind persisted mapping outputs from a Mapping task to the input parameters of the same Mapping
task for the next time the workflow runs.
Assign the persisted latest order date value from the Mapping task as the input parameter to the same Mapping task. Configure a Filter transformation that uses a Last_Order_Date_Parm parameter to select which orders to process. The filter expression to select input rows is Order_Date > Last_Order_Date_Parm.
1. Click the Mapping task icon in the workflow to view the Mapping task Properties view.
A list of the Mapping task input parameters and a list of the parameterized Mapping task configuration properties appears. The mapping must have a mapping parameter to assign the mapping output to.
2. Locate the mapping input parameter that you want to bind the mapping output to. Double-click the Value column to view the selection arrow.
3. Click the selection arrow to view a list of the parameters and variables that you can assign to the input parameter.
4. Scroll to the User Outputs section of the list and choose the persisted mapping output to use.
The following image shows the Last_Order_Date_Parm mapping parameter on the Mapping task Input view:

5. Select the mapping output to assign to the parameter.
The mapping output name appears in the Value column for the input parameter.
6. Click File > Save to save the Mapping task.
The Last_Order_Date_Parm parameter is bound to the persisted order date value from the repository.

Binding Mapping Outputs to Workflow Variables
You can bind mapping outputs to workflow variables and pass the values to other tasks in the workflow.
To pass the mapping output value to another task, bind the mapping output to a workflow variable in the
Mapping task Output view. You can bind mapping outputs from the current Mapping task or you can bind the
persisted mapping outputs from the previous Mapping task run.
1. Add the mapping with the mapping outputs to a workflow.
2. Click the Mapping task icon in the workflow to view the Mapping task Properties.
3. In the Mapping task Properties, click the Output view.
The Mapping task Output view shows the data that you can pass from the task into workflow variables.
4. Find the mapping output that you want to bind to a variable.
5. Double-click the Variable column to access the selection arrow and view a list of workflow variables.
The following image shows where to bind the Total_Order_Amt mapping output to the wf_Variable_Total_OrderAmt workflow variable in the Mapping task Output view:

6. To create a workflow variable, click the New Variable option from the list of workflow variables in the Value column.
Enter the variable name, type, and default value.

How to Bind Mapplet Outputs to Mapping Outputs
You can configure a mapplet to return mapping outputs. You can bind the mapping outputs from the mapplet
to the mapping outputs at the mapping level.
When you include a mapplet in a mapping, the mapplet computes the value of the outputs and passes the
output values to the mapping. You can bind more than one output from a mapplet to the same output at the
mapping level. You can also bind system-defined outputs from a mapplet to the mapping outputs.
The following image shows the process to configure mapplet outputs and bind them to mapping outputs:


To bind outputs from mapplets to mapping outputs, perform the following steps:
1. Create the mapplet.
2. Define the mapplet output name and the type of aggregation in the Outputs view of the mapplet.
3. Add an Expression transformation to the mapplet and configure the mapping output expression in the Expression Mapping Outputs view.
4. Add the mapplet to a mapping.
5. Create mapping outputs in the mapping.
6. Bind the outputs from the mapplet to the mapping outputs.

Defining Mapplet Outputs
Create a mapplet and define the mapping outputs in the Outputs tab of the mapplet Properties view. Each
mapping output definition describes what type of aggregation to perform and the data type of the results.
1. After you create a mapplet, click inside the mapping canvas to access the mapplet properties.
2. Click the Outputs view.
3. Click New to create a mapping output.
The Developer tool creates a mapping output with default field values.
4. Change the name that identifies the mapping output.
5. Select a numeric or date mapping output type. If you are creating a numeric type, enter the precision and scale.
6. Choose the aggregation type for the mapping output.
You can summarize the output expression or you can find the minimum or maximum expression value that the mapping processed. Default is SUM.
The following image shows a mapplet output called Total_Sales_Tax with an aggregation type SUM:

7. Click File > Save to save the mapping output.
You must save the mapping output before you can create any mapping output expression in an Expression transformation.

Configuring a Mapping Output Expression in a Mapplet
Configure the expression to aggregate for each row that the mapplet processes.
1. Add an Expression transformation to the mapplet.
Consider the mapplet logic before you decide where to place the transformation.
2. In the Expression transformation, click the Mapping Outputs view.
3. Click New to add an output expression.
The Developer tool creates a mapping output with an output name that matches one of the mapping outputs you created at the mapplet level. You might have more than one output to choose from.
4. Enter an expression with the Expression Editor.
The expression can contain a port name or it can contain functions, ports, and parameters.
5. Click Validate to verify that the expression is valid.
6. Click OK to save the expression.
The following image shows the Mapping Outputs view with a mapping output expression that calculates a sales tax:

7. Click File > Save to save the Expression transformation.

Binding Outputs from a Mapplet to Mapping Outputs
When you include the mapplet in a mapping, you can bind the outputs from the mapplet to the mapping
outputs that you define at the mapping level.
1. Define a mapping and add the mapplet to the mapping.
2. Click the mapping canvas to view the mapping Properties.
3. Click the Outputs view.
4. Click New to create a mapping output.
The Developer tool creates a mapping output with default field values.
5. Change the mapping output type, the aggregation type, the precision, and the scale to match fields from the mapplet that you want to bind to.
6. Optionally, change the name and enter a description.
7. Click the selection arrow in the Binding field to view a list of outputs.
The following image shows the Outputs Binding dialog box:

8. Select the mapplet output to bind to the mapping output.
You can select more than one mapplet output to bind to the same mapping output.
9. Click OK.
The mapplet outputs that you select appear in the Binding field.
The following image shows the mapplet output name in the Binding field of the mapping output:


Chapter 5

Generate a Mapping from an SQL Query
This chapter includes the following topics:
• Generate a Mapping from an SQL Query Overview
• Example of Generated Mapping from an SQL Query
• SQL Syntax to Generate a Mapping
• Function Support in Queries that Generate a Mapping
• Generating a Mapping or Logical Data Object from an SQL Query
• Generate a Mapping from an SQL Statement

Generate a Mapping from an SQL Query Overview
You can generate a mapping from an SQL query in the Developer tool. To generate a mapping, you can enter
an SQL query or you can load a text file that contains the query. Optionally, you can define the source of the
query table. The Developer tool validates the SQL query and generates a mapping.
You can also generate a logical data object from an SQL query that contains only SELECT statements.

Example of Generated Mapping from an SQL Query
You have a table of employees, and you want a list of employee salaries for employees that were hired after
January 1, 2001.
You create the following SQL statement:
SELECT LastName, Salary from emp1 where HireDate > 01/01/2001
The following image shows the mapping created from the SQL statement:


SQL Syntax to Generate a Mapping
You can use an ANSI-compliant SQL statement to generate a mapping in the Developer tool.
The Developer tool can generate a mapping from a standard SELECT query. For example:
SELECT column_list FROM table-name
[WHERE clause]
[GROUP BY clause]
[HAVING clause]
[ORDER BY clause]
If the SELECT SQL statement contains a correlated subquery, the query is valid if it is possible to flatten or
rewrite the query as a single standard query.
ANSI SQL does not support some datatypes. For example, if the query requests results from a data source
where one of the columns has the type timeStampTZ, the SQL is not valid.

Correlated Subqueries
A correlated subquery is a subquery that uses values from the outer query in its WHERE clause. The Data
Integration Service flattens the correlated subqueries before it runs the query.
The following table shows the results of a correlated subquery that the Data Integration Service flattened:

Type            Query
Non-flattened   SELECT huge.* FROM huge WHERE c1 IN (SELECT c1 FROM tiny)
Flattened       SELECT huge.* FROM huge, tiny WHERE huge.c1 = tiny.c1

The Data Integration Service can flatten a correlated subquery when it meets the following requirements:
• The type is IN or a quantified comparison.
• It is not within an OR operator or part of a SELECT list.
• It does not contain the LIMIT keyword.
• It does not contain a GROUP BY clause, aggregates in a SELECT list, or an EXISTS or NOT IN logical operator.
• It generates unique results. One column in the correlated subquery is a primary key. For example, if the r_regionkey column is a primary key for the vs.region virtual table, you can issue the following query:
SELECT * FROM vs.nation WHERE n_regionkey IN (SELECT b.r_regionkey FROM vs.region b WHERE b.r_regionkey = n_regionkey)
• If it contains a FROM list, each table in the FROM list is a virtual table in the SQL data service.

Function Support in Queries that Generate a Mapping
Informatica supports functions that meet the ANSI SQL-92 standard.
In addition, some functions have specific syntax requirements.
The following table lists the functions and supported syntax:

DATE( )
To specify the format of a date:
DATE(format '<format_string>')
where <format_string> is a standard date format.
Example:
SELECT DATE(format 'dd-mm-yyyy') from table

POSITION( )
To determine the position of a substring in a literal string:
POSITION('<substring>', '<literal_string>')
Example:
POSITION('MA', 'James Martin')
To determine the position of a substring in a table column:
POSITION('<substring>', <column_name>)
Example:
POSITION('MA', FULL_NAME)

Generate a Mapping from an SQL Query with an Unsupported Function
When the Developer tool generates a mapping from SQL, it validates the functions in the query. Use ANSI-compliant SQL to ensure valid mapping generation.
If the Developer tool encounters an unknown function in a valid SQL statement, it might generate a mapping
that contains a transformation labeled FIX_ME or an expression labeled FIX_EXPR. Edit these objects to fix
the mapping and get valid results. Unknown functions appear as a warning message in the mapping log file.
For example, you use the following SQL statement to generate a mapping:
SELECT unknownFunctionABC(c_custkey,c_comment) from customer
The following image shows how the mapping generated from this SQL statement includes an Expression
transformation that requires fixing:


Notice that the Expression transformation is marked with an error icon. Use the Ports tab to edit the
erroneous expression. The mapping is not valid until you correct the error.

INSERT, UPDATE and DELETE Syntax
Use the following syntax to create valid INSERT, UPDATE and DELETE statements:
• Use the following syntax for an INSERT statement:
INSERT INTO <table_name> [<column_list>]