Quick Fatigue Tool User Guide


QUICK FATIGUE TOOL FOR MATLAB®
Multiaxial Fatigue Analysis Code for Finite Element Models
User Guide
© Louis Vallance 2018
Version Information
Documentation revision: 60 [17/01/2018]
Concurrent code release: 6.11-10
Acknowledgements
Quick Fatigue Tool is a free, independent multiaxial fatigue analysis project. The author would like to
acknowledge the following people for making this work possible:
Dr.-Ing. Anders Winkler, SPE
Senior Technical Specialist, SIMULIA Nordics, Sweden
Technical advice and collaboration; fatigue materials data

Giovanni Morais Teixeira
Durability Technology Senior Manager, SIMULIA UK, United Kingdom
Technical advice and collaboration

Eli Billauer
Freelance Electrical Engineer, Israel
Providing the code for the peak-valley detection algorithm

Adam Nieslony
Professor of Mechanical Engineering, Opole University of Technology, Poland
Providing the code for the alternative peak-picking method

Joni Keski-Rahkonen
Senior R&D Engineer, Rolls-Royce Oy Ab, Finland
Providing assistance with the critical plane code

Bruno Luong
Providing the code for Cardan's formula, which computes eigenvalues for multidimensional tensor arrays
Contents
1. Introduction .................................................................................................................................... 7
1.1 Overview ................................................................................................................................. 7
1.2 The stress-life methodology ................................................................................................... 8
1.3 The strain-life methodology .................................................................................................... 8
1.4 Why fatigue from FEA? ........................................................................................................... 9
1.5 Overview of syntax ................................................................................................................ 12
1.6 Required toolboxes ............................................................................................................... 13
1.7 Limitations............................................................................................................................. 14
1.8 Additional notes .................................................................................................................... 16
2. Getting started .............................................................................................................................. 17
2.1 Preparing the application ...................................................................................................... 17
2.2 How the application handles variables ................................................................................. 17
2.3 File structure ......................................................................................................................... 18
2.4 Configuring and running an analysis ..................................................................................... 19
2.5 The analysis method ............................................................................................................. 28
3. Defining fatigue loadings .............................................................................................................. 29
3.1 Loading methods ................................................................................................................... 29
3.2 Creating a stress dataset file ................................................................................................. 35
3.3 Creating a load history .......................................................................................................... 39
3.4 Load modulation ................................................................................................................... 42
3.5 High frequency loadings ....................................................................................................... 43
3.6 The dataset processor ........................................................................................................... 47
4. Analysis techniques ....................................................................................................................... 50
4.1 Background ........................................................................................................................... 50
4.2 Treatment of nonlinearity ..................................................................................................... 50
4.3 Surface finish and notch sensitivity ...................................................................................... 52
4.4 In-plane residual stress ......................................................................................................... 65
4.5 Analysis speed control .......................................................................................................... 66
4.6 Analysis groups ..................................................................................................................... 78
4.7 S-N knock-down factors ........................................................................................................ 87
4.8 Analysis continuation techniques ......................................................................................... 92
4.9 Virtual strain gauges ............................................................................................................. 97
5. Materials ..................................................................................................................................... 101
5.1 Background ......................................................................................................................... 101
5.2 Material databases ............................................................................................................. 102
5.3 Using material data for analysis .......................................................................................... 103
5.4 Creating materials using the Material Manager ................................................................. 104
5.5 Creating materials from a text file ...................................................................................... 105
5.6 General material properties ............................................................................................... 109
5.7 Fatigue material properties ................................................................................................ 110
5.8 Composite material properties ........................................................................................... 114
5.9 Estimation techniques ........................................................................................................ 118
6. Analysis algorithms ..................................................................................................................... 120
6.1 Background ......................................................................................................................... 120
6.2 Stress-based Brown-Miller .................................................................................................. 121
6.3 Normal Stress ...................................................................................................................... 125
6.4 Findley’s Method ................................................................................................................ 127
6.5 Stress Invariant Parameter ................................................................................................. 134
6.6 BS 7608 Fatigue of Welded Steel Joints .............................................................................. 138
6.7 NASALIFE ............................................................................................................................. 145
6.8 Uniaxial Stress-Life .............................................................................................................. 151
6.9 Uniaxial Strain-Life .............................................................................................................. 152
6.10 User-defined algorithms ..................................................................................................... 153
7. Mean stress corrections .............................................................................................................. 157
7.1 Background ......................................................................................................................... 157
7.2 Goodman ............................................................................................................................ 159
7.3 Soderberg ............................................................................................................................ 162
7.4 Gerber ................................................................................................................................. 163
7.5 Morrow ............................................................................................................................... 164
7.6 Smith-Watson-Topper......................................................................................................... 165
7.7 Walker ................................................................................................................................. 166
7.8 R-ratio S-N curves................................................................................................................ 169
7.9 User-defined mean stress corrections ................................................................................ 171
8. Safety factor analysis .................................................................................................................. 174
8.1 Background ......................................................................................................................... 174
8.2 Fatigue Reserve Factor ........................................................................................................ 174
8.3 Factor of Strength ............................................................................................................... 184
9. Job and environment files ........................................................................................................... 193
10. Output ..................................................................................................................................... 194
10.1 Background ......................................................................................................................... 194
10.2 Output variables.................................................................................................................. 196
10.3 Viewing output .................................................................................................................... 204
10.4 The ODB Interface ............................................................................................................... 206
11. FEA Modelling techniques ...................................................................................................... 220
11.1 Background ......................................................................................................................... 220
11.2 Preparing an FE model for fatigue analysis ......................................................................... 220
12. Supplementary analysis procedures ....................................................................................... 225
12.1 Background ......................................................................................................................... 225
12.2 Yield criteria ........................................................................................................................ 225
12.3 Composite failure criteria ................................................................................................... 227
13. Tutorial A: Analysis of a welded plate with Abaqus ................................................................ 242
13.1 Background ......................................................................................................................... 242
13.2 Preparing the RPT file ......................................................................................................... 243
13.3 Running the analysis ........................................................................................................... 243
13.4 Post processing the results ................................................................................................. 244
14. Tutorial B: Complex loading of an exhaust manifold .............................................................. 247
14.1 Background ......................................................................................................................... 247
14.2 Preparation ......................................................................................................................... 249
14.3 Defining the material .......................................................................................................... 250
14.4 Running the first analysis .................................................................................................... 251
14.5 Viewing the results with Abaqus/Viewer ............................................................................ 253
14.6 Running the second analysis ............................................................................................... 254
14.7 Post processing the results ................................................................................................. 256
Appendix I. Fatigue analysis techniques ........................................................................................ 260
Appendix II. Materials data generation .......................................................................................... 261
Appendix III. Gauge fatigue toolbox ............................................................................................. 262
References .......................................................................................................................................... 263
1. Introduction
1.1 Overview
Quick Fatigue Tool for MATLAB is an experimental fatigue analysis code. The application includes:
A general stress-life and strain-life fatigue analysis framework, configured via a text-based
interface;
Material Manager, a material database and MATLAB application which allows the user to
create and store materials for fatigue analysis (Section 5);
Multiaxial Gauge Fatigue, a strain-life code and MATLAB application which allows the user to
perform fatigue analysis from measured strain gauge histories
(document Quick Fatigue Tool Appendices: A3);
Export Tool, an ODB interface which allows the user to export fatigue results to an
Output Database (.odb) file for visualization in SIMULIA Abaqus/Viewer (Section 10.4); and
Supplementary analysis tools for static failure assessment (Section 12).
Quick Fatigue Tool runs entirely within the MATLAB environment, making it a highly customizable
code which is free from external dependencies.
The general analysis framework allows the user to analyse elastic stresses from Finite Element Analysis
(FEA) results. One of the main advantages of calculating fatigue lives from FEA is that it eliminates the
requirement to manually compute stress concentration and notch sensitivity factors. The program is
optimised for reading field output from Abaqus field report (.rpt) files. However, field output can be
specified in any ASCII format provided the data structure in Section 3 is observed.
A Quick Fatigue Tool analysis requires the following inputs from the user:
1. A material definition
2. A loading definition consisting of:
a. Stress datasets
b. Load histories
The above input is specified by means of a job file. This is an .m script or text file containing a list of
options which completely define the analysis. Analyses are performed by running the job file. Basic
fatigue result output is written to the command window, and extensive output is written to a set of
individual data files.
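A job file is an ordinary MATLAB script of option assignments. The sketch below is illustrative only: the option names are assumptions for the example, and the definitive list should be taken from Project\job\template_job.m.

```matlab
% Illustrative job file sketch. The option names below are assumed for
% the example; consult Project\job\template_job.m for the real options.
JOB_NAME = 'fillet_bending';        % label used for the analysis output
MATERIAL = 'SAE-950C';              % material definition (Section 5)
DATASET  = 'fillet_stresses.rpt';   % elastic FEA stress dataset (Section 3.2)
HISTORY  = [0, 1, 0];               % load history scaling the dataset (Section 3.3)
```

Running the script from the MATLAB command window submits the analysis.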
1.2 The stress-life methodology
The stress-life methodology is used for calculating fatigue damage where the expected lives are long
and the stresses are elastic. The method is also well-suited to infinite life design where a pass/fail
criterion based on the fatigue limit is sufficient. The stress-life approach ignores local plasticity and
provides a "total life" estimate of fatigue life [1] [2] [3]. This is illustrated by Figure 1.1. If the analyst
wishes to gain insight into the life up to crack initiation (Ni), or wishes to find the number of cycles
required for crack propagation (Np), strain-based and fracture mechanics-based methods should be
explored instead [4].
1.3 The strain-life methodology
The strain-life methodology is used for calculating fatigue damage where the cycles are dominated by
local plasticity. Although the majority of engineering structures are designed such that the operational
stresses do not exceed the elastic limit, unavoidable design features such as notches can result in local
plastic strains.
The strain-life methodology correlates the local plastic deformation in the vicinity of a stress
concentration to the far-field elastic stresses and strains using the constitutive response determined
from displacement-controlled fatigue tests on simple (smooth) laboratory specimens.
Fatigue analysis using the strain-life methodology is capable of accurate predictions of crack initiation
down to a few hundred cycles. Depending on the strain-life data, failure is usually assumed as a surface
crack with a length of approximately .
Figure 1.1: Illustration of the stress-life method where the total life, Nf, is
the sum of the life to crack initiation, Ni, and life to final crack propagation,
Np.
1.4 Why fatigue from FEA?
Modern design workflows demand a complex and multidisciplinary mindset from the analyst [5]. The
combination of complex geometry and service loading can make the determination of the most
important stresses an insurmountable task in the absence of powerful computer software.
The finite element method is a popular tool which allows the analyst to determine the stresses acting
on a component with a high degree of accuracy. However, selection of the correct stress is often still
not obvious. Take Figure 1.2 as an example.
A simple fillet joint is loaded in bending by a unidirectional pressure force. The load is applied to
 and then removed, resulting in a single pulsating loading event. Figure 1.3 shows the result of
the finite element analysis. The simplest way to relate the stress to fatigue life is by the Wöhler stress-life curve [6], which can be written in the Basquin form

S = A * Nf^b

where the damage parameter, S, is related to the fatigue life in repeats, Nf, by the material constants A
and b. Considering the results from Figure 1.3, it is not obvious which stress should be chosen to take
the place of the parameter S. There are several approaches for the evaluation of the fatigue life.
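To make the relationship concrete, a Basquin-type stress-life curve, S = A*Nf^b, can be inverted for life once values are assumed for the material constants. The constants below are invented for the illustration and are not real material data.

```matlab
% Basquin-type stress-life curve, S = A*Nf^b, solved for the life Nf.
% A and b are invented values for illustration, not real material data.
A = 1500.0;                  % fatigue strength coefficient [MPa]
b = -0.1;                    % fatigue strength exponent
S = 300.0;                   % stress amplitude at the node [MPa]

Nf = (S / A)^(1.0 / b);      % life in repeats; here (0.2)^(-10) = 9765625
fprintf('Predicted life: %.3g repeats\n', Nf);
```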
Figure 1.2: Uniaxial load applied to a fillet joint
A common approach is to take the node with the maximum principal stress and substitute this value
into the stress-life equation. Alternatively, the model can be viewed in terms of an effective stress (for
example, von Mises) and this parameter used for the fatigue calculation. Both of these approaches
have a serious drawback: they do not correctly account for the multiaxiality and non-proportionality
which commonly arise in fatigue loadings. Fatigue results obtained using these techniques can be
significantly in error and can even miss the location of fatigue crack initiation.
The best practice is to employ multiaxial algorithms which correctly identify the stresses on the most
damaging planes. Even unidirectional loads, such as those in the above example, can result in
multiaxial stress fields. Therefore, multiaxial analysis algorithms are always recommended over
uniaxial and effective stress methods.
The fillet joint is analysed using Quick Fatigue Tool with several fatigue criteria, the results of which
are summarised in Figure 1.4 and the table below. Algorithms marked "(CP)" are multiaxial (critical
plane) methods.
The Uniaxial Stress-Life method underestimates the fatigue life, whereas the von Mises stress
overestimates. By considering the maximum principal stress, the uniaxial method assumes that fatigue
failure will occur on a plane perpendicular to the material surface where the shear stress is zero. In
reality, most metals experience crack initiation on shear planes where there is no normal stress. For
this reason, both the uniaxial and normal stress methods produce highly conservative life predictions.
The Stress-based Brown-Miller and Findley’s Method produce the most accurate results, since they
consider the action of both the normal and shear stress acting on several planes.
Figure 1.3: FEA stresses on the fillet joint due to bending load
Analysis algorithm                        Fatigue life (repeats)
Uniaxial Stress-Life                      462,000
Stress Invariant Parameter (von Mises)    1,570,000
Normal Stress (CP)                        263,000
Stress-based Brown-Miller (CP)            800,000
Findley's Method (CP)                     1,040,000
By combining results from FEA with a multiaxial analysis technique, the most accurate life prediction
can be obtained. Because multiple planes must be searched in order to account for multiaxial stress
states, the multiaxial algorithms are time-consuming compared to the uniaxial and effective stress
methods. Confining the analysis to the location of maximum stress is not guaranteed to be successful,
since the location of crack initiation does not necessarily coincide with the location of maximum
stress.
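The cost of the plane search can be seen from a minimal sketch of the procedure. The sketch below is a simplified two-dimensional illustration, not the tool's actual implementation: it resolves the normal stress onto candidate planes at a fixed angular increment and records the worst plane.

```matlab
% Simplified critical plane search for a 2D stress state (illustration
% only; a real implementation searches planes in 3D and evaluates a
% full damage parameter, with cycle counting, on every plane).
Sxx = 200.0;  Syy = 50.0;  Sxy = 80.0;    % example stress tensor [MPa]

dTheta = 10.0;                            % plane search increment [degrees]
worstStress = -inf;  criticalPlane = 0.0;

for theta = 0.0:dTheta:180.0 - dTheta
    t = deg2rad(theta);
    % Normal stress resolved onto the plane whose normal is at angle theta
    sN = Sxx*cos(t)^2 + Syy*sin(t)^2 + 2.0*Sxy*sin(t)*cos(t);
    if sN > worstStress
        worstStress = sN;
        criticalPlane = theta;
    end
end

fprintf('Worst plane: %g degrees (normal stress %.1f MPa)\n', ...
    criticalPlane, worstStress);
```

Halving dTheta doubles the number of planes searched, which is why critical plane methods cost much more than a single evaluation of an effective stress.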
Figure 1.4: Fatigue analysis results showing (logarithmic) life using the
Stress-based Brown-Miller multiaxial algorithm
1.5 Overview of syntax
1.5.1 Overview
The Quick Fatigue Tool analysis job is created by a combination of job file options and environment
variables. Job file options represent the fundamental aspects of the fatigue analysis, such as material
definition, loading and fatigue analysis algorithm. Job file options are analysis-specific. Environment
variables are used to control the general behavior of Quick Fatigue Tool, such as load gating, critical
plane search precision and results format. Environment variables apply globally to all analyses, but
may be configured for specific analyses.
Detailed information on job file options and environment variables can be found in the document
Quick Fatigue Tool User Settings Reference Guide.
1.5.2 Job file options
All of the available job file options can be found in Project\job\template_job.m. These are standard
MATLAB variables which are passed into the main analysis function when the job file is run.
Job file options can be defined as character arrays or cells, as numeric arrays, or simply left empty,
depending on how the variable is defined.
1.5.3 Environment variables
All of the available environment variables can be found in Application_Files\default\environment.m.
These are MATLAB application data (appdata) variables which are read into the program's application
data at the beginning of the analysis.
Environment variables can be defined as character arrays or cells, as numeric arrays, or simply left
empty, depending on how the variable is defined.
Environment variables are set using the setappdata() function. The first argument is a handle to the
root object (0), the second argument is the name of the environment variable and the third argument
is the value of the variable.
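For example, a line in environment.m might look like the following. The variable name here is invented for illustration; the real names are listed in environment.m itself.

```matlab
% Store an environment variable in the application data of the MATLAB
% root object (handle 0). 'planePrecision' is an invented example name.
setappdata(0, 'planePrecision', 10.0)
```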
1.5.4 Units
Material data must be defined in the SI (mm) system of units. Stress datasets can use any system of
units, and are automatically converted into the SI (mm) system before the analysis.
1.5.5 Documentation conventions
The following conventions are used to signify job file options and environment variables throughout
the Quick Fatigue Tool documentation:
job file options defined in MATLAB are presented in BOLDFACE;
job file options defined in text files are preceded by an asterisk ( * );
environment variables, command line parameters, file names and string parameters are presented in
magenta;
default parameters are underlined;
items enclosed in bold square brackets ([ ]) are optional;
items appearing in a list separated by bars ( | ) are mutually exclusive; and
one value must be selected from a list of values enclosed by bold curly brackets ({ }).
1.6 Required toolboxes
Quick Fatigue Tool does not require any MATLAB toolboxes to function properly. However, certain
toolboxes are used to enhance the functionality of the code if Quick Fatigue Tool detects that they are
installed. Below is a summary of these toolboxes and the additional functionality they provide.
Image Processing Toolbox
Strain gauge preview dialogue box for the Gauge Fatigue Toolbox apps. See Appendix III of
Quick Fatigue Tool Appendices for more information.

Symbolic Math Toolbox
Iterative solver for the following features: derivation of the normal stress sensitivity constant using
the General Formula option in the Material Manager app; derivation of the reference strains from
user-defined strain gauge orientations using the Multiaxial Gauge Fatigue and Rosette Analysis apps;
and derivation of the initial fibre misalignment angle for the LaRC05 composite damage initiation
criterion.

Statistics Toolbox
RHIST and RC output variables.

Signal Processing Toolbox
High frequency loadings with HF_DATASET and HF_HISTORY.
1.7 Limitations
1.7.1 Fatigue from FEA
FEA tools
Quick Fatigue Tool has been optimised to use stress data from an Abaqus output database (.odb) file
using the built-in field output report tool in Abaqus/CAE. Stress dataset files generated from other
third party FEA processors must adhere to the rules and conventions detailed in Section 3.2.
Datasets
Quick Fatigue Tool assumes that the FEA stress datasets are elastic for both stress and strain-based
fatigue analyses. In the latter case, the stresses are automatically corrected for the effect of plasticity.
Elements
Special-purpose elements and connector elements are not supported. Structural elements such as
beams, pipes and wires may be used, although they have not been thoroughly tested. Using any of
these elements may cause the analysis to produce error messages or crash.
Stress tensors read from FE models must use a Cartesian coordinate system.
If the model contains plane stress elements from an Abaqus .odb file, only the valid tensor
components are printed to the dataset (.rpt) file. In such cases, the user must set the option
PLANE_STRESS=1.0 in the job file. This ensures that Quick Fatigue Tool is able to correctly identify
plane stress elements.
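In job-file terms the setting is a single assignment:

```matlab
% Declare that the stress dataset contains plane stress elements, so
% that the missing tensor components are interpreted correctly
PLANE_STRESS = 1.0;
```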
When exporting stress datasets from Abaqus, Quick Fatigue Tool will fail to process the data if the
element stresses are written to more than two locations on the element. For example, Abaqus usually
writes stresses to two section points for shell elements (top and bottom faces). If the user requested
field output at more than two section points, or the element is defined from a composite section,
Quick Fatigue Tool will not be able to interpret the stress dataset file.
If the user wishes to analyse the surface of a composite structure, the best practice is to define a skin
on the surface of the structure and export the stresses from the skin elements only.
Part instances
Quick Fatigue Tool supports stress datasets which span multiple Abaqus part instances provided that
the element-node numbers are unique between instances. If the stress dataset contains duplicate
element-node numbers, Quick Fatigue Tool is unable to distinguish between individual part instances
and results may be reported at incorrect locations.
If the model is defined as assemblies of part instances and the fatigue analysis spans more than one
instance, the user can ensure unique element-node numbering as follows:
1. Enter the Mesh Module in Abaqus/CAE and select Part as the object
2. From the main menu, select Mesh → Global Numbering Control…
3. Specify a start label for the elements and nodes of the current part instance, such that they
do not clash with other part instances which will be included for fatigue analysis
It is strongly recommended that the user works with flat input files. This generates an output database
containing a single part instance, which guarantees unique element-node numbering:
1. (any module in Abaqus/CAE): From the main menu, select Model → Edit Attributes → <model
name>
2. Select Do not use parts and assemblies in input files
1.7.2 Loading
Quick Fatigue Tool does not directly support multiple block loading. A workaround involves splitting
the load spectrum into several jobs using the CONTINUE_FROM option, which allows
Quick Fatigue Tool to automatically overlay the fatigue damage onto the previous results to give the
total damage due to all blocks. Analysis continuation is discussed in Section 4.8.
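The workaround can be sketched as two job files run in sequence. The option names other than CONTINUE_FROM are assumptions for the illustration:

```matlab
% --- job_block_1.m: analyse the first loading block ---
JOB_NAME = 'block_1';       % assumed option name for the job label
% ... loading definition for block 1 ...

% --- job_block_2.m: analyse the second block afterwards ---
JOB_NAME      = 'block_2';
CONTINUE_FROM = 'block_1';  % overlay damage onto the block_1 results
% ... loading definition for block 2 ...
```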
1.7.3 Materials
It is assumed that stress relaxation does not occur during the loading and that the material is cyclically
stable. This expedites the fatigue calculation by allowing analysis of each node without considering
the effects of neighbouring nodes, but precludes the effect of global plasticity being accurately taken
into account. However, for the majority of cases this should not be an issue since the stress-life
method is elastic, and the strain-life method is intended for components experiencing relatively small
amounts of local plasticity.
Quick Fatigue Tool is applicable to metals and some engineering plastics where the stresses and
temperatures are sufficiently low that viscoelastic effects are negligible.
Treatment of local notch plasticity requires the use of strain-based fatigue methods. Treatment of
crack propagation requires the use of crack growth methods such as VCCT, CTOD and the Paris law.
Analysis of viscoelastic, hyperelastic, anisotropic and quasi-brittle materials is not supported.
Materials whose fatigue behavior cannot reasonably be modelled by linear elastic stresses and stress-
life curves are not supported.
1.7.4 Performance
The MATLAB programming language is very convenient in terms of the ease and speed of development
it offers. However, at run time the code is slow in comparison to compiled languages. As a result, stress
datasets from even a modest finite element model can lead to lengthy analyses. The user is advised to
consult Section 11: FEA Modelling techniques for assistance on how to minimise analysis time without
compromising the accuracy of the solution.
1.7.5 GUI appearance
It is recommended that you set your monitor DPI scaling to  and the resolution to x.
On Windows 7, the DPI settings are found at Control Panel\Appearance and Personalization\Display.
On Windows 10 the settings are at the same location, but you must select "set a custom scaling level"
under the "Change size of items" section.
If the above settings are not used, Material Manager, Export Tool and the Gauge Fatigue Toolbox apps
may display incorrectly.
An alternative to using the Material Manager app is to import material data from a text file. For
instructions on creating material text files, consult Section 5.5, "Creating materials from a text file".
1.7.6 Validation
Quick Fatigue Tool has not been validated against any official standard. The author does not take any
responsibility for the accuracy or reliability of the code. Fatigue analysis results should be treated as
supplementary and further investigation is strongly recommended.
1.8 Additional notes
a) It is recommended that you consult the file README.txt in the Documentation directory
before proceeding to the next section of the guide.
b) Modifying the file structure (e.g. renaming folders) may prevent the program from working.
c) The change log for the latest version can be found here.
d) Quick Fatigue Tool is free for distribution without license, provided that the author
information is retained in each source file.
To report a bug or to request an enhancement, please contact the author:
Louis VALLANCE
louisvallance@hotmail.co.uk
2. Getting started
2.1 Preparing the application
Preparing Quick Fatigue Tool requires minimal intervention from the user, although it is important to
follow a few simple steps before running an analysis:
Make sure the working directory is set to the root of the Quick Fatigue Tool directory, e.g.
\..\Quick Fatigue Tool\6.x-yy. The directory structure is shown in Figure 2.1. All folders and subfolders
should be added to the MATLAB search path using the function addpath(), or by selecting the folders
Application_Files and Project, right-clicking and selecting Add to Path → Selected Folders and
Subfolders.
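The path setup above can also be scripted; a minimal sketch, assuming the working directory is already the Quick Fatigue Tool root:

```matlab
% Add Application_Files and Project (and all of their subfolders)
% to the MATLAB search path
addpath(genpath('Application_Files'));
addpath(genpath('Project'));
```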
If the MATLAB working directory is not configured exactly as described above (e.g. the user enters the
job directory before running a job file), the application will not run.
Before running a fatigue analysis, it is recommended that you first run the job tutorial_intro. This is
because the initial run of a MATLAB function requires some additional computational overhead which
slows down the first analysis.
2.2 How the application handles variables
Quick Fatigue Tool does not store variables in the base workspace, nor does it modify or delete existing
workspace variables. During analysis, variables are stored either in the function workspaces or as
application-defined data using the setappdata() and getappdata() methods.
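The mechanism can be illustrated with plain MATLAB; the variable name below is hypothetical, not one used by Quick Fatigue Tool:

```matlab
% Store a variable as application data against the root (0) object
setappdata(0, 'exampleVariable', [1.0, 2.0, 3.0]);

% Retrieve it from any function workspace without passing it as an argument
values = getappdata(0, 'exampleVariable');

% Application data persists until it is removed explicitly
rmappdata(0, 'exampleVariable');
```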
The application data is utilised for convenience, since variables can easily be accessed throughout the
code without having to pass variables between many functions. In order to prevent unwarranted data
loss, Quick Fatigue Tool does not modify existing application data by default. However, this means
that variables from previous analyses will remain in the application data and could cause unexpected
behaviour in subsequent analyses, such as incorrect fatigue results and spurious crashes.
Figure 2.1: Quick Fatigue Tool file structure
In order to eliminate the possibility of such conflicts, the user is strongly advised to restart MATLAB
between each analysis so that the application data is cleared. If the user does not wish to restart
MATLAB for each analysis and is not concerned about Quick Fatigue Tool modifying the application
data, the following environment variable may be set with a value of 3.0:
Environment file usage:
    cleanAppData = {1.0 | 2.0 | 3.0 | 4.0}
This ensures that the application data is completely cleared before and after each analysis. This has
the same effect as restarting MATLAB.
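Assuming the environment file uses the same assignment syntax as the job file options, the setting might look like:

```matlab
cleanAppData = 3.0;  % clear application data before and after each analysis
```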
The environment file and all of the available user settings are discussed in the document
Quick Fatigue Tool User Settings Reference Guide.
2.3 File structure
Quick Fatigue Tool separates various components of the code into folders. Below is a brief description
of what each folder contains:
Application_Files: Source code and application-specific settings. There is usually no need to modify the contents of this directory.
Data: User-specific data (models, surface finish curves, material data, etc.).
Documentation: README file and User Guide.
input: Required location for stress datasets and load histories.
job: Job files defining each analysis.
output: Fatigue results directory. If this folder does not exist, it will automatically be created during the analysis.
2.4 Configuring and running an analysis
2.4.1 Configuring a standard analysis
Standard analyses are configured and submitted from an .m file.
In this example, a simple fatigue analysis is configured by combining a stress dataset with a load
history. The files can be found in the Data\tutorials\intro_tutorial folder.
1. Define a stress dataset: Open the file stress_uni.dat. A simple stress definition consists of six
components defining the Cauchy stress tensor. The components are defined in the following
order:
Column 1: S11
Column 2: S22
Column 3: S33
Column 4: S12
Column 5: S13
Column 6: S23
The file stress_uni.dat contains a stress tensor at a single material point in a state of uniaxial tension.
2. Define a load history: Open the file history_fully_reversed.dat. A simple load history consists of
two loading points. Below is a list of common load definitions:
Fully-reversed (push-pull): [1, -1]
Pure tension: [0, 1]
Pure compression: [0, -1]
The file history_fully_reversed.dat defines a fully-reversed loading event.
3. Study the job file: Open the job file tutorial_intro.m. The file contains a set of options specifying
all the information necessary for fatigue analysis. Options can be strings or numeric depending on
the meaning of the option. Not all options require a user setting. Below is a summary of each
option in the job file. The user need not worry about the number of definitions; all of the job file
options are explained in the document Quick Fatigue Tool User Settings Reference Guide and in
the tutorials later in this guide.
JOB_NAME: The name of the job.
JOB_DESCRIPTION: A description of the job. The job name and description are printed to the log file for reference.
CONTINUE_FROM: Superimpose results onto a previous job. This feature is useful for block loading, or for specifying the analysis algorithm based on model regions.
DATA_CHECK: Runs the job up to the beginning of the analysis. Useful for checking the message file for initial notes and warnings, without having to run the full fatigue analysis.
MATERIAL: Material used for analysis. 'SAE-950C.mat' references a file containing the material properties. Materials are stored in Data\material\local and are defined using the Material Manager app. Usage of the app is discussed in Section 5.
USE_SN: Stress-life data. 1.0; a flag indicating that S-N data should be used if available.
SN_SCALE: Stress-life data scale factor. 1.0; a linear scale factor applied to each S-N data point.
SN_KNOCK_DOWN: S-N knock-down factors. Knock-down factors are not used in this analysis. Knock-down factors are discussed in Section 4.8.
DATASET: Stress data. 'stress_uni.dat' references the stress dataset file. Stress datasets should be saved in Project\input.
HISTORY: Load history. 'history_fully_reversed.dat' references the load history file. Load histories should be saved in Project\input.
UNITS: Stress units. 3.0; a flag indicating units of MPa.
CONV: Conversion factor for stress units.
LOAD_EQ: Load equivalency. The default loading equivalence is 1 repeat. If the loading represents another dimension, the fatigue results can be expressed in a more appropriate unit (e.g. 1000 hours).
SCALE: Stress scale. 0.8285; a linear scale factor applied to the entire loading.
OFFSET: Offset value for the stress history. Loading offsets are discussed in Section 3.4.
REPEATS: Number of repetitions of the loading.
HF_DATASET: Dataset(s) for high frequency loads.
HF_HISTORY: Load history for high frequency loads.
HF_TIME: Time compression for high frequency loads.
HF_SCALE: Scale factor for high frequency loads. High frequency loads are discussed in Section 3.5.
PLANE_STRESS: Element type (3D stress or planar). 0.0; a flag indicating that a 3D element type should be assumed. The distinction that Quick Fatigue Tool makes about element types is discussed in Sections 3.2.4 and 3.6.1.
OUTPUT_DATABASE: Model output database (.odb) file from an Abaqus FE analysis.
EXPLICIT_FEA: FEA procedure type.
PART_INSTANCE: FEA part instance name.
STEP_NAME: FEA step name.
RESULT_POSITION: FEA result position. Associating a job with an Abaqus .odb file is discussed in Sections 4.6 and 9.5. This job is not associated with an .odb file.
ALGORITHM: Analysis algorithm. 0.0; a flag indicating that the default analysis algorithm should be used (Stress-based Brown-Miller). Analysis algorithms are discussed in Section 6.
MS_CORRECTION: Mean stress correction. 2.0; a flag indicating that the Goodman mean stress correction will be used. Mean stress corrections are discussed in Section 7.
ITEMS: List of items for analysis. 'ALL' indicates that all items in the model should be analysed. Selecting analysis items is discussed in Section 4.5.3.
DESIGN_LIFE: The target life of the system. 'CAEL' indicates that the target life should be set to the material’s constant amplitude endurance limit.
KT_DEF: Surface finish definition.
KT_CURVE: Surface finish type. This analysis assumes a surface finish factor of 1. Surface finish definition is discussed in Section 4.3.
NOTCH_CONSTANT: Notch sensitivity constant.
NOTCH_RADIUS: Notch root radius.
GAUGE_LOCATION: Virtual strain gauge definition.
GAUGE_ORIENTATION: Virtual strain gauge orientation.
RESIDUAL: Residual stress. 0.0; a residual stress value which is added to the fatigue cycle during the damage calculation. Residual stress is discussed in Section 4.4.
FACTOR_OF_STRENGTH: Factor of strength calculation. 0.0; a flag indicating that a factor of strength calculation will not be performed. Factor of strength is discussed in Section 8.3.
FATIGUE_RESERVE_FACTOR: Fatigue reserve factor envelope. 2.0; a flag indicating that the Goodman B envelope will be used for Fatigue Reserve Factor calculations. The Fatigue Reserve Factor is discussed in Section 8.2.
HOTSPOT: Hotspot calculation. 0.0; a flag indicating that a hotspot calculation will not be performed. Hotspot calculations are discussed in Section 4.5.3.
OUTPUT_FIELD: Request for field output. 1.0; a flag indicating that field output will be written.
OUTPUT_HISTORY: Request for history output. 0.0; a flag indicating that history output will not be written.
OUTPUT_FIGURE: Request for MATLAB figures. 0.0; a flag indicating that MATLAB figures will not be written. Analysis output is discussed in Section 10.
WELD_CLASS: Weld classification for BS 7608 analyses.
YIELD_STRENGTH: Yield strength for BS 7608 analyses.
UTS: Ultimate tensile strength for BS 7608 analyses.
DEVIATIONS_BELOW_MEAN: Degree of uncertainty for BS 7608 analyses.
FAILURE_MODE: Failure mode for BS 7608 analyses.
CHARACTERISTIC_LENGTH: Characteristic dimension for BS 7608 analyses.
SEA_WATER: Environmental effects factor for BS 7608 analyses. This analysis does not require a weld definition. The BS 7608 algorithm is discussed in Section 6.6.
4. Select the material and analysis type: This analysis uses SAE-950C Manten steel as the material.
The materials available for analysis are located in the Data\material folder. Stress units are in MPa.
The analysis algorithm is the default algorithm (Stress-based Brown-Miller), the Goodman mean
stress correction is used, as well as user-defined stress-life data points.
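Pieced together from the options above, the core of such a job file might look like the following excerpt (illustrative only; the shipped tutorial_intro.m contains every option):

```matlab
JOB_NAME      = 'tutorial_intro';
MATERIAL      = 'SAE-950C.mat';               % SAE-950C Manten steel
DATASET       = 'stress_uni.dat';             % uniaxial stress dataset
HISTORY       = 'history_fully_reversed.dat'; % fully-reversed load history
UNITS         = 3.0;                          % stresses in MPa
ALGORITHM     = 0.0;                          % default (Stress-based Brown-Miller)
MS_CORRECTION = 2.0;                          % Goodman mean stress correction
USE_SN        = 1.0;                          % use S-N data if available
```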
5. Run the job: To execute the analysis, right-click on tutorial_intro.m and click run.
6. A summary of the analysis progress is written to the command window. When the analysis is
complete, the command window should look like that of Figure 2.1.2.
7. The fatigue results summary reports a life of 732 thousand cycles to failure at location 1.1. This is
the default location when a stress dataset is provided without position labels.
8. In the Project\output directory, a folder with the name of the job is created which contains all of
the requested output. In this analysis, extensive output was not requested, so only the following
three basic files are written:
Figure 2.1.2: Fatigue results summary
<job_name>.log: Input summary; analysis groups; critical plane summary; Factor of Strength diagnostics; fatigue results summary.
<job_name>.msg: Pre- and post-analysis messages: analysis-specific notes offering useful information to the user, and analysis-specific warnings explaining potential issues with the analysis.
<job_name>.sta: An item-by-item summary of the analysis progress.
9. Open the log and status files and examine their contents. According to the command window
summary, the analysis completed with warnings. Examine the contents of the message file:
a. Extensive output was not requested by the user
b. The damage at design life (10 million cycles by default) is over unity, corresponding to
failure
c. There is a warning that Quick Fatigue Tool encountered an ambiguity whilst determining
the element (stress tensor) type for analysis. A 3D stress tensor was assumed for the input
stress dataset. Since this assumption is correct, the warning can be ignored
10. To run another analysis, it is recommended that you first restart MATLAB to ensure that all the
application data from the previous analysis is cleared.
2.4.2 Configuring a data check analysis
A data check runs the job file through the analysis pre-processor, without performing the fatigue
analysis.
Job file usage:
    DATA_CHECK = {0.0 | 1.0}
Data checks are useful for ensuring that the analysis definitions are valid, allowing the user to correct
errors which may otherwise only become apparent after a long analysis run. The data check feature
checks the following for consistency:
ODB interface settings
Results directory initialization
Analysis continuation settings
Material definitions and S-N interpolation
Algorithm and mean stress correction settings
Dataset and history definitions
Principal stress histories
Custom mean stress, FRF and surface finish data
Surface detection
Yield criteria analysis
Composite criteria analysis
Nodal elimination
Load proportionality
Virtual strain gauge definition
Duplicate analysis item IDs
Pertinent information regarding the data check run can be found in the message file in
Project\output\<jobName>.
If the user requested field output from the job file, the worst tensor and principal stress per node for
the whole model are written to the files datacheck_tensor.dat and datacheck_principal.dat,
respectively.
2.4.3 Configuring an analysis from a text file
Quick Fatigue Tool includes a text file processor, which allows the user to submit a job from an ASCII
text file containing only the options which are required to define the analysis. This results in job files
which are less cumbersome than the standard .m file which must contain every option regardless of
whether or not it is required.
To define a job from a text file, options are specified as keywords.
Job file usage:
    *<keyword> = <value>
Keywords are exactly the same as job file options, but they always begin with an asterisk (*). For
example, the job file option DATASET is declared in the text file as *DATASET. Entries which do not
begin with an asterisk, or are not followed by an equals sign (=) and a value, are ignored
by the input file processor.
Job file options containing underscores are specified in the text file with spaces. For example, the
option JOB_NAME is specified in the text file as *JOB NAME.
The following should be noted when defining jobs from a text file:
it is not necessary to end the definition with a semi-colon;
it is not necessary to enclose strings with apostrophes;
white spaces are ignored; and
mathematical expressions are not supported.
The user must adhere to the following syntax when defining cells in the text file:
Strings: {<string-1>, <string-2>,…, <string-n>}
Numeric arrays: {[p1, p2,…, pn], [p1, p2,…, pn],…, [p1, p2,…, pn]}
Mixture of strings and numeric arrays: {<string-1>, [p1, p2,…, pn]}
Any combination of strings and numerical inputs is supported, provided each element is separated
by a comma.
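Putting the rules together, a minimal text job file might look like the following sketch (option values are illustrative):

```text
** Example text job file
*JOB NAME = tutorial_example
*MATERIAL = SAE-950C.mat
*DATASET = stress_uni.dat
*HISTORY = history_fully_reversed.dat
*UNITS = MPa
```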
Jobs defined as text files are submitted from the command line.
Command line usage:
>> job <jobFile>
>> job <jobFile> 'option'
The parameter option has two values, which may be used together:
interactive: prints an echo of the message (.msg) file to the MATLAB command window.
datacheck: submits the analysis job as a data check analysis.
Any file extension is accepted provided the contents are ASCII text. Job files with the extension .inp can
be specified without appending .inp on the command line. For all other file types, the extension must
be specified. Apostrophes are not required when specifying the input file name.
Example usage
An example of a text-based job file is given by the file tutorial_intro.inp in the
Data\tutorials\intro_tutorial folder. Open the file and study its contents.
There is a text header at the beginning of the file, which is distinguished from the rest of the contents
by double asterisks (**) at the beginning of each line. These lines are ignored by Quick Fatigue Tool.
The first keyword is *USER MATERIAL, which is used to define material data. Guidance on creating
material data in a text file is found in Section 5.5.4 “Specifying material properties in a job file”.
Subsequent keywords specify analysis definitions for a uniaxial stress-life analysis. To submit the file
for analysis, execute the following command:
Command line usage:
>> job tutorial_intro
The material data is first read into the material database. If the material already exists in the
Data\material\local directory, the user is prompted to overwrite the existing material data. The
analysis keywords are then processed and the job is submitted for analysis.
Results of the fatigue analysis are written to Project\output\tutorial_intro.
2.5 The analysis method
1. A loading definition is created by combining elastic stress datasets with load histories to produce
a scaled history of stresses for each item in the model
2. If a high frequency loading is provided, it is interpolated and superimposed onto the original load
history
3. If requested, the load histories are pre-gated before the analysis which aims to remove small
cycles from the loading
4. The principal stress history is calculated for each analysis item
5. The load history at each point in the model is assessed for proportionality. The critical plane step
size may automatically be increased if the load is considered to be proportional
6. If requested, nodal elimination is performed which removes analysis items whose maximum stress
range is less than the fatigue limit of the material
7. User stress-life data is interpolated to find the endurance curve for a fully-reversed cycle
8. Stresses are resolved onto planes in a spherical coordinate space to find the plane on which the
most damaging stresses occur [1]
9. Stresses on this plane are counted using the rainflow cycle counting method [2]. If requested, the
stress tensors on this plane are gated prior to cycle counting
10. If requested, the stress cycles are corrected for material non-linearity
11. The stress cycles are corrected for the effect of mean stress
12. A damage calculation is performed for each cycle using Miner’s Rule of linear damage
accumulation [7]. The endurance limit may be reduced to 25% of its original value if the cycle
stress amplitudes are damaging
13. Steps 8-12 are repeated for each analysis item
14. If requested, the item with the worst life is analysed once more to calculate extensive output
15. If requested, Factor of Strength (FOS) iterations are performed. The damage is recalculated for
each analysis item to obtain the linear loading scale factor which, if applied to the original loading,
would result in the user-defined design life
[1] Only if the fatigue analysis algorithm is multiaxial
[2] Only if the load history contains more than two data points
3. Defining fatigue loadings
3.1 Loading methods
3.1.1 Overview
The loading definition forms the basis of the analysis, and describes the stress history at each point in
the model. Loadings usually consist of stress datasets (user-defined or from FEA) and load histories.
The stress datasets contain the static stress state at each location in the model. This can be at a node,
integration point, centroid or otherwise. The load history defines the variation of the stresses through
time. However, Quick Fatigue Tool does not distinguish elapsed time between loading points and
hence the load history is treated as being rate-independent.
Quick Fatigue Tool offers five methods for creating loading definitions:
1. Uniaxial history
2. Simple loading
3. Multiple load history (scale and combine) loading
4. Dataset sequences
5. Complex (block sequence) loading
3.1.2 Syntax
Stress datasets are specified as ASCII text files:
'dataset-file-name.*' | {'dataset-file-name-1.*', 'dataset-file-name-2.*',…, 'dataset-file-name-n.*'}
Load histories are specified as ASCII text files, or directly as one or more vectors:
'load-history-file-name.*' | {'history-file-name-1.*', 'history-file-name-2.*',…, 'history-file-name-n.*'}
[p1, p2,…, pn] | {v1, v2,…, vn}
3.1.3 Uniaxial history
A single load history is supplied without a stress dataset.
Job file usage:
    DATASET = ' '
    HISTORY = {'history-file-name.*' | [p1, p2,…, pn]}
The load history is analysed without respect to a particular model, and is only valid for uniaxial states
of stress. Uniaxial histories can only be used with the Uniaxial Stress-Life and Uniaxial Strain-Life
algorithms.
Job file usage:
    ALGORITHM = {3.0 | 'UNIAXIAL STRAIN'}
    ALGORITHM = {10.0 | 'UNIAXIAL STRESS'}
3.1.4 Simple loading
A simple loading consists of a single stress dataset multiplied by a load history.
Job file usage:
    DATASET = 'dataset-file-name.*'
    HISTORY = {'history-file-name.*' | [p1, p2,…, pn]}
An example of a simple fatigue loading is given by Figure 3.1.
Figure 3.1: Demonstration of a simple loading. The stresses due
to a unit load are multiplied by a load history
Constant amplitude loading
The load history is defined as a minimum and maximum value, an example of which is given below.
Job file usage:
    HISTORY = [min, max]
The definition can be a single cycle, or repetitions of the same cycle.
Job file usage:
    REPEATS = n
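For example, a constant amplitude loading of 500 fully-reversed cycles might be defined as follows (values illustrative):

```matlab
HISTORY = [-1, 1];  % one fully-reversed cycle
REPEATS = 500;      % repeat the cycle 500 times
```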
Single load history
The load history is defined as a time history of the loading, scaled with the stress dataset. The dataset
can be defined in two ways:
1. FEA stress from unit load, scaled by a time history of load values
2. FEA stress from maximum load, scaled by a time history of load scale factors
Both methods are valid provided that the FEA is linear and elastic.
3.1.5 Multiple load history (scale and combine) loading
A multiple load history consists of several stress datasets multiplied by the same number of histories.
At each analysis item, the stress tensor from each dataset is scaled with its respective load history and
combined into a single stress history. The number of stress datasets and load histories must be the
same, although the number of history points in each load history need not be the same.
Job file usage:
    DATASET = {'dataset-file-name-1.*', 'dataset-file-name-2.*',…, 'dataset-file-name-n.*'}
    HISTORY = {'history-file-name-1.*', 'history-file-name-2.*',…, 'history-file-name-n.*'}
An example of a scale and combine loading is given by Figure 3.2.
Scale and combine loading is only physically meaningful for elastic stresses. Each channel can be scaled
by its respective load history since the load is directly proportional to the elastic FEA stress. The scale
and combine method assumes that each loading channel is occurring simultaneously.
The load history may be defined by any combination of load history files and vectors.
Job file usage:
    HISTORY = {'history-file-name.*', [p1, p2,…, pn]}
Figure 3.2: Demonstration of a multiple load history (scale and combine). The stresses
due to unit loads are multiplied by their respective load histories and summed to produce
the resultant fatigue loading
3.1.6 Dataset sequences
A dataset sequence loading consists of several stress datasets.
Job file usage:
    DATASET = {'dataset-file-name-1.*', 'dataset-file-name-2.*',…, 'dataset-file-name-n.*'}
    HISTORY = [ ]
An example of a dataset sequence loading is given by Figure 3.3.
Since the fatigue loading is completely described by the variation of stresses between each dataset,
specification of load histories is not required.
Figure 3.3: Demonstration of a stress dataset sequence loading. The fatigue loading is formed
by the sequence of stress solutions due to the applied loads
3.1.7 Complex (block sequence) loading
Complex loadings consist of multiple loading blocks, each representing a stage of the component’s
duty. Block sequences are useful in proving ground tests where the component is subjected to a
sequence of distinct loading events, and it is desirable to compute the individual damage contributions
from each event.
Currently, sequences of loading blocks cannot be defined in a single job. However, Quick Fatigue Tool
allows the user to run multiple jobs in series, wherein each job is treated as a separate loading block.
Job file usage:
    JOB_NAME = 'current-job-name'
    CONTINUE_FROM = 'previous-job-name'
When a job is run as a continuation of a previous job, Quick Fatigue Tool calculates the individual
damage contribution of the current job, then superimposes the result onto the previous job to give
the cumulative damage of the block sequence.
Analysis continuation is discussed further in Section 4.8.
3.2 Creating a stress dataset file
3.2.1 Dataset structure
Stress datasets are text files containing a list of stress tensors. The simplest way to create a stress
dataset file is to specify the tensor components as follows:
Line 1: S11, S22, S33, S12, S13, S23
Line 2: S11, S22, S33, S12, S13, S23
.
.
.
Line n: S11, S22, S33, S12, S13, S23
Each line defines the stress tensor at a single location in the model.
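A sketch of such a file for a model with three locations (stress values in MPa, purely illustrative):

```text
150.0, 30.0, 0.0, 25.0, 0.0, 0.0
120.0, 10.0, 0.0, 15.0, 5.0, 0.0
 95.0,  5.0, 0.0, 10.0, 0.0, 2.5
```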
3.2.2 Creating a dataset from Abaqus/Viewer
Stress datasets may be generated from finite element analysis (FEA). To create a stress dataset file
from Abaqus/Viewer, complete the following steps:
1. In the Visualization module, from the main menu, select Result → Options…
2. Under “Averaging”, uncheck “Average element output at nodes”.
3. From the main menu, select Report → Field Output…
4. Under “Step/Frame”, select the step and the frame in the analysis from which the stresses will
be written.
5. In the “Variable” tab, under “Output Variables”, select the position of the output (Integration
Point, Centroid, Element Nodal or Unique Nodal).
6. Expand the variable “S: Stress components” and select all the available Cauchy tensor
variables (S11, S22, S33, S12, S13 and S23); if plane stress elements are used, select (S11, S22,
S33 and S12) and set PLANE_STRESS=1.0 in the job file; if one-dimensional elements are used,
select (S11).
7. If the model contains results at multiple section points or plies, select Settings… from the
section point method and specify the section point or ply of interest.
8. In the “Setup” tab, set the file path (e.g. \6.x-xx\Project\input\<filename>.rpt).
9. Under “Data”, uncheck “Column totals” and “Column min/max”. Make sure “Field output” is
checked.
10. Click OK.
3.2.3 Creating datasets from other FEA packages
If the user wishes to create a stress dataset from an FEA package other than Abaqus, the following
standard data format must be observed for 3D stress elements:
MAIN POSITION ID (optional), SUB POSITION ID (optional), S11, S22, S33, S12, S13, S23
For example, a row of a particular stress dataset may consist of a position label followed by the six
stress components. The position labels can be arbitrary, but usually represent the location on the
finite element model (e.g. element.node). Quick Fatigue Tool will quote the position of the shortest
fatigue life. Position labels are not compulsory: the stress dataset can be specified with tensor
information only. Furthermore, it is not compulsory to include both main and sub IDs. For example,
if the stress data is unique nodal (nodal averaged), there is only one position ID, which is the node
number.
3.2.4 Creating datasets with different element types
Quick Fatigue Tool automatically recognises stress datasets from Abaqus containing multiple element
types. Such files are split into regions, each of which defines the stress tensors for a specific element
type. If the stress dataset file is user-defined, the following conventions must be observed:
3D stress elements:
MAIN POSITION ID (optional), SUB POSITION ID (optional), S11, S22, S33, S12, S13, S23
Plane stress elements without shell face information:
MAIN POSITION ID (optional), SUB POSITION ID (optional), S11, S22, S33, S12
Plane stress elements with shell face information:
MAIN POSITION ID (optional), SUB POSITION ID (optional), S11 (+ve face), S11 (-ve face), S22 (+ve face), S22 (-ve face), S33 (+ve face), S33 (-ve face), S12 (+ve face), S12 (-ve face)
One-dimensional elements:
MAIN POSITION ID (optional), SUB POSITION ID (optional), S11
Below is an example of a user-defined stress dataset file containing two elements.
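The following sketch illustrates what such a file might contain (stress values are made up): REGION_1 holds 3D tensors, REGION_2 holds plane stress tensors with results at both shell faces, and both use element-nodal position labels:

```text
REGION_1
1.1, 150.0, 30.0, 0.0, 25.0, 0.0, 0.0
1.2, 140.0, 25.0, 0.0, 20.0, 0.0, 0.0
REGION_2
2.1, 100.0, -100.0, 40.0, -40.0, 0.0, 0.0, 10.0, -10.0
2.2,  95.0,  -95.0, 35.0, -35.0, 0.0, 0.0,  8.0,  -8.0
```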
Each element region must be declared by a text header in order to be recognised, and the header
must start with a non-numeric character. In the above example, REGION_1 defines a 3D element and
REGION_2 defines a 2D element with results at both shell faces. Both elements are defined with
element-nodal (nodal un-averaged) position labels. All the datasets in the loading must be defined
with the same position labels otherwise the analysis will not run. To check whether or not the dataset
definition was processed correctly, Quick Fatigue Tool prints the number of detected regions to the
message file. This can be found in Project\output\<jobName>\<jobName>.msg.
Quick Fatigue Tool will automatically detect the element type based on the number of columns in the
dataset file. For example, if there are five columns, this will be interpreted as plane stress (four
columns define the tensor) with one column defining the element position (either unique nodal or
centroidal). If the dataset file contains six columns, this could either be interpreted as 3D stress (all six
columns define the stress tensor) with no position labels, or plane stress (four columns define the
tensor) with two columns defining the element position (either element-nodal or integration point).
This ambiguity is resolved with the use of the following job file option:
Job file usage:
    PLANE_STRESS = {0.0 | 1.0}
If the value of PLANE_STRESS is set equal to 1.0, Quick Fatigue Tool will assume that the element
definition is plane stress if it encounters a data region with six data columns.
A complete description of how dataset files are interpreted is provided in Section 3.6.
3.2.5 Specifying stress dataset units
Since fatigue analysis uses the SI (mm) system of units, stress datasets are assumed to have units of
Megapascals (MPa) by default. If the datasets use a different system of units, these must be specified
in the job file so that Quick Fatigue Tool can convert them into the SI (mm) system.
Job file usage:
    UNITS = {'Pa' | 'kPa' | 'MPa' | 'psi' | 'ksi' | 'Msi'}
If the stress dataset units are not listed, they can be user-defined.
Job file usage:
    UNITS = 0.0
    CONV = <conversion-factor>
The parameter CONV is a constant which converts the stress dataset units into Pascals (Pa). The
stress data is then converted into the SI (mm) system using Equation 3.2:
    σ(MPa) = σ(dataset) × CONV / 10^6     (3.2)
3.3 Creating a load history
Load histories can be defined in three ways:
1. From a text file
2. As a direct definition
3. As a workspace variable
Create a load history from a text file
If the load history is defined from a text file, it must contain a single row or column of loading
points, as follows:
p1    ← First loading point
.
.
pn    ← Last loading point
For example, a fully-reversed load history would look like that of Figure 3.4.
Job file usage:
    HISTORY = 'history-file-name.*'
Load history files must be stored in the Project\input folder in order for Quick Fatigue Tool to locate
the data.
Figure 3.4: Fully-reversed load history
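For instance, the contents of a fully-reversed history file might simply be (one loading point per line):

```text
1
-1
```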
Create a load history as a direct definition
Load histories can be specified in the job file as a vector of scale factors.
Job file usage:
  Option     Value
  HISTORY    [s1, s2,..., sn]
Alternatively, the load history can be defined as a function.
Job file usage:
  Option     Value
  HISTORY    <MATLAB expression>
where the expression evaluates to stress amplitude scale factors.
Create a load history as a workspace variable
If the load history is defined as a workspace variable, it must be a row or column numerical array.
Job file usage:
  Option     Value
  HISTORY    {v1, v2,..., vn}
In addition, the variables declared in HISTORY must also be specified as inputs to the function
declaration and the function call.
Job file usage:
  function [] = <jobName>(v1, v2,..., vn)
The job is then submitted by executing the job file from the command line.
Command line usage:
  >> <jobName>(v1, v2,..., vn)
For scale and combine loadings, it is possible to define a load history using a combination of text files,
direct definitions and workspace variables.
Job file usage:
  Option     Value
  HISTORY    {'history-file-name.*', [s1, s2], v1,...}
Treatment of multiple load histories
The load histories do not have to be the same length. Before the analysis, all the load histories will be
modified to have the same length by appending zeroes to the shorter histories. However, in order to
maximise the reliability and performance of the cycle counting algorithm, it is strongly recommended
that the loadings have a similar length.
Note that multiple load histories are not supported for uniaxial analysis.
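The length-equalisation step described above can be sketched as follows. This is a Python illustration of the stated behaviour (zero-padding the shorter histories), not Quick Fatigue Tool's actual implementation:

```python
def equalise_lengths(histories):
    """Pad shorter load histories with zeroes so that all histories
    have the same length before cycle counting."""
    n = max(len(h) for h in histories)
    return [h + [0.0] * (n - len(h)) for h in histories]

# A 3-point history is padded to match a 5-point history
padded = equalise_lengths([[0.0, 1.0, 0.0], [0.0, 1.0, 0.0, -1.0, 0.0]])
```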
3.4 Load modulation
The fatigue loading can be scaled and offset using the SCALE and OFFSET job file options, respectively.
The scaled and offset fatigue load, P′, is given by Equation 3.3:

  P′ = (P × SCALE) + OFFSET   [3.3]
Defining load scale factors
Load scale factors are defined as follows:
Job file usage:
  Option    Value
  SCALE     [s1, s2,..., sn]
If the analysis is a scale and combine loading, n is the number of dataset-history pairs; each load scale
factor is multiplied by its respective dataset-history pair. If the analysis is a dataset sequence, n is the
number of datasets; each load scale factor is multiplied by its respective dataset in the sequence.
If the user specifies the Uniaxial Stress-Life algorithm, a single scale factor may be specified. Load
scale factors can be used with any loading method.
Defining load offset values
Load offset values are defined as follows:
Job file usage:
  Option    Value
  OFFSET    [o1, o2,..., on]
where n is the number of dataset-history pairs; each load offset value is summed with its respective
dataset-history pair.
Since load offset values are applied to the load history points only, they may not be used with dataset
sequence loadings. Load offsets may be used with all other loading methods.
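Equation 3.3 (scale first, then offset) can be illustrated with a short sketch. The function name is hypothetical and the code is Python rather than MATLAB:

```python
def modulate(history, scale=1.0, offset=0.0):
    """Apply Equation 3.3: each load point is multiplied by the SCALE
    factor, then the OFFSET value is added."""
    return [p * scale + offset for p in history]

# Doubling a fully-reversed history and shifting its mean by 0.5
scaled = modulate([0.0, 1.0, -1.0], scale=2.0, offset=0.5)
```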
3.5 High frequency loadings
3.5.1 Overview
The scale and combine method outlined in Section 3.1 does not account for elapsed time and
can produce physically incorrect load histories if two load signals with very different periods are
analysed together. The solution is to superimpose the higher frequency load data onto the lower
frequency data.
3.5.2 Defining high frequency loadings
Take the example of a piston which experiences combined thermal and mechanical stresses shown
in Figure 3.5.
For a scale and combine loading, the two signals are defined by the load histories in Figure 3.5.
Normalized thermal load
[0, 1, 0, -1, 0]
Normalized mechanical load
[0, 1, 0, 0.4, -1, 0, 0, 1, 0, 0.4, -1, 0,...] (the cycle 0, 1, 0, 0.4, -1, 0 repeated)
Figure 3.5: Thermal and mechanical load signals occurring over the same
period
A problem arises if the two loads occur over the same time interval. Since the mechanical load has
many more time points than the thermal load, Quick Fatigue Tool will append most of the mechanical
load onto the end of the load history. Using a standard scale and combine, the resulting load history
would be that of Figure 3.6.
This loading definition is physically incorrect because it does not allow for the fact that the two loads
occur simultaneously over the same period. The solution is to define the mechanical load as a high
frequency dataset. The modified load histories are as follows:
Normalized thermal load
[0, 1, 0, -1, 0]
Normalized mechanical load
[0, 1, 0, 0.4, -1, 0]
In this case, the high frequency data is specified as a single repeat of the mechanical load.
Job file usage:
  Option        Value
  HF_DATASET    'mechanical-dataset-file-name.*'
  HF_HISTORY    'mechanical-history-file-name.*'
  HF_TIME       {100.0, 10.0}
Figure 3.6: Result of using the scale and combine technique for the
thermal-mechanical load
High frequency loading and load histories are specified in the same manner as standard datasets and
load histories. In order for Quick Fatigue Tool to correctly superimpose the high frequency dataset(s),
it must know the period for both loadings. In this example, the period of the low frequency data is 100
seconds and the period of a single repeat of the high frequency data is 10 seconds. This means that
the high frequency dataset will be superimposed 100/10 = 10 times into the low frequency data.
The resulting load history is shown in Figure 3.7.
Quick Fatigue Tool interpolates the thermal load so that it contains the correct number of data points
for the mechanical load to be superimposed, without resulting in trailing data. In using this technique,
the mechanical data is correctly represented as occurring over the same period as the thermal data.
The same loading methods apply as those outlined in Section 3.1.
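The superposition described above can be sketched as follows. This is a simplified Python illustration of the behaviour (tile the high frequency cycle t_low/t_high times, linearly interpolate the low frequency history onto the same number of points, then add); Quick Fatigue Tool's actual interpolation may differ in detail:

```python
def superimpose(low, high_cycle, t_low, t_high):
    """Superimpose a repeating high frequency cycle onto a low
    frequency history without leaving trailing data."""
    repeats = int(t_low / t_high)
    # Tile the HF cycle, dropping the duplicated join point
    hf = high_cycle[:]
    for _ in range(repeats - 1):
        hf += high_cycle[1:]
    n = len(hf)
    # Linearly interpolate the low frequency history onto n points
    lo = []
    for i in range(n):
        x = i * (len(low) - 1) / (n - 1)
        j = min(int(x), len(low) - 2)
        frac = x - j
        lo.append(low[j] * (1 - frac) + low[j + 1] * frac)
    return [a + b for a, b in zip(lo, hf)]

# Thermal (period 100 s) + mechanical cycle (period 10 s): 10 repeats
combined = superimpose([0, 1, 0, -1, 0], [0, 1, 0, 0.4, -1, 0], 100.0, 10.0)
```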
3.5.3 Example usage
An example input file containing a high frequency loading definition can be found in
Data\tutorials\intro_tutorial\tutorial_high_frequency.inp.
The job is submitted by executing the following command:
>> job tutorial_high_frequency
Results are written to Project\output\tutorial_high_frequency.
Figure 3.7: Result of using high frequency loading for the thermal-
mechanical load
3.5.4 Additional guidance
The user should take into account the following points when using high frequency loading:
- high frequency loading requires the Signal Processing Toolbox to work;
- high frequency loading should be used if two or more load histories occur over the same time
  interval, where one or more of the load histories is a repetitive load at a much higher frequency;
- the number of analysis items in the high frequency loading must be the same as the number
  of items in all other stress datasets. If specific items are listed in the job file, those same items
  will be used in the high frequency loading;
- if the original datasets contain stresses at shell faces, the high frequency data must also
  contain shell face data;
- the main load history should contain at least three data points, otherwise the high frequency
  loading may not be interpolated properly. If the original load history contains only two data
  points, a zero value will be appended to the end of the history;
- when defining the high frequency load history, only a single cycle needs to be defined, along
  with the period for that cycle. If the entire load history is provided, the resulting load history
  will be incorrect;
- if load history pre-gating is enabled, the original datasets may be modified prior to the high
  frequency loading being added. This may result in an unexpected load history;
- if the high frequency data is not in the form of a peak-valley sequence (the loading contains
  intermediate data between turning points), this data will not be considered by the selected
  gating criterion;
- the units of the high and low frequency data must be the same; and
- using high frequency loading can increase the analysis time dramatically.
3.6 The dataset processor
3.6.1 Determining the dataset type
Quick Fatigue Tool determines the type of dataset based on the number of columns in the dataset file.
The following table describes how datasets are processed.
  Number of columns    Assumption
  1                    One-dimensional stress; position unknown
  2                    One-dimensional stress; unique nodal or centroidal
  3                    One-dimensional stress; element-nodal or integration point
  4                    Plane stress; position unknown
  5                    Plane stress; unique nodal or centroidal
  6                    If PLANE_STRESS = 0.0 in the job file: 3D stress; position unknown
                       If PLANE_STRESS = 1.0 in the job file: plane stress; element-nodal or integration point
  7                    3D stress; unique nodal or centroidal
  8                    3D stress; element-nodal or integration point
  9                    Plane stress with shell face data; unique nodal or centroidal
  10                   Plane stress with shell face data; element-nodal or integration point
  > 10                 Not applicable; Quick Fatigue Tool will exit with an error
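The decision table above can be expressed as a short lookup. This is a Python sketch; the function name and return strings are illustrative, not part of the tool:

```python
def classify_dataset(n_columns, plane_stress=False):
    """Infer element type and position from the number of dataset
    columns, following the table above. `plane_stress` mirrors the
    PLANE_STRESS job file option for the ambiguous 6-column case."""
    table = {
        1: ("1D stress", "Unknown"),
        2: ("1D stress", "Unique nodal or centroidal"),
        3: ("1D stress", "Element-nodal or integration point"),
        4: ("Plane stress", "Unknown"),
        5: ("Plane stress", "Unique nodal or centroidal"),
        7: ("3D stress", "Unique nodal or centroidal"),
        8: ("3D stress", "Element-nodal or integration point"),
        9: ("Plane stress (shell faces)", "Unique nodal or centroidal"),
        10: ("Plane stress (shell faces)", "Element-nodal or integration point"),
    }
    if n_columns == 6:  # ambiguous: resolved by PLANE_STRESS
        if plane_stress:
            return ("Plane stress", "Element-nodal or integration point")
        return ("3D stress", "Unknown")
    if n_columns in table:
        return table[n_columns]
    raise ValueError("Unsupported number of columns")
```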
3.6.2 Plane stress elements
If the stress datasets originate from an FE model consisting of plane stress elements, the .rpt file can
contain results on both faces, as shown by Figure 3.8.
Quick Fatigue Tool can read the stresses from either the positive or negative face. The default face is
defined as a variable in the environment file.
Environment file usage:
  Variable         Value
  shellLocation    {1.0 | 2.0}
1. Negative (SNEG) element face
2. Positive (SPOS) element face
Figure 3.8: Positive normals for three-dimensional
conventional shell elements
3.6.3 Multiple element groups
If the stress dataset file was written from an Abaqus output database, it is possible for the data to be
separated into multiple regions. This can happen if there are multiple part instances in the model, or
if the model contains a mixture of element types. In such cases, Quick Fatigue Tool will automatically
process each region and concatenate the stress tensors after all the datasets have been read.
Common examples of when the dataset can contain multiple regions are the analysis of surfaces with
in-plane residual stress and/or surface finish. The recommended practice is to create a skin on the
surface of the FE model and apply the residual stress and surface finish definitions on the skin
elements using the GROUP option in the job file (see Sections 4.3 and 4.4 for more detailed
information). Because of the combination of plane stress elements forming the skin of the component
and the underlying 3D elements, Abaqus (and possibly other FEA packages) may assign duplicate node
numbers between the skin and the solid bulk.
Although the fatigue calculation is not affected by the presence of duplicate node numbers,
Quick Fatigue Tool may report the results at incorrect locations. Problems may also arise when writing
results back to an Abaqus .odb file because Quick Fatigue Tool is unable to resolve the correct location
of the node on the finite element mesh. Thus, the visualization in Abaqus/Viewer could be incorrect.
The workaround in such cases is to use stresses at the element nodes. This ensures that each node in
the model has a unique identifier even in the presence of multiple element regions.
4. Analysis techniques
4.1 Background
Quick Fatigue Tool provides a selection of analysis techniques which help you perform your analysis
more efficiently and effectively.
The techniques described in this section are specified in the job and environment files. For detailed
guidance on the usage of job file options and environment variables, consult the document
Quick Fatigue Tool User Settings Reference Guide.
4.2 Treatment of nonlinearity
4.2.1 Overview
Quick Fatigue Tool always assumes that the dataset stresses are linear elastic.
When a stress-based fatigue analysis algorithm is specified, the stresses are not corrected for the
effect of plasticity. This is because the stress-life methodology has been devised for linear elastic
material data.
When a strain-based fatigue analysis algorithm is specified, the stresses are corrected for the effect of
plasticity using the Ramberg-Osgood nonlinear elastic strain-hardening relationship. This function
approximates the local nonlinear stress from elastic stresses [8] [9].
Elastic stresses are converted to nonlinear elastic stresses using Equation 4.1:

  ε = σ/E + (σ/K′)^(1/n′)   [4.1]

Figure 4.1: Generic representation of the stress-strain
curve using the Ramberg-Osgood equation
The material constants K′ and n′ are the cyclic strain hardening coefficient and exponent, respectively.
The difference in the monotonic response between the elastic Hookean and the Ramberg-Osgood
model is shown in Figure 4.1. For the offset line, the plastic strain offset is defined as 0.002 (0.2%),
corresponding to the yield strength, σy.
4.2.2 Limitations of the nonlinear model
In order to convert between linear and nonlinear quantities, Quick Fatigue Tool uses Neuber’s Rule,
which stipulates that the accumulated strain energy is the same at a notch as it would be in an elastic
stress field far away.
The correction is therefore only valid for local notch plasticity. For smooth specimens, care must be
taken. If the stress surrounding the notch is not below the yield strength, stress redistribution occurs,
resulting in a larger plastic zone. Neuber's Rule does not work as well in these cases because it assumes
that the peak stress is localised and that the plastic zone is surrounded by a comparatively large elastic
zone, forcing the plastic zone to behave similarly to the nearby elastic stress field.
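The combination of Neuber's Rule with the Ramberg-Osgood curve can be sketched numerically. Given an elastic stress σe, Neuber's Rule requires σ·ε(σ) = σe²/E, with ε(σ) given by Equation 4.1; since the product increases monotonically with σ, the local stress can be found by bisection. This is a Python illustration with hypothetical material constants, not Quick Fatigue Tool's implementation:

```python
def neuber_correct(s_elastic, E, K, n, tol=1e-8):
    """Estimate the local (nonlinear) stress from an elastic FE stress.

    Neuber's Rule:   sigma * eps(sigma) = s_elastic**2 / E
    Ramberg-Osgood:  eps(sigma) = sigma/E + (sigma/K)**(1/n)

    K and n are the cyclic strain hardening coefficient and exponent.
    """
    target = s_elastic ** 2 / E
    lo, hi = 0.0, s_elastic  # local stress cannot exceed the elastic stress
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        eps = mid / E + (mid / K) ** (1.0 / n)
        if mid * eps < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Elastic stress of 500 MPa with E = 200 GPa, K' = 1000 MPa, n' = 0.2:
# the corrected local stress is lower than the elastic stress.
local = neuber_correct(500.0, 200.0e3, 1000.0, 0.2)
```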
The Stress-Life methodology is based on the elastic stress at a point on the component which is not
affected significantly by local stress concentrations. Traditionally, the elastic stress is used with an
S-N curve which has been corrected with a notch factor to account for the presence of the stress
concentration. As such, the Stress-Life methodology is not intended to be used with true stress
quantities. This causes problems when performing fatigue estimates from elastic FEA, because the
stresses at the notch are typically over-estimated. This can result in highly conservative life predictions
compared to Strain-Life methods. For cases where the stress concentration is judged to be significant,
the following is recommended in lieu of the plasticity correction:
1. Limit the fatigue analysis to the elastic stress a small distance away from the notch and apply
a correction factor using the KT_DEF, KT_CURVE, NOTCH_CONSTANT and NOTCH_RADIUS
job file options. Notch factors for specific geometries are readily available in the literature
2. Use the Strain-Life methodology
4.3 Surface finish and notch sensitivity
4.3.1 Surface finish
Surface roughness has a strong influence on the component's resistance to crack initiation [10]. While
FEA is able to account for stress concentrations which arise from geometric complexity, the effect of
surface finish cannot be modelled directly. Instead, a surface stress concentration factor, Kt, may be
used to scale the endurance curve so that it corresponds more accurately to the surface strength of
the material.
The result of applying a stress concentration factor to the endurance curve is shown in Figure 4.2.
Quick Fatigue Tool allows the surface finish to be defined in three ways:
1. As a surface stress concentration factor (Kt value)
2. From a list of surface finish types (Kt curve)
3. As a surface roughness value (Rz value)
Define the surface finish as a Kt value
Job file usage:
  Option      Value
  KT_DEF      <Kt>
  KT_CURVE    [ ]
where <Kt> is a value for the surface stress concentration factor, Kt.
Figure 4.2: Reference endurance curve before and after applying
a stress concentration factor
Define the surface finish as a Kt curve
To specify the surface finish from a list of surface finish types, a surface finish .kt file from the Data\kt
directory must be specified. The surface finish definition files contain pre-defined Kt curves for various
surface finishes, as a function of the material's ultimate tensile strength.
Job file usage:
  Option      Value
  KT_DEF      'surface-finish-file-name.kt'
  KT_CURVE    <n>
where 'surface-finish-file-name.kt' is the name of the .kt file containing a list of surface finish definitions,
and <n> is the curve number.
The file 'default.kt' is plotted in Figure 4.3 as an example. Based on the chosen curve and the ultimate
tensile strength of the material, Quick Fatigue Tool linearly interpolates to find the corresponding
value of Kt. If the material's ultimate tensile strength exceeds the range specified by the curve, the
last value of Kt is used.
Figure 4.3: Kt curves for various surface finishes, from the file 'default.kt'
The following .kt files and the available curves are shown below:

default.kt
  Curve    Surface finish
  1        Mirror Polished, Ra <= 0.25um
  2        0.25 < Ra <= 0.6um
  3        0.6 < Ra <= 1.6um
  4        1.6 < Ra <= 4um
  5        Fine Machined, 4 < Ra <= 16um
  6        Machined, 16 < Ra <= 40um
  7        Precision Forging, 40 < Ra <= 75um
  8        75um < Ra

juvinall-1967.kt
  Curve    Surface finish
  1        Mirror Polished
  2        Fine-ground or commercially polished
  3        Machined
  4        Hot-rolled
  5        As forged
  6        Corroded in tap water
  7        Corroded in salt water

rcjohnson-1973.kt
  Curve    Surface finish
  1        AA = 1uins
  2        AA = 2uins
  3        AA = 4uins
  4        AA = 8uins
  5        AA = 16uins
  6        AA = 32uins
  7        AA = 83uins
  8        AA = 125uins
  9        AA = 250uins
  10       AA = 500uins
  11       AA = 1000uins
  12       AA = 2000uins
It is possible to specify the surface finish from a user-defined .kt file. The following file format must be
obeyed:
  First column: Range of UTS values over which Kt is defined
  Second column: Kt values for the first curve
  Third column: Kt values for the second curve
  Nth column: Kt values for the (N-1)th curve

Define the surface finish as an Rz value
To specify the surface finish as a surface roughness (Rz) value, a surface roughness .ktx file from the
Data\kt directory must be specified. The surface roughness files contain Kt curves over a range of
roughness values, as a function of the material's ultimate tensile strength.
Job file usage:
  Option      Value
  KT_DEF      'surface-roughness-file-name.ktx'
  KT_CURVE    <Rz>
where 'surface-roughness-file-name.ktx' is the name of the .ktx file containing the Kt curves and <Rz>
is the surface roughness value, Rz.
The file 'Niemann-Winter-Rolled-Steel.ktx' is plotted in Figure 4.4 as an example. Quick Fatigue Tool
linearly interpolates to find the curve which corresponds to the user-specified Rz value. If the
surface roughness value exceeds the maximum surface roughness defined in the data, the last set of
Kt values is used.
Based on the ultimate tensile strength of the material, Quick Fatigue Tool linearly interpolates once
more to find the corresponding value of Kt. If the material's ultimate tensile strength exceeds the
range specified by the curve, the last value of Kt is used.
The following .ktx files and surface roughness ranges are shown below:

  File                                              UTS range       Rz range
  Niemann-Winter-Cast-Iron-Lamellar-Graphite.ktx    0 - 2000 MPa    1 - 200um
  Niemann-Winter-Cast-Iron-Nodular-Graphite.ktx     0 - 2000 MPa    1 - 200um
  Niemann-Winter-Cast-Steel.ktx                     0 - 2000 MPa    1 - 200um
  Niemann-Winter-Malleable-Cast-Iron.ktx            0 - 2000 MPa    1 - 200um
  Niemann-Winter-Rolled-Steel.ktx                   0 - 2000 MPa    1 - 200um

Figure 4.4: Kt curves for various surface roughness values, from the file 'Niemann-Winter-Rolled-Steel.ktx'
It is possible to specify the surface finish from a user-defined .ktx file. The following file format must
be obeyed:
  First row: Range of Rz values over which Kt values are defined
  Second row: First UTS value, followed by the corresponding Kt values for each Rz value
  Third row: Second UTS value, followed by the corresponding Kt values for each Rz value
  Nth row: Nth UTS value, followed by the corresponding Kt values for each Rz value
4.3.2 Effect of notch sensitivity
If the component contains a notch, the stress-life curve may require modification to account for the
notch sensitivity of the material, since the stresses which contribute to fatigue of a notched
component are not on the notch root surface, but a small distance into the subsurface. As such, for
notch-insensitive materials, the stress concentration factor in fatigue is different to the elastic stress
concentration factor, Kt. This is termed the fatigue notch factor, Kf.
The value of Kf can be approximated in several ways, and is set in the environment file.
Environment file usage:
  Variable                 Value
  notchFactorEstimation    {1.0 | 2.0 | 3.0 | 4.0 | 5.0 | 6.0}
1. Peterson (default)
2. Peterson B
3. Neuber
4. Harris
5. Heywood
6. Notch sensitivity
Peterson (default)
Quick Fatigue Tool is optimized to use stresses from finite element analysis. Therefore, the effect of
Kt is implicit in the stress solution. By default, the reduction in fatigue strength due to elastic stress
concentration is calculated using results obtained by Peterson. Equation 4.4 is used to scale the value
of Kt as a function of endurance [11].
The endurance curve is scaled at each value of N. The Peterson relationship is visualized
in Figure 4.5 for several values of Kt.
Peterson B
Peterson observed that, in general, good approximations can be obtained using
Equation 4.5 [12]:

  Kf = 1 + (Kt - 1)/(1 + a/r)   [4.5]

where a is a characteristic length and r is the notch root radius. The value of a can be determined
empirically as a function of the ultimate tensile strength, σu:
Material
Steel


Aluminium Alloy

Neuber
For parallel side grooves, Neuber developed the following approximate formula for the notch factor [13]:

  Kf = 1 + (Kt - 1)/(1 + sqrt(a′/r))   [4.6]

where a′ is a characteristic length.
Figure 4.5: Kf as a function of endurance
Harris
Harris proposed the relationship in Equation 4.7 [14],
where a is a characteristic length. Suggested values of a are shown in the table below:
Material

Steel


Aluminium Alloy

Heywood
Heywood proposed the relationship in Equation 4.8 [15],
where a is a characteristic length. The value of a varies depending on the notch type. Typical values
proposed for steel are shown in the table below. The values assume that the notch root radius is
measured in inches.
Notch Type
Hole

Shoulder

Groove

Notch sensitivity
The fatigue notch factor can be defined in terms of the notch sensitivity, q:

  Kf = 1 + q(Kt - 1)

Typical values of q for steels and aluminium alloys are shown in Figures 4.6-4.7 for bending and torsion,
respectively [16].
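The standard notch sensitivity relation, Kf = 1 + q(Kt − 1), can be sketched as follows (a Python illustration; the function name is hypothetical):

```python
def fatigue_notch_factor(kt, q):
    """Fatigue notch factor from the elastic stress concentration
    factor Kt and the notch sensitivity q: Kf = 1 + q*(Kt - 1).

    q = 0 gives a fully notch-insensitive material (Kf = 1);
    q = 1 gives a fully notch-sensitive material (Kf = Kt)."""
    return 1.0 + q * (kt - 1.0)

# Kt = 3.0 with q = 0.8 gives Kf = 2.6
kf = fatigue_notch_factor(3.0, 0.8)
```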
Figure 4.6: Notch sensitivity factors for steels and aluminium alloys (bending)
Figure 4.7: Notch sensitivity factors for steels and aluminium alloys (torsion)
Defining notch parameters
The notch characteristic length and the notch root radius are defined in the job file.
Job file usage:
  Option            Value
  NOTCH_CONSTANT    <a>
  NOTCH_RADIUS      <r>
The notch constant <a> is defined according to the table below.
Notch factor estimation method
Notch constant
Peterson (default)

Peterson B
Neuber
Harris

Heywood
Notch sensitivity
The notch root radius defines the parameter r in Equations 4.5-4.8.
Specifying notch factors for FEA stresses
Since Quick Fatigue Tool assumes that the FEA stresses account for the effects of geometry, the default
meaning of Kt is that of a supplementary factor describing surface finish effects which cannot
easily be modelled in finite elements. To that end, the user must be careful when considering the
inclusion of the fatigue notch factor if the stresses originate from FEA.
If the S-N data is produced from smooth (un-notched) specimens then the nominal stress is equal to
the local stress, as there is no stress gradient effect. In this case, the fatigue notch factor is not required
since the FE solution provides the local stress at the notch surface. However, if the S-N data originated
from a notched test, then the FEA stresses at the notch will produce excessively conservative fatigue
life predictions. In such cases, the user must follow the procedure outlined by their chosen stress-life
guideline in order to estimate the pseudo nominal stress a certain distance away from the notch tip,
and use this stress on the notched S-N curve instead.
The fatigue notch factor is estimated from the value of Kt.
Job file usage:
  Option    Value
  KT_DEF    <Kt>
where <Kt> is not the surface finish factor, but the elastic stress concentration factor. By defining the
notch sensitivity constant and the notch root radius, the corresponding notch sensitivity is estimated.
The stresses (or the S-N curve) will be scaled in the same way as is done with a standard surface finish
definition (Figure 4.2).
4.3.3 Modelling guidance
Specifying Kt on a material surface
Unless analysis groups are defined, the surface finish definition is applied to every analysis item in the
model. This behavior is incorrect if the model contains subsurface nodes. If subsurface nodes are being
analysed, the recommended practice is to define a skin on the finite element model and apply the
surface finish definition to the skin using a separate analysis group.
Job file usage:
  Option    Value
  GROUP     {'skin-group-file-name.*', 'DEFAULT'}
The procedure for creating analysis groups is described in Section 4.7. The surface finish is then
defined in the job file.
Job file usage:
  Option      Value
  KT_DEF      {'surface-finish/roughness-file-name.kt/ktx', [ ]}
  KT_CURVE    <n>
Specifying Kf directly
If the fatigue test data was measured for a smooth specimen but the component contains a notch, the
user can specify the fatigue notch factor directly if it is already known. By assuming that the material
is fully notch-sensitive (q = 1), then Kf = Kt. This means that the surface finish factor can be used as
the notch sensitivity factor.
Environment file usage:
  Variable                 Value
  notchFactorEstimation    6.0
Job file usage:
  Option            Value
  KT_DEF            <Kf>
  KT_CURVE          [ ]
  NOTCH_CONSTANT    1.0
Figure 4.8: Fatigue results generated by testing plain and V-notched cylindrical specimens of S690 steel under
rotating bending [17]
Take Figure 4.8 as an example. This data was obtained by Susmel [17], and shows the S-N curves for a
notched and a smooth specimen. Using the data at the endurance limit (two million cycles), the value
of Kf is estimated to be 309.1/172.6 = 1.791. For a load ratio of R = -1,
a cycle with a stress amplitude of 309.1 MPa will produce the same life on a smooth specimen as a
cycle with a stress amplitude of 172.6 MPa on the notched specimen.
This can be verified in Quick Fatigue Tool by defining the smooth specimen material data with the
following S-N data points:






The two definitions below should result in a fatigue life of two million cycles. The Uniaxial Stress-Life
algorithm is used for both definitions. For the second definition, the notch sensitivity is used as the
fatigue notch factor estimation method.
Definition A
Job file usage:
  Option            Value
  HISTORY           [309.1, -309.1]
  KT_DEF            1.0
  NOTCH_CONSTANT    [ ]
Definition B
Job file usage:
  Option            Value
  HISTORY           [172.6, -172.6]
  KT_DEF            1.791
  NOTCH_CONSTANT    1.0
4.4 In-plane residual stress
4.4.1 Overview
Many engineering components exhibit surface residual stresses due to their manufacturing process.
Compressive residual stresses are often introduced by design. However, tensile residual stresses can
accelerate fatigue damage accumulation.
An in-plane residual stress component can be specified directly in the job file.
Job file usage:
  Option      Value
  RESIDUAL    <stress value>
The stress is assumed to act uniformly in all directions and is added to the mean stress of each cycle.
Unless analysis groups are defined, the residual stress definition is applied to every analysis item in
the model. This behavior is incorrect if the model contains subsurface nodes. If subsurface nodes are
being analysed, the recommended practice is to define a skin on the finite element model and apply
the residual stress to the skin using a separate analysis group.
Job file usage:
  Option    Value
  GROUP     {'skin-group-file-name.*', 'DEFAULT'}
The procedure for creating analysis groups is described in Section 4.7. The residual stress is then
defined in the job file.
Job file usage:
  Option      Value
  RESIDUAL    [<stress value>, 0.0]
Since the residual stress is applied directly to the fatigue cycle and assumes the orientation of the
critical plane, the quantity cannot be visualized with field output. For example, the variable SMAX does
not include the effect of residual stress. Furthermore, the residual stress is not considered by nodal
elimination.
4.4.2 Limitations
Definition of residual stress is not compatible with the BS 7608 algorithm. If residual stress is defined
with Findley's Method, the stress is added to the cycle during the damage calculation rather than to
the mean stress.
4.5 Analysis speed control
4.5.1 Pre-processing time histories
Load histories often contain data which does not contribute to fatigue and can have a spurious effect
on the fatigue calculation. Take the signal in Figure 4.9 as an example.
Analysing signals which contain small cycles puts additional computational load on the cycle counting
algorithm and can impact the analysis time significantly, without improving the estimate of the fatigue
damage. Quick Fatigue Tool offers three procedures for removing redundant cycles.
1. Pre-gate load histories
2. Gate tensors
3. Noise reduction
Quick Fatigue Tool assumes that there is no correlation between the phase of the load histories and
the damage parameter on the critical plane. Therefore, the default behavior is to gate the stress
tensors as this is the most reliable method.
Figure 4.9: Noisy test signal
Pre-gate load histories
When load history pre-gating is enabled, the load history from each channel is converted into a
separate peak-valley sequence before being combined with the other channels.
Environment file usage:
  Variable        Value
  gateHistories   {0.0 | 1.0 | 2.0}
  historyGate     [g1, g2,..., gn]
where n is the number of loading channels. The gating values are defined as the percentage of the
maximum component in the load history.
Figure 4.10 shows the result of gating the noisy signal. The pros and cons
of load history gating are listed below.
Pros:
- load history gating is performed once for each loading channel before the start of the analysis,
  so it is very fast; and
- for simple loadings, load history gating has a similar accuracy to tensor gating.
Cons:
- fatigue results may be in error if the loading consists of multiple histories. The method is
  applied separately to each history, so there is no guarantee that the phase relationship
  between the loading channels is maintained.
Figure 4.10: Filtered signal
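The pre-gating idea, discarding reversals smaller than a percentage of the largest value in the history, can be sketched as follows. This is a simplified Python illustration; the tool's gating algorithm is more sophisticated:

```python
def gate_history(history, gate_pct):
    """Keep only points whose change from the last retained point
    exceeds gate_pct percent of the largest absolute value in the
    history; small excursions are discarded."""
    threshold = gate_pct / 100.0 * max(abs(p) for p in history)
    out = [history[0]]
    for p in history[1:]:
        if abs(p - out[-1]) >= threshold:
            out.append(p)
    return out

# A 0.02 wiggle near the peak is removed with a 5% gate
gated = gate_history([0, 1, 0.98, -1, 0], 5.0)
```

This hysteresis-style filter keeps only excursions larger than the threshold; a true peak-valley extraction, as performed before cycle counting, additionally removes intermediate points between turning points.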
When using pre-gated load histories with multiple load channels, the user should compare the change
in life values between the gated and the original signal. If the difference is significant, load history pre-
gating should be disabled.
When load history pre-gating is enabled and Quick Fatigue Tool detects a constant amplitude loading
condition, then only the first cycle is analysed. This avoids invoking the rainflow cycle counting
algorithm unnecessarily. The number of repeats is automatically adjusted to account for the additional
cycles, according to Equation 4.10:

  Rt = Rj × Rh   [4.10]

The total number of repeats, Rt, is the product of the number of repeats defined by the REPEATS job
file option, Rj, and the number of repeats in the load history, Rh.
A history point is considered to be constant amplitude with respect to the previous point provided that
the two points lie within the user-defined tolerance specified by the historyGate environment variable.
If multiple gating values are specified, then the first value will be used.
Gate tensors
When tensor gating is enabled, the original load history is used to determine the principal stress
history and the damage parameter on the critical plane. The damage parameter is then converted into
a peak-valley sequence prior to cycle counting.
Environment file usage:
  Variable      Value
  gateTensors   {0.0 | 1.0 | 2.0}
  tensorGate    [g1, g2,..., gn]
where n is the number of loading channels. The gating values are defined as the percentage of the
maximum component in the stress tensor which defines the damage parameter.
The pros and cons of tensor gating are listed below.
Pros:
- the most accurate method of gating. Since the damage parameter accounts for the
  combination of the fatigue loading, the phase relationship between the loading channels is
  always maintained.
Cons:
- the damage parameter is gated per plane, per node, thus tensor gating is much slower than
  pre-gated load histories.
If it is necessary to remove intermediate data (points which lie between peaks and valleys), but
additional gating is not required, the gate value can be set to zero. Quick Fatigue Tool will use a zero
derivative method and all peak-valley pairs will be retained.
Environment file usage:
Variable
Value
tensorGate
0.0
historyGate
0.0
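The zero-derivative method described above can be sketched in Python. The function below is a hypothetical helper, not part of Quick Fatigue Tool, and the threshold definition (a fraction of the absolute signal maximum) is a simplifying assumption:

```python
def peak_valley(signal, gate=0.0):
    """Reduce a load history to its turning points. With gate == 0.0 the
    zero-derivative method is used and every peak-valley pair is retained;
    a positive gate additionally discards reversals smaller than
    gate * max(abs(signal))."""
    turning = [signal[0]]
    for i in range(1, len(signal) - 1):
        # A sign change of the derivative marks a peak or a valley
        if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0:
            turning.append(signal[i])
    turning.append(signal[-1])
    if gate > 0.0:
        threshold = gate * max(abs(x) for x in signal)
        gated = [turning[0]]
        for x in turning[1:]:
            if abs(x - gated[-1]) >= threshold:
                gated.append(x)
        turning = gated
    return turning
```

With gate=0.0, points lying between reversals are removed but no reversal is lost; a positive gate removes small reversals as well.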
Quick Fatigue Tool includes an alternative peak-valley analysis algorithm, written by Adam Nieslony.
This may be used in cases where the gating criterion is not known. The recommended practice is to
use a gating method.
Environment file usage:
Variable
Value
gateHistories
2.0
gateTensors
2.0
Noise reduction
It is possible to apply noise reduction to the load histories prior to analysis. Quick Fatigue Tool uses a
low-pass filter which removes spikes in the data.
Environment file usage:
Variable
Value
noiseReduction
{0.0 | 1.0}
numberOfWindows
n
where n is the number of averaging segments used to filter the signal.
Care should be taken when using the low-pass filter, since the stress amplitude is always reduced. This
may result in excessively optimistic fatigue life results. Noise reduction should not be used as an
alternative to gating, and its use is not recommended in general unless the load signal is highly affected
by measurement noise.
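One plausible reading of the windowed averaging is sketched below. Deriving the moving-average width from numberOfWindows in this way is an assumption for illustration, not the tool's documented formula:

```python
def noise_filter(signal, number_of_windows):
    """Low-pass the record with a moving average whose width is the record
    length divided by number_of_windows (the assumed meaning of the
    averaging-segment count). Fewer windows means wider averaging and
    more aggressive smoothing."""
    width = max(len(signal) // number_of_windows, 1)
    filtered = []
    for i in range(len(signal)):
        lo = max(0, i - width // 2)
        hi = min(len(signal), i + width // 2 + 1)
        filtered.append(sum(signal[lo:hi]) / (hi - lo))
    return filtered
```

Note how an isolated spike is flattened by the surrounding average, which is exactly why the stress amplitude, and hence the predicted damage, is always reduced.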
4.5.2 Determining load proportionality
If the direction of the principal stress does not change during the loading, then the stresses are
proportional and critical plane searching is not necessarily required.
Quick Fatigue Tool automatically checks the model for proportional loading before the start of the
analysis and skips critical plane searching at these regions for certain algorithms.
Environment file usage:
Variable
Value
checkLoadProportionality
{0.0 | 1.0}
The principal stress history of the loading is calculated at each location in the model, along with the
orientation of the first principal stress. If the largest change of the angle of the first principal stress
does not exceed the specified tolerance, the loading is assumed to be proportional.
Environment file usage:
Variable
Value
proportionalityTolerance
<tolerance>
The purpose of defining a tolerance is to account for solution noise which may cause the direction of
the principal stresses to “wobble”, even if the loading is theoretically proportional.
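The proportionality check can be illustrated for the plane-stress case. The 2-D principal-direction formula is standard; the function itself is a sketch rather than the tool's implementation, which works with the full stress tensor:

```python
import math

def is_proportional(history, tolerance_degrees):
    """history: (sxx, syy, sxy) plane-stress tensors over the load history.
    The loading is treated as proportional if the direction of the first
    principal stress never deviates by more than the tolerance, which
    absorbs small solution-noise 'wobble'."""
    angles = [0.5 * math.degrees(math.atan2(2.0 * sxy, sxx - syy))
              for sxx, syy, sxy in history]
    return (max(angles) - min(angles)) <= tolerance_degrees
```

Scaling a single tensor up and down leaves the angle unchanged (proportional), whereas rotating the load path between channels does not.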
Load proportionality checking is compatible with the following fatigue analysis algorithms:
Stress-based Brown-Miller
Normal Stress
BS 7608
4.5.3 Specifying the analysis region
It is possible to restrict the analysis to a specific region of the model. For very large models, it may be
economical to perform the analysis only at the locations where fatigue failure is likely to occur.
Alternatively, if the location of fatigue failure has already been determined by a previous analysis, an
additional analysis may be performed at this location with additional output and/or more rigorous
critical plane searching.
There are five ways to specify the analysis region:
1. Whole model
2. ODB element surface
3. Maximum principal stress range
4. Hotspot
5. User-defined
Option
Value
ITEMS
{'ALL' | 'SURFACE' | 'MAXPS' | 'file-name.*' | [id1,…, idn]}
Whole model
When ITEMS='ALL', the whole model is used as the analysis region. If the DATASET option is not used
for analysis, the ITEMS option is ignored.
ODB element surface
When ITEMS='SURFACE', the analysis region is restricted to items on the element free surface of the
model's Abaqus .odb file. An element is considered to be on the surface if it has at least one node on
the surface. This option is useful in the majority of cases, where fatigue cracks initiate on the
component surface. If subsurface cracks are likely to initiate, ITEMS='ALL' should be used instead.
Surface detection requires both the .odb file and part instance name to be specified.
Job file usage:
Option
Value
OUTPUT_DATABASE
' model-odb-file-name.odb '
PART_INSTANCE
'part-instance-name'
The specified part instances should match those defined in the stress dataset files. If the user specifies
part instances which do not exist in the dataset, the surface detection will not take effect. The element
surface is read according to the user-specified result position.
Job file usage:
Option
Value
RESULT_POSITION
{'ELEMENT NODAL' | 'UNIQUE NODAL' | 'CENTROID'}
The user should ensure that the specified result position matches the position of the stress datasets.
Surface detection is not supported for integration point results.
If there is no .odb file specified, the whole model is used as the analysis region. ODB element surface
detection is not supported for uniaxial analysis methods.
If the model output database contains shell elements, Quick Fatigue Tool treats the element surface
as the entire shell. Alternatively, the user can specify to treat the element surface as free shell faces.
Environment file usage:
Variable
Value
shellFaces
{0.0 | 1.0}
The difference between these two surface definitions is shown in Figure 4.11.
The surface detection algorithm supports most Abaqus elements with displacement degrees of
freedom. A list of compatible elements is found in Appendix IV “List of supported elements for surface
detection” of the document Quick Fatigue Tool Appendices.
If the stress dataset contains element-nodal or centroidal stress data, Quick Fatigue Tool only searches
the elements in the .odb file which are defined in the dataset. If the dataset contains unique nodal
stress data, the entire part instance is always included by the surface detection algorithm. This
behavior is controlled by the searchRegion environment variable. Values of 0 and 1 indicate that
elements from the stress dataset(s) or the entire part instance will be included by the surface
detection algorithm, respectively.
Figure 4.11: 4-node shell elements whose free faces are shown by dashed faces. Surface
nodes are indicated by solid black circles. (L) Treat shell surface as whole shell. (R) Treat
shell surface as free shell faces.
Environment file usage:
Variable
Value
searchRegion
{0.0 | 1.0}
When the search region is limited to dataset elements, Quick Fatigue Tool treats the dataset as the
entire model. This is the preferred method because it is much faster in cases where the part instance
contains a very large number of elements compared to the stress dataset. However, element
boundaries which exist in the dataset may, in reality, be attached to elements in the .odb file which
do not constitute a free surface. Therefore, when searchRegion=0.0, the surface detection algorithm
may overestimate the number of elements on the surface.
If more than one part instance is specified with PART_INSTANCE, then all elements in each part
instance are always included by the surface detection algorithm.
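Free-surface detection of this kind is commonly implemented by counting face occurrences. The sketch below shows that standard technique, under the assumption that a face referenced by exactly one element is free; it is not Quick Fatigue Tool's actual code:

```python
from collections import defaultdict

def surface_elements(elements):
    """elements: {element_label: list of faces}, each face a tuple of node
    labels. A face shared by exactly one element is a free face; an element
    with at least one free face lies on the model surface."""
    face_count = defaultdict(int)
    for faces in elements.values():
        for face in faces:
            face_count[frozenset(face)] += 1  # orientation-independent key
    surface = set()
    for label, faces in elements.items():
        if any(face_count[frozenset(f)] == 1 for f in faces):
            surface.add(label)
    return surface
```

This also explains the overestimation noted above for searchRegion=0.0: a face whose neighbour element is absent from the dataset is counted only once and therefore appears free.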
Whenever Quick Fatigue Tool reads the surface of an .odb file, the surface items for the current model
and specified part instances are written to the surface definition file
'Data\surfaces\<model-ID>_surface.dat'. At the start of each job, if ITEMS='SURFACE', the surface
definition file is used to extract the model surface instead of the .odb file; this is much more time-
efficient when running the same job in multiple configurations. This behavior is set with the following
environment variable:
Environment file usage:
Variable
Value
surfaceMode
{0.0 | 1.0}
Maximum principal stress range
When ITEMS='MAXPS', the analysis region is restricted to the item with the largest principal stress
range. This allows the user to quickly identify an analysis item that is likely to reside in one of the
damage hotspots in the model.
This setting can be useful for very large models where even a simple fatigue analysis (such as Stress
Invariant Parameter) could take a long time to complete. Using the 'MAXPS' option allows the user to
select a more rigorous analysis algorithm on a single analysis item without first having to run a whole
model analysis. This option is not intended to replace a complete analysis, however; the location of
maximum stress range does not necessarily coincide with the location of maximum fatigue damage.
Therefore, the 'MAXPS' option should be used with great care if the loading is non-proportional.
Even if the job contains non-default group definitions, Quick Fatigue Tool will still search the whole
model as defined by the DATASET option. If the item with the largest principal stress range exists
outside any of the groups, the analysis will be aborted. This is because the code cannot determine
valid properties for an item which does not have any material or analysis data associated with it.
Items listed in a text file
When ITEMS='file-name.*', the analysis region is restricted to items defined in a text file. This file may
be user-defined, or it may be created automatically with the HOTSPOT job file option:
Job file usage:
Option
Value
HOTSPOT
{0.0 | 1.0}
Quick Fatigue Tool will save a list of items whose lives fall below the design life (specified by the
DESIGN_LIFE job file option) to 'hotspots.dat' in the folder Project\output\<jobName>\Data Files.
Additionally, any of the following files generated by Quick Fatigue Tool may be specified:
'<model-ID>_surface.dat', 'warn_lcf_items.dat', 'warn_overflow_items.dat' or
'warn_yielding_items.dat'.
User-defined item ID list
When ITEMS=[,…, ], the analysis region is restricted to the item IDs  to . The item numbers
correspond to rows in the stress dataset file.
Additional guidance
Consider the FEA definition in Figure 4.12. If ITEMS=16, the analysis will only consider the 16th item in
the definition. Note that if the definition file contains a header, the value of ITEMS will not correspond
to the line number in the file.
After each analysis, Quick Fatigue Tool writes to the message file the ID(s) of the item(s) in the stress
dataset(s) with the worst life. This value can be used in conjunction with the ITEMS option to re-run
the analysis at the worst location in the model.
Figure 4.12: Example FEA definition file
4.5.4 Nodal elimination
Introduction
The analysis time can be reduced by ignoring analysis items whose maximum stress is unlikely to cause
damage. When the nodal elimination algorithm is enabled, Quick Fatigue Tool checks the maximum
principal stress range at each item before beginning the analysis. If the stress is below a certain
percentage of the conditional stress, σc, then the item is not included for analysis. The value of
σc is calculated as a function of the target life.
Nodal elimination is enabled from the environment file.
Environment file usage:
Variable
Value
nodalElimination
{0.0 | 1.0 | 2.0}
If nodalElimination=0.0, nodal elimination is disabled.
If nodalElimination=1.0, the target life is taken as the material's constant amplitude endurance limit
(CAEL) by default. The conditional stress is then the stress amplitude which will result in a life of
CAEL cycles.
The conditional stress may be specified directly if a user-defined endurance limit is specified in the
environment file.
Environment file usage:
Variable
Value
enduranceLimitSource
3.0;
userEnduranceLimit
;
If nodalElimination=2.0, the target life is taken as the life defined by the option DESIGN_LIFE in the
job file. By default, DESIGN_LIFE='CAEL', in which case the target life is taken from the value of the
CAEL defined in the material.
For example, if an analysis is run with the Stress-based Brown-Miller algorithm and
nodalElimination=1.0, the conditional stress is obtained from Equation 4.11:
σc = σa(Nf = CAEL) [4.11]
where σc is the conditional stress at which the life is equal to the constant amplitude endurance
limit, CAEL. The analysis item is then removed if the following inequality is satisfied:
ΔσP,max < κ·σc [4.12]
where ΔσP,max is the maximum difference between the first and third principal stresses in the
loading and κ is the elimination threshold scale factor.
Scaling the conditional stress with the threshold scale factor
If the fatigue loading contains only one cycle, then the conditional stress can be determined directly
from the fatigue limit. However, if the loading contains multiple cycles then a finite life is possible
even if the majority of the cycles are below the fatigue limit. Consequently, it is necessary to use a
reduced value of the conditional stress such that the nodal elimination algorithm is effective for
complex loads.
The elimination threshold scale factor is set in the environment file.
Environment file usage:
Variable
Value
thresholdScalingFactor
κ
The default value of κ is 0.8 (80%).
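Putting Equation 4.12 together, the elimination test reduces to a threshold comparison. This sketch assumes the maximum principal stress range has already been computed for each item:

```python
def eliminate_items(principal_ranges, conditional_stress, scale_factor=0.8):
    """principal_ranges[i]: the maximum difference between the first and
    third principal stresses at item i over the loading. Items whose range
    falls below scale_factor * conditional_stress satisfy inequality 4.12
    and are excluded from the fatigue analysis."""
    threshold = scale_factor * conditional_stress
    return [i for i, rng in enumerate(principal_ranges) if rng < threshold]
```

With a conditional stress of 200 MPa and the default 0.8 scale factor, any item whose principal stress range stays below 160 MPa would be skipped.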
4.5.5 Principal stress calculation
Quick Fatigue Tool can use the built-in function eig() to determine the principal stress history for the
fatigue loading. Since this function only accepts two-dimensional data, the calculation can be very
time-consuming. This is because the principal stresses are calculated separately for each point in the
load history.
For very large models, the three-dimensional Eigensolver written by Bruno Luong is recommended.
This method calculates the principal stresses for the entire load history in a single calculation for each
analysis item, resulting in a much faster calculation.
Environment file usage:
Variable
Value
eigensolver
{1.0 | 2.0}
Luong’s method is used by default. Small numerical round-off errors in the calculation may cause
Quick Fatigue Tool to report a very small mean stress in the results, even if the loading has no mean
stress; this does not affect the fatigue life result.
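The speed advantage of a batch eigensolution can be illustrated with numpy, whose eigvalsh accepts stacked matrices and solves all history points in one call. This mirrors, but does not reproduce, Luong's MATLAB eigensolver:

```python
import numpy as np

def principal_stress_history(tensors):
    """tensors: array of shape (n_points, 3, 3), one Cauchy stress tensor
    per load-history point. eigvalsh operates on the whole stack at once,
    avoiding a per-point loop; eigenvalues are returned ascending, so the
    columns are reversed to give s1 >= s2 >= s3."""
    eigenvalues = np.linalg.eigvalsh(tensors)
    return eigenvalues[:, ::-1]
```

Because the stack is solved in floating point, a nominally zero-mean loading can acquire a tiny spurious mean stress from round-off, consistent with the behaviour noted above.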
4.6 Analysis groups
4.6.1 Overview
Analysis groups are used to define regions in the model having distinct properties. Analysis groups can
have their own definitions for the following job file options:
Property
Job file option
Material properties
MATERIAL
S-N data scale factor
SN_SCALE
S-N knock-down curves
SN_KNOCK_DOWN
Fatigue Reserve Factor envelope definition
FATIGUE_RESERVE_FACTOR
Surface finish definition
KT_DEF
Surface finish curve
KT_CURVE
Residual stress
RESIDUAL
Notch sensitivity constant
NOTCH_CONSTANT
Notch root radius
NOTCH_RADIUS
Analysis groups can have their own definitions for the following environment variables:
Property
Environment variable
Goodman envelope definition
modifiedGoodman
Goodman mean stress limit
goodmanMeanStressLimit
User-defined Walker parameter
userWalkerGamma
User-defined fatigue limit
userFatigueLimit
User FRF tensile mean stress normalization
parameter
frfNormParamMeanT
User FRF compressive mean stress
normalization parameter
frfNormParamMeanC
User FRF stress amplitude normalization
parameter
frfNormParamAmp
Analysis groups can be defined in two ways:
1. An item ID list
2. An FEA subset
An item ID list is a list of indexes corresponding to the row numbers in the stress dataset(s) as they are
defined in the file. An FEA subset is a list of position IDs referencing the location of the elements,
nodes, centroids or integration points in the model.
Analysis groups are declared in the job file.
Job file usage:
Option
Value
GROUP
{'group-file-name-1.*',…, 'group-file-name-n.*'}
Groups are defined as ASCII text files.
4.6.2 Defining analysis groups as an item ID list
Consider the stress dataset file in Figure 4.13.
The dataset consists of two linear elements (eight-node hexahedra). The dataset can be split
between two analysis groups, each defining one of the elements in the dataset. This can be achieved
by creating an item ID list defining each element. For example, the ID list for the first element is the
integer series from 1 to 8, while the second element is defined as the integer series from 9 to 16.
These are simply the row numbers corresponding to the nodes of the two elements. Example text file
contents for each group are given below.
Figure 4.13: Example stress dataset
‘element 1.txt’
‘element 2.txt’
1
9
2
10
3
11
4
12
5
13
6
14
7
15
8
16
4.6.3 Defining analysis groups as an FEA subset
Item ID lists are a direct and relatively simple way of defining analysis groups for small models.
However, consider the model of an excavator arm shown by Figure 4.14.
In this instance, creating groups for regions A, B and C with an item ID list would be a very cumbersome
task. Instead, FEA subsets can be used which define the groups by their element labels. In Abaqus, this
is easily achieved by creating a display group of the region of interest, then exporting the element
labels as an .rpt file. The process of generating such a file is exactly the same as for generating FEA
stress datasets, and is discussed in detail in Section 3.2.
Figure 4.14: Finite element model of an excavator arm with two regions of interest
For example, consider again the dataset in Figure 4.13. An FEA subset may then resemble that shown
in Figure 4.15.
The group is literally a subset of the element labels from the original dataset. When using FEA subsets,
the following guidance should be observed:
1. The results position between the FEA subset and the original dataset must agree, i.e. if the
original dataset uses element-nodal position labels, then so should the FEA subset
2. Including field data with the FEA subset is not compulsory. However, since Abaqus requires at
least one field to be exported, FEA subsets written by Abaqus/Viewer will always contain at
least one column of field data
It is quite possible that a group defined as an FEA subset will reference more than one element with
the same position label. In order for the group definition to be correct, Quick Fatigue Tool must
somehow match the ID to the correct location in the FE model. In the event that duplicate IDs are
found in a group, field data in the group definition can be used to check the stress tensor of the
duplicate IDs against the tensors in the original stress dataset. Therefore, it is recommended that FEA
subsets retain the same field information as the original datasets.
Quick Fatigue Tool automatically attempts to resolve duplicate IDs based on the availability of the field
data; no additional intervention is required by the user. However, if the loading is defined as either a
multiple scale and combine or as a dataset sequence (the DATASET job file option was specified with
more than one argument), then the field data from the group file must agree with the last dataset in
the loading.
4.6.4 Arranging groups in the job file
Both item ID lists and FEA subsets can be used together for an analysis, and the groups do not have to
be mutually exclusive. Quick Fatigue Tool will process each group as they are encountered in the
GROUP job file option and will exclude the items so that they will not be read a second time if they
appear in subsequent groups. Therefore, the order in which groups are defined will affect which group
the items are assigned to if any of the definitions are inclusive of each other. The hierarchical nature
of the group reading process is illustrated by Figure 4.16.
Figure 4.15: FEA subset of the dataset in Figure 4.13
Group 1 is a subset of Group 2, which in turn is a subset of Group 3. Consider the case where the
groups are defined in the job file.
Job file usage:
Option
Value
GROUP
{'group-1.*', 'group-2.*', 'group-3.*'}
Quick Fatigue Tool will first read Group 1. All items from Group 1 will be read into the group definition,
and excluded from re-definition in later groups. Hence, Group 2 will include all of its items except
those already belonging to Group 1. The same logic will apply to Group 3, which will have all items
from Group 2 and Group 1 excluded from its definition.
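This first-come, first-served assignment can be sketched as follows; the helper is illustrative, with read_ids standing in for whatever routine reads a group file:

```python
def assign_groups(group_files, read_ids):
    """Process groups in the order they appear in the GROUP option. An item
    claimed by an earlier group is excluded from all later groups, so
    overlapping definitions resolve in favour of the first group listed."""
    claimed = set()
    assignment = {}
    for name in group_files:
        ids = [i for i in read_ids(name) if i not in claimed]
        assignment[name] = ids
        claimed.update(ids)
    return assignment
```

For nested groups such as Figure 4.16, each group therefore ends up with only the items not already taken by its predecessors.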
The groups defined in the excavator model from Figure 4.14 could be defined in the job file.
Job file usage:
Option
Value
GROUP
{'region-A.*', 'region-B.*'}
Quick Fatigue Tool will analyse all of Region A and Region B. None of the items from Region C will be
analysed. In this case, the order does not matter since the groups are mutually exclusive.
If all three regions are to be analysed, but only Region C requires individual properties, the following
group definition may be used.
Job file usage:
Option
Value
GROUP
{'region-C.*', 'DEFAULT'}
Figure 4.16: Group hierarchy
In this instance, Quick Fatigue Tool will analyse Region C with individual properties for that group,
followed by all other items in the model. In general, use of the 'DEFAULT' parameter instructs the
program to analyse all remaining items in the model (all other items in the original dataset which do
not belong to any preceding groups). The 'DEFAULT' parameter may only be used as the last argument
in the GROUP job file option.
The 'DEFAULT' parameter can be used by itself to analyse the whole model, with no group definitions.
Job file usage:
Option
Value
GROUP
{'DEFAULT'}
Quick Fatigue Tool determines whether a group definition is an item ID list or an FEA subset based on
the contents of the file, and depending on the user setting in the environment file.
Environment file usage:
Variable
Value
groupDefinition
{0.0 | 1.0}
With the default value of 0.0, the application will assume that the group definition is an item ID list if there
is only one column of data in the file, and an FEA subset if there is more than one column. By setting
groupDefinition to a value of 1.0, an FEA subset will always be assumed.
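The column-count heuristic can be sketched as below; group_type is a hypothetical helper, not a Quick Fatigue Tool function:

```python
def group_type(path, group_definition=0.0):
    """Default heuristic (groupDefinition=0.0): one column of data per line
    implies an item ID list; more than one column implies an FEA subset.
    Setting group_definition to 1.0 forces the FEA subset interpretation."""
    if group_definition == 1.0:
        return 'FEA subset'
    with open(path) as fh:
        first_data_line = next(line for line in fh if line.strip())
    return 'item ID list' if len(first_data_line.split()) == 1 else 'FEA subset'
```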
4.6.5 Assigning properties to groups
Group properties (job file options and environment variables) are specified by assigning multiple
definitions according to the number of groups in the analysis.
Usually, the number of property definitions should match the number of groups in the analysis. If the
analysis contains n groups, where n > 1, and the user assigns a single definition to a property,
Quick Fatigue Tool automatically propagates that definition to all n analysis groups. Otherwise,
the number of property definitions must exactly match the number of group definitions.
Most properties require valid definitions for all n analysis groups.
Job file usage:
Option
Value
MATERIAL
{'material-file-1.mat',…, 'material-file-n.mat'}
This requirement applies to the following job file options and environment variables:
Job file option
Environment variable
MATERIAL
frfNormParamMeanT
SN_SCALE
frfNormParamMeanC
NOTCH_CONSTANT
frfNormParamAmp
FATIGUE_RESERVE_FACTOR
userWalkerGamma
RESIDUAL
userEnduranceLimit
modifiedGoodman
goodmanMeanStressLimit
S-N knock-down factors may not be required for every analysis group.
Job file usage:
Option
Value
SN_KNOCK_DOWN
{'knock-down-file-1.kd', [ ], 'knock-down-file-3.kd'}
In this example, knock-down factors have been specified in groups 1 and 3, and none is defined for
Group 2. In cases where some groups do not require a definition, this must be indicated by an empty
assignment ([ ]), otherwise the group definition will be processed incorrectly.
A combination of surface finish definition files and values can be specified over several groups.
Job file usage:
Option
Value
KT_DEF
{'surface-finish-file-1.kt', <value>, 'surface-finish-file-3.kt'}
In such cases, the KT_CURVE option needs only to be specified according to the number of surface
finish files defined by KT_DEF.
Job file usage:
Option
Value
KT_CURVE
[k1, k2]
In this example, the curve numbers k1 and k2 correspond to the surface finish files
'surface-finish-file-1.kt' and 'surface-finish-file-3.kt', respectively; a curve number for the <value> entry is not
required since the surface finish is defined directly.
A list of all properties eligible for group definitions, along with typical syntax is provided in the table
below.
Job file option/Environment variable
Example definition
MATERIAL
{'material-file-1.mat',…, 'material-file-n.mat'}
SN_SCALE
[f1,…, fn]
SN_KNOCK_DOWN
{'knock-down-file-1.kd',…, 'knock-down-file-n.kd'}
FATIGUE_RESERVE_FACTOR
{<envelope>,…, 'msc-file-name-n.msc'}
KT_DEF
{<value>, 'surface-finish-file-2.kt', 'surface-finish-file-3.kt'}
KT_CURVE
[k1,…, kn]
RESIDUAL
[r1,…, rn]
NOTCH_CONSTANT
[c1,…, cn]
NOTCH_RADIUS
[a1,…, an]
frfNormParamMeanT
{'<param1>', '<param2>',…, '<paramn>'}
frfNormParamMeanC
{'<param1>', '<param2>',…, '<paramn>'}
frfNormParamAmp
{'<param1>', '<param2>',…, '<paramn>'}
userWalkerGamma
[g1,…, gn]
userEnduranceLimit
[e1,…, en]
modifiedGoodman
[m1,…, mn]
goodmanMeanStressLimit
{'<param1>', '<param2>',…, '<paramn>'}
The following caveats should be noted for the use of analysis groups:
materials must be defined as a cell;
values for SN_SCALE are only required if USE_SN = 1.0;
a combination of surface finish .kt/.ktx files and/or surface finish values can be used within
the same KT_DEF statement;
the number of values in KT_CURVE need only reflect the number of .kt/.ktx files defined in
KT_DEF;
the number of values in frfNormParamMeanT, frfNormParamMeanC and frfNormParamAmp
need only reflect the number of custom FRF envelope definitions in
FATIGUE_RESERVE_FACTOR;
the number of values in goodmanMeanStressLimit need only reflect the number of zero-
valued entries in modifiedGoodman (the Goodman limit stress can only be defined for the
standard Goodman envelope);
if the 'DEFAULT' parameter is used as the last argument in a group definition, the default
group is included in the total number of groups;
if the default analysis algorithm or mean stress correction is specified, the algorithm and
mean stress correction used for analysis is chosen from the last argument of MATERIAL; and
if the DESIGN_LIFE option is set to the material’s constant amplitude endurance limit
('CAEL'), the endurance value is chosen from the last argument of MATERIAL.
4.6.6 Limitations
Analysis groups in Quick Fatigue Tool currently do not support multiple definitions of the following:
mean stress corrections/analysis algorithms; and
BS 7608 material properties.
If a different algorithm or mean stress correction is required for each analysis group, the workaround
is to split the analysis into multiple jobs and superimpose the fatigue results onto a single field output
file using the option CONTINUE_FROM. This technique is discussed in Section 4.8.
4.7 S-N knock-down factors
4.7.1 Overview
A set of S-N scale factors can be applied to the material S-N data in the form of knock-down factors,
which scale the stress data points for each specified life value. Knock-down factors are defined in a
separate .kd file and specified in the job file.
Job file usage:
Option
Value
SN_KNOCK_DOWN
{'knock-down-file-name.kd'}
USE_SN
1.0
4.7.2 Defining a knock-down curve file
The knock-down curve file is defined as follows:
First column: Life values at which knock-down factors are to be applied
Second column: Knock-down factors corresponding to each life value
An example .kd file is given by Figure 4.17.
The knock-down factors are applied to the S-N data before the analysis to produce the modified
endurance curve. The life values in the .kd file are treated as sample points and as such, they do not
have to match the position of the S-N data points. Quick Fatigue Tool will automatically interpolate
and extrapolate the S-N data before scaling each data point. An example of this process is shown by
the following table. Note that bracketed values in bold have been interpolated or scaled.
Figure 4.17: Example .kd file containing life
values in the first column and knock-down
factors in the second column
Original S-N curve        Knock-down curve        Scaled S-N curve
Stress      Life          Factor     Life         Stress      Life
(800)       (10)          1.1        10           (880)       (10)
(1725)      (50)          1          50           (1725)      (50)
(1400)      (100)         1          100          (1400)      (100)
700         1000          0.8        1000         (560)       (1000)
350         10000         (0.75)     (10000)      (263)       (10000)
(313)       (100000)      0.7        100000       (219)       (100000)
(313)       (100001)      0.5        100001       (157)       (100001)
(280)       (1000000)     0.5        1000000      (140)       (1000000)
250         10000000      (0.4)      (10000000)   (100)       (10000000)
-           -             0.3        100000000    -           -
Knock-down data specified below the minimum life of the original S-N data is extrapolated. However,
Quick Fatigue Tool assumes that the last data point in the original S-N data represents the material’s
endurance limit and thus knock-down data provided beyond this point is not extrapolated.
The original S-N, knock-down and modified S-N curves are illustrated by Figures 4.18-20. If the material
contains S-N curves for multiple R-ratios, each curve is scaled by the knock-down curve before the
beginning of the analysis.
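The interpolate-then-scale process can be sketched as follows. Interpolating the S-N stress log-log at each knock-down life is an assumption consistent with the worked example above, and lives beyond the last S-N point are ignored rather than extrapolated:

```python
import math

def apply_knock_down(sn_curve, kd_curve):
    """sn_curve: [(life, stress)]; kd_curve: [(life, factor)]; both sorted
    by life. Each knock-down life is treated as a sample point: the S-N
    stress is interpolated (or extrapolated) log-log at that life and
    multiplied by the factor. Lives beyond the last S-N point, taken as
    the endurance limit, are skipped."""
    log_life = [math.log10(l) for l, _ in sn_curve]
    log_stress = [math.log10(s) for _, s in sn_curve]

    def interp(x):
        # Piecewise-linear in log-log space, extrapolating from end segments
        if x <= log_life[0]:
            lo, hi = 0, 1
        elif x >= log_life[-1]:
            lo, hi = len(log_life) - 2, len(log_life) - 1
        else:
            hi = next(i for i, xv in enumerate(log_life) if xv >= x)
            lo = hi - 1
        t = (x - log_life[lo]) / (log_life[hi] - log_life[lo])
        return log_stress[lo] + t * (log_stress[hi] - log_stress[lo])

    scaled = []
    for life, factor in kd_curve:
        if life > sn_curve[-1][0]:
            continue  # beyond the endurance limit: do not extrapolate
        scaled.append((life, (10.0 ** interp(math.log10(life))) * factor))
    return scaled
```

Running this on the S-N data of the example (700 MPa at 1000 cycles, 350 MPa at 10^4 and 250 MPa at 10^7) reproduces scaled points such as 560 MPa at 1000 cycles and 219 MPa at 10^5 cycles.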
Figure 4.18-20: Original S-N curve (top), knock-down factors (middle)
and modified S-N curve (bottom)
4.7.3 Example applications
Knock-down curves can be used in addition to the surface finish definition in order to account for
additional manufacturing effects. Imperfections of the manufacturing process can lead to defects,
which reduce the endurance of the material.
Knock-down curves may be used with cast metals where inclusions result in a local loss of fatigue
performance. Another application is in the injection moulding of plastic components. During this
process multiple flow regions can meet, causing a weld line at the interface of the two melts. During
cooling of the newly formed part, voids may develop wherein large thermal gradients exist. Both of
these phenomena can be accounted for via the use of S-N knock-down factors.
One possible approach to using knock-down factors would be to identify the nodes on the FE model
which represent the manufacturing defect, and export a dataset file containing the node numbers and
field data, the process of which is described in Section 3.2. This dataset can then be used to define a
group in the job file with an individual knock-down curve applied. Such a configuration could be
achieved by using the following options.
Job file usage:
Option
Value
MATERIAL
{'knock_down.mat', 'default.mat'}
USE_SN
1.0
SN_KNOCK_DOWN
{'knock_down_curve.kd', [ ]}
GROUP
{'defect_group.rpt', 'DEFAULT'}
4.7.4 Exporting knock-down curves
The knock-down curves can be exported to MATLAB figures from the environment file.
Environment file usage:
Variable
Value
figure_KDSN
{0.0 | 1.0}
S-N knock-down curves are exported for groups which have knock-down factors specified. MATLAB
figures must be requested in the job file.
Job file usage:
Option
Value
OUTPUT_FIGURE
1.0
If the material contains S-N curves for multiple R-ratios, only the curve for R = -1 is exported. If an
R = -1 curve is not defined then it is automatically interpolated.
4.8 Analysis continuation techniques
4.8.1 Overview
Quick Fatigue Tool provides the capability to perform an analysis as a continuation of a previous job.
Field output from the current job is written onto the field output from a previous job. An analysis
which uses the continuation feature:
can be used to model block loading, where each job defines a distinct loading event;
can be used to assign different analysis algorithms to multiple regions in a model; and
allows the user to specify completely new definitions for any job file option for each job.
When used in conjunction with the ODB interface, analysis continuation:
can be used to superimpose field data onto the same mesh;
can be used to append field data onto a mesh at locations different to those which were
analysed in the previous job; and
allows a combination of the above.
A job which uses analysis continuation runs in the usual way. At the end of the analysis, field data is
superimposed onto the field output file from a specified job. The rules for combining field data depend
on the variable type, and are listed in the table below.
Variables                                      Rule
D                                              D(s) = D1 + D2
L; LL; DDL                                     L(s) = (1/L1 + 1/L2)^-1
FOS; SFA; FRFR; FRFH; FRFV                     min(V1, V2)
FRFW                                           Derived from the superimposed FRF values
SMAX; SMXP; SMXU; TRF; WCM; WCA; WCDP; YIELD   max(V1, V2)
WCATAN                                         Derived from the superimposed values of WCM and WCA
The subscripts 1 and 2 correspond to the first and second jobs, respectively; the superscript (s)
corresponds to the superimposed field variable.
4.8.2 Referencing the previous job
The previous analysis is specified in the job file of the current analysis.
Job file usage:
Option
Value
CONTINUE_FROM
'previous-job-name'
The name of the previous job is the name given by the option JOB_NAME in the previous job file. Field
output must be requested for the first job. Field output is written automatically for the second job,
provided that a valid definition of CONTINUE_FROM is specified.
Job file usage:
Option
Value
OUTPUT_FIELD
1.0
4.8.3 Example applications
Multiple analysis algorithms
Analysis continuation can be used to circumvent a limitation of analysis groups, which do not support multiple definitions of the analysis algorithm or the mean stress correction. Analysis continuation allows the user to define the algorithm and mean stress correction for multiple regions in the model,
and analyse each region as a separate job. Field data is superimposed onto the previous field data file
in a chained fashion by specifying the name of the previous job for each analysis. Finally, the
cumulative results of all analyses may be written to an .odb file using the ODB interface, which is
discussed in Section 10.4.
Multiple block loading
The definition of multiple loading blocks is not directly supported by Quick Fatigue Tool. However,
analysis continuation can be used to separate the load spectrum across several job files, where each
analysis represents a particular loading block, or event, in the component’s operational duty.
Consider the loading profile for a given component:
Block #    Descriptor
1          Applied load in 1-direction; fully-reversed [1, -1] load history; 3 repeats
2          Applied load in 2-direction; pulsating [0, 1] load history; 10 repeats
3          Applied load in 1-direction; mixed [1, -1, 2, -1] load history; 1 repeat
Each block is defined in a separate job file, with its own definitions for the DATASET, HISTORY and REPEATS job file options.
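As a sketch, the three blocks might be chained together via CONTINUE_FROM as follows. Only the option names are taken from this guide; the job names, the data set file names and the overall job file layout are hypothetical, so consult the job file reference for the exact syntax.

```matlab
%% block_1.m -- Block 1: fully-reversed load in the 1-direction (hypothetical names)
JOB_NAME = 'block_1';
DATASET = 'stress_1_direction.rpt';   % hypothetical data set file
HISTORY = [1, -1];
REPEATS = 3;
OUTPUT_FIELD = 1.0;                   % field output is required for continuation

%% block_2.m -- Block 2: pulsating load in the 2-direction
JOB_NAME = 'block_2';
CONTINUE_FROM = 'block_1';            % superimpose onto the Block 1 field output
DATASET = 'stress_2_direction.rpt';
HISTORY = [0, 1];
REPEATS = 10;

%% block_3.m -- Block 3: mixed load in the 1-direction
JOB_NAME = 'block_3';
CONTINUE_FROM = 'block_2';            % chain from the previous job
DATASET = 'stress_1_direction.rpt';
HISTORY = [1, -1, 2, -1];
REPEATS = 1;
```

Field output is written automatically for the second and third jobs because a valid CONTINUE_FROM definition is given.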
4.8.4 Additional guidance
Mismatching models
When superimposing results onto a previous analysis, Quick Fatigue Tool searches for matching item
IDs between the two field output files. Field data at items which match with the previous file are
superimposed to create the field data for the combined result. Items which do not match with the
previous job are appended onto the end of the field output file. This is advantageous, since the loading
in each block does not have to be applied to the same region between analyses; the definition of
DATASET does not have to be consistent.
Such a scenario is illustrated by Figure 4.21.
After running Job A, fatigue results are written to the left-hand portion of the model. After running
Job B with CONTINUE_FROM='Job A', fatigue results common to the analysis region from Job A are
superimposed onto the previous data. Fatigue results corresponding to the right-hand portion of the
model are appended to the field data without superimposition. This is illustrated by Figure 4.22. Note
how the results at the left-hand side have changed to account for the second loading block from
Job B.
Figure 4.21: FE model split into two analysis regions (green). The first loading block is applied to the smaller region,
while the second loading block is applied to the whole model.
Figure 4.22: Fatigue results for the model depicted in Figure 4.21. Common nodes are superimposed; new nodes are
appended as additional field data
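The superimpose-or-append behaviour described above can be sketched as follows. This is an illustrative Python sketch, not Quick Fatigue Tool source code: summation of damage at matching item IDs is an assumption made for the example, whereas the tool applies per-variable combination rules.

```python
def continue_field(previous, current):
    """Superimpose field data from the current job onto a previous job.

    Both arguments map an item ID string (e.g. 'element.node') to a
    damage value. Matching IDs are combined (here, by summing damage --
    an assumption for illustration); items that do not match the
    previous job are appended to the combined field output.
    """
    combined = dict(previous)
    for item_id, damage in current.items():
        if item_id in combined:
            # Item exists in the previous field output: superimpose
            combined[item_id] += damage
        else:
            # New analysis region: append without superimposition
            combined[item_id] = damage
    return combined
```

For the model in Figure 4.21, the items in the smaller region behave like the first branch (superimposed), while the items unique to the second job behave like the second branch (appended).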
Changing the analysis algorithm
It is possible to use a different fatigue analysis algorithm from the previous job. This is especially advantageous if the analysis items in the second job differ completely from those in the first job. For example, a model containing welded features may be analysed with BS7608 near the weld seams, and a different stress-life algorithm at other, non-welded locations.
If the analysis items in both models are the same (i.e. damage values are superimposed onto previous
results), the user should not move from stress-life to strain-life. Since the stress and strain histories
calculated by the strain-life methodology are a function of all previously calculated inelastic strains,
the correct damage parameter cannot be obtained if the preceding fatigue history is elastic.
Updating the material state
When a strain-based algorithm is selected for fatigue analysis, Quick Fatigue Tool automatically saves
the final material state. If analysis continuation is used with another strain-based procedure, the load
history of the second job is automatically adjusted to ensure that material hysteresis is preserved.
This feature is enabled from the environment file.
Environment file usage:
Variable               Value
importMaterialState    {0.0 | 1.0}
4.8.5 Limitations
Analysis continuation is subject to the following limitations:
- the load equivalency (LOAD_EQ) must be the same for all jobs;
- the design life (DESIGN_LIFE) must be the same for all jobs;
- the results position must be the same for all jobs;
- the number of cycles quoted in the log file only applies to the most recent analysis;
- the field output file must be located in the default directory according to the job name. Renaming files or folders will prevent Quick Fatigue Tool from locating the necessary files;
- the analysis will crash if the previous field output file is opened by an external process during the fatigue analysis;
- load transitions are not supported. The cycle counting algorithm treats each block as a separate event, so the effect of large cycles in previous blocks is not taken into account in subsequent blocks. If the yield calculation is active over multiple blocks, the effect of hardening is not imported into the next job; instead, the stress-strain state is reset to zero at the beginning of each block;
- the values of L and LL are capped at the constant amplitude endurance limit of the material corresponding to the second analysis;
- virtual strain gauge definitions are not carried forward to subsequent analyses;
- combining stress-based and strain-based algorithms is not permitted;
- if ODB element/node sets are written to the .odb file after each analysis, the set names must be unique between jobs, otherwise the ODB interface will exit with an error;
- while the ODB interface can handle collapsed elements, results from these elements cannot be superimposed onto previous field data, since Quick Fatigue Tool is unable to resolve the ambiguity caused by duplicate position labels; and
- analysis continuation is not supported for composite criteria analysis.
4.9 Virtual strain gauges
4.9.1 Overview
Virtual strain gauges are used to assess the behaviour of calculated load histories against measured strain data, as a means of validating the input stresses. In FEA, strain gauges can be difficult
to model and usually require the definition of axial connector elements positioned at strategic
locations on the mesh surface which correspond to the laboratory test. Quick Fatigue Tool offers the
specification of a virtual strain gauge, which is a location on the model (integration point, node, etc.)
at which the strain history is measured in a particular direction.
4.9.2 Gauge definition
The virtual strain gauge has a rectangular rosette format, an example of which is shown in Figure 4.23. The gauge is represented in Cartesian space in Figure 4.24, with the three rosette arms orientated according to the angles α, β and γ, counter-clockwise from the positive x-direction.
Figure 4.23: Typical layout of a rectangular rosette gauge
Figure 4.24: Rosette gauge orientation relative to Cartesian axes
4.9.3 Technical background
The calculation of the strain histories is based on the linear elastic stresses defined in the stress data set file. If the material contains values of the cyclic strain hardening coefficient and exponent (K′ and n′, respectively), the elastic stresses are first converted to elasto-plastic strain histories. These histories are then resolved onto the directions of the gauge arms according to Equations 4.13-15, where the absolute arm angles are α, α + β and α + β + γ:

ε_a = ε_xx cos²(α) + ε_yy sin²(α) + γ_xy sin(α) cos(α)    [4.13]

ε_b = ε_xx cos²(α + β) + ε_yy sin²(α + β) + γ_xy sin(α + β) cos(α + β)    [4.14]

ε_c = ε_xx cos²(α + β + γ) + ε_yy sin²(α + β + γ) + γ_xy sin(α + β + γ) cos(α + β + γ)    [4.15]
The plasticity correction uses the same algorithm as the Multiaxial Gauge Fatigue app and is described
in the document Quick Fatigue Tool Appendices: A3.2.4.
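For reference, the role of the cyclic coefficients can be illustrated with the cyclic Ramberg-Osgood relation, assuming the usual notation K′ (coefficient) and n′ (exponent). This is a minimal sketch of the underlying stress-strain relation, not the multilinear algorithm described in the appendices:

```python
def cyclic_total_strain(stress, E, K_prime, n_prime):
    """Total strain from the cyclic Ramberg-Osgood relation:

        eps = sigma / E + (sigma / K')^(1 / n')

    The first term is the elastic strain, the second the plastic
    strain. Illustrative only; assumes a positive stress value on
    the cyclic stress-strain curve.
    """
    return stress / E + (stress / K_prime) ** (1.0 / n_prime)
```

For example, with E = 200 GPa, K′ = 1000 MPa and n′ = 0.2, a stress of 500 MPa gives a total strain of 0.0025 + 0.5⁵ = 0.03375.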
If cyclic data is not provided, then the strains are calculated from the stresses elastically, according to Equations 4.16-18:

ε_xx = (σ_xx − ν σ_yy) / E    [4.16]

ε_yy = (σ_yy − ν σ_xx) / E    [4.17]

γ_xy = 2 (1 + ν) τ_xy / E    [4.18]
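A minimal sketch of the elastic gauge calculation, combining the plane-stress conversion (Equations 4.16-18) with the arm resolution (Equations 4.13-15). The incremental interpretation of the angles (absolute arm angles α, α + β and α + β + γ) is an assumption inferred from the worked example in Section 4.9.6:

```python
import math

def elastic_strains(sxx, syy, txy, E, nu):
    """Plane-stress Hooke's law (Equations 4.16-18)."""
    exx = (sxx - nu * syy) / E
    eyy = (syy - nu * sxx) / E
    gxy = 2.0 * (1.0 + nu) * txy / E   # engineering shear strain
    return exx, eyy, gxy

def gauge_strains(exx, eyy, gxy, alpha, beta, gamma):
    """Resolve plane strains onto the three rosette arms.

    Assumption: beta and gamma are measured incrementally from the
    preceding arm, so the absolute arm angles (degrees) are alpha,
    alpha + beta and alpha + beta + gamma.
    """
    def resolve(theta_deg):
        t = math.radians(theta_deg)
        return (exx * math.cos(t) ** 2
                + eyy * math.sin(t) ** 2
                + gxy * math.sin(t) * math.cos(t))
    return (resolve(alpha),
            resolve(alpha + beta),
            resolve(alpha + beta + gamma))
```

For a rectangular rosette aligned with the x-axis under uniaxial stress, the first arm reads the axial strain and the third arm the transverse (Poisson) strain.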
4.9.4 Specifying the position of the strain gauge
Virtual strain gauges are defined by specifying the position IDs that identify the locations of the gauges on the model.
Job file usage:
Option            Value
GAUGE_LOCATION    {'<mainID>.<subID>1',…, '<mainID>.<subID>n'}
For uniaxial analyses, the gauge location is simply '1.1'. The user may specify multiple strain gauges by
listing the position IDs as separate strings.
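The '<mainID>.<subID>' format can be illustrated with a short sketch. This helper is hypothetical, not part of Quick Fatigue Tool:

```python
def parse_position_id(position_id):
    """Split a '<mainID>.<subID>' gauge location string into a pair
    of integers, e.g. '657.7' -> (657, 7)."""
    main_id, sub_id = position_id.split('.')
    return int(main_id), int(sub_id)
```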
4.9.5 Specifying the orientation of the strain gauge
The strain gauge orientation is defined by providing the values of α, β and γ for gauges 1 to n, according to Figure 4.24.
Job file usage:
Option               Value
GAUGE_ORIENTATION    {[α1, β1, γ1],…, [αn, βn, γn]}
Alternatively, the user may use the flags 'RECTANGULAR' or 'DELTA' to indicate that the gauge has a rectangular (arms 45° apart) or delta (arms 60° apart) layout, respectively.
Job file usage:
Option               Value
GAUGE_ORIENTATION    {'RECTANGULAR' | 'DELTA'}
4.9.6 Example usage
Consider the model in Figure 4.25. A gauge is to be defined at element 657, node 7. The gauge is aligned with the global x-axis, giving an orientation of α = 0°, β = 45° and γ = 45°. The virtual gauge is defined in the job file.
Figure 4.25: Virtual gauge location on FE model
Job file usage:
Option               Value
GAUGE_LOCATION       {'657.7'}
GAUGE_ORIENTATION    {[0.0, 45.0, 45.0]}
Additional gauges can be added by appending position IDs and orientations as necessary.
Results for each gauge are written to a text file and stored in Project\output\<jobName>\Data Files.
4.9.7 Modelling guidance
Virtual strain gauges are intended for components in a state of plane strain. The gauge will only detect
the two-dimensional state of strain relative to the global x-y plane. The user is therefore advised to
specify the gauges on plane stress elements to facilitate the definition. If the model contains solid
elements, a skin should be applied over the surface (a layer of shell elements), and the gauges defined
on these elements.
The position IDs used to define the gauges should be consistent with the element position used to
generate the stress data set file. For example, if the stresses were exported at integration points or
element nodes, both the main and sub IDs are required. For centroidal and unique nodal data, the item is defined by the main ID alone; the sub ID always has a value of 1.
Output from virtual strain gauges can be used as input to the Multiaxial Gauge Fatigue application
(Quick Fatigue Tool Appendices: A3.2). The user must ensure that the orientations specified by
GAUGE_ORIENTATION match those defined in the application.
If the user wishes only to extract virtual strain gauge histories, the fatigue analysis can be omitted by
setting DATA_CHECK=1 in the job file. Alternatively, the Virtual Strain Gauge application
(Quick Fatigue Tool Appendices: A3.4) can be used to generate strain gauge histories from a strain
tensor definition; both methods use the same underlying analysis technique discussed in this section.
4.9.8 Limitations
Virtual strain gauges convert the linear elastic stresses to elasto-plastic strains using a simple
multilinear hardening rule. The correction is applied separately to each strain component and as such,
results will be inaccurate where a large amount of plasticity is present in the material. The virtual
strain gauge should be used as a rough estimate of the local nonlinear strain history only; in cases
where a more thorough treatment is required, the user is advised to define the gauge directly on the
finite element model.
5. Materials
5.1 Background
5.1.1 Overview
This section describes how to define and use material properties for analysis in Quick Fatigue Tool. In
general, this involves:
- creating material properties interactively with the Material Manager application, or from a text file; and
- specifying the material in the job file.
Material data is stored as a MATLAB binary (.mat) file and is located by default in the directory
Data\material\local.
5.1.2 Accessing the Material Manager
The Material Manager application is used to create and edit material data. It can be accessed either
from the command line or by installing the Material Manager GUI application.
To install the Material Manager, double-click the file Material Manager.mlappinstall in the
Application_Files\toolbox directory. The app will appear in the apps bar in MATLAB.
Figure 5.1 shows the Material Manager GUI.
The Material Manager GUI is launched by doing one of the following:
- select the Material Manager launch icon from the APPS ribbon; or
- execute the command material.manage from the MATLAB command line.
Figure 5.1: Material Manager GUI
5.2 Material databases
5.2.1 Overview
Material Manager separates material data into two databases:
Local
- Local copies of materials are stored here
- Materials in this database can be modified
- Materials in this database can be used for analysis

System
- Database containing materials included with the Quick Fatigue Tool application
- Materials in this database cannot be modified
- Materials in this database must be fetched in order to be used for analysis
If the full path to the material .mat file is specified in the job file, Quick Fatigue Tool will search for the
material in this location only. If the material is given without a path, Quick Fatigue Tool will search for
the material in the following order:
1. Local material database
2. Default local database path (<quick-fatigue-tool-root>\Data\material\local)
3. MATLAB search path (first encounter)
The local material database is the work directory used by the Material Manager application for storing
material data.
5.2.2 Specifying the local material database
When Material Manager is started for the first time, the user is prompted to specify the